Beyond Carbon Credits: How CPU Innovation is Combating AI’s Energy Challenges
Ampere Chief Product Officer Jeff Wittich explains how energy-efficient processors can cut AI’s energy use and carbon footprint
SANTA CLARA, Calif., Sept. 25, 2024 /PRNewswire/ — Artificial intelligence is rapidly expanding its presence in nearly every industry and in our daily lives. A recent report from Bloomberg predicts that by 2032, the generative AI market will grow to $1.3 trillion. While this growth brings immense opportunity, it also comes at a major cost: energy consumption. The International Energy Agency predicts that AI’s electricity usage will more than double from 2022 to 2026, driven by the computing resources and data center footprint needed to power it.
Major companies are beginning to take steps to address this issue. Microsoft, for example, has been actively looking for ways to offset its carbon footprint after reporting a 30% increase in emissions since 2020, primarily from constructing data centers. To mitigate this, it has invested heavily in carbon credits, purchasing credits from companies with unused emissions allowances to offset the additional emissions generated by new technologies like AI.
Carbon credits certainly have short-term merit, but they are not a complete, long-term solution for addressing the emissions created by AI. They do not actually reduce global power consumption or carbon footprint; they simply save power in one place to spend it in another. If we want to take real action on the power consumption and emissions AI requires, the tech industry needs to do what it does best: innovate.
Innovating AI Power Efficiency: Tackling the Data Center Capacity Crisis
When it comes to solving AI’s power challenge, companies need to start at the source of that power usage: the processor. The surge in AI, especially AI training, has driven demand for power-hungry GPUs. At the same time, existing server infrastructure, commonly more than five years old, is aging and vulnerable.
The combination of these factors has created a data center capacity crisis. Data centers are sitting half empty because it doesn’t take many installed servers for them to hit their power wall. In turn, companies continue to build new data centers to add more compute, offsetting them with instruments like carbon credits so they can still meet their sustainability goals. But with data centers already consuming around 2% of global energy, expansion is no longer a viable answer to the capacity problem.
The new answer is energy-efficient, scalable cloud-native processors. Refreshing outdated systems and selecting the optimal processor for each task maximizes performance per rack, increases data center density, and delivers the compute power necessary for AI adoption – all while taking meaningful steps to minimize environmental impact rather than merely offsetting it.
Exploring the AI Energy and Carbon Challenge
Diving deeper, it’s clear that the rapid expansion of AI has led to a dramatic rise in energy usage, particularly in data centers: some AI-ready racks consume as much as 40 to 60 kilowatts each due to their GPU-intensive hardware.
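To see why those numbers strain facilities, consider a minimal back-of-envelope sketch. The hall power budget and per-rack draws below are hypothetical assumptions, not measured figures, but they illustrate how high-density AI racks exhaust a facility’s power long before its floor space.

```python
# Back-of-envelope: how many racks a fixed power budget supports.
# All numbers are illustrative assumptions, not measured figures.

HALL_POWER_BUDGET_KW = 2_000   # hypothetical hall budget: 2 MW of usable IT power
RACK_POSITIONS = 200           # hypothetical floor capacity

for label, kw_per_rack in [("traditional rack", 10), ("AI-ready GPU rack", 50)]:
    racks_supported = HALL_POWER_BUDGET_KW // kw_per_rack
    fill = min(racks_supported, RACK_POSITIONS) / RACK_POSITIONS
    print(f"{label}: {kw_per_rack} kW -> {racks_supported} racks powered, "
          f"{fill:.0%} of floor positions usable")

# traditional rack: 10 kW -> 200 racks powered, 100% of floor positions usable
# AI-ready GPU rack: 50 kW -> 40 racks powered, 20% of floor positions usable
```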
This increasing demand calls for direct, tangible measures to cut AI’s carbon footprint. Relying solely on carbon credits is not sustainable in the long term; more proactive solutions are needed.
The AI industry invested $50 billion in GPUs last year to train advanced models, but critics argue that despite this massive expenditure, the industry’s revenue remains relatively small, raising concerns about potential financial strain. This disparity underscores the need for more efficient, cost-effective approaches in the AI sector.
Making the Move to CPUs: Concrete Steps to Reduce AI’s Carbon Footprint
To make a meaningful impact, companies need to expand beyond carbon credits and adopt a fresh approach. Here are three key steps businesses can take to reduce their AI carbon footprint:
- Refresh Aging Servers to Maximize Performance per Rack
Average server refresh cycles now exceed five years, leaving infrastructure that is increasingly outdated, inefficient, and expensive to maintain. Upgrading those servers to more energy-efficient systems built on cloud-native processors lets operators reduce energy consumption, lower operational costs, and free up space and power budgets for growth and modernization. This is especially critical now, as many operators face space and power constraints while recognizing the need to modernize with AI and cloud-native technologies. By modernizing quickly, operators can maximize performance per existing rack and avoid costly, environmentally taxing data center expansion.
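As a rough illustration of the refresh math, here is a minimal sketch assuming hypothetical per-server power draws and a 3:1 consolidation ratio; real figures vary by workload and hardware generation.

```python
import math

# Illustrative refresh arithmetic: consolidating aging servers onto
# fewer, more efficient nodes. All figures are hypothetical assumptions.
old_servers = 100
old_watts_each = 500        # assumed draw of a ~5-year-old server under load
consolidation_ratio = 3     # assume one modern node does the work of three

new_servers = math.ceil(old_servers / consolidation_ratio)   # 34 nodes
new_watts_each = 400        # assumed draw of a modern cloud-native node

old_total_kw = old_servers * old_watts_each / 1000           # 50.0 kW
new_total_kw = new_servers * new_watts_each / 1000           # 13.6 kW
savings = 1 - new_total_kw / old_total_kw

print(f"before: {old_total_kw:.1f} kW  after: {new_total_kw:.1f} kW  "
      f"(~{savings:.0%} less power for the same work)")
```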
- Right-Size Your Inference Compute for the Task
It’s crucial to match computing resources to specific AI workloads. Despite the GPU hype cycle of recent years, CPUs are well suited to many AI inference tasks: estimates suggest about 85% of AI operations are inference, not training. For low-latency and/or cost-sensitive inference deployments, CPUs offer a balanced blend of performance, energy efficiency, and cost-effectiveness. Right-sizing your compute infrastructure reduces unnecessary energy consumption, improves operational efficiency, and ensures resources are allocated effectively, preventing over-provisioning and waste.
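A placement policy for this kind of right-sizing might look like the minimal sketch below. The model-size, load, and latency thresholds are hypothetical and would need to be calibrated against measured performance on the actual hardware.

```python
# Sketch of a right-sizing policy for inference placement.
# Thresholds are hypothetical assumptions, not vendor guidance.
from dataclasses import dataclass

@dataclass
class InferenceJob:
    model_params_b: float   # model size, billions of parameters
    qps: float              # expected queries per second
    latency_slo_ms: float   # per-request latency target

def place(job: InferenceJob) -> str:
    """Pick the cheapest tier that can plausibly meet the SLO."""
    if job.model_params_b <= 13 and job.qps <= 50:
        return "cpu-pool"    # small/medium models at modest load
    if job.latency_slo_ms >= 1000:
        return "cpu-pool"    # latency-tolerant batch work
    return "gpu-pool"        # large models under tight, high-volume SLOs

print(place(InferenceJob(model_params_b=8, qps=20, latency_slo_ms=500)))   # cpu-pool
print(place(InferenceJob(model_params_b=70, qps=200, latency_slo_ms=100))) # gpu-pool
```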
- Switch from GPUs to CPUs for Suitable AI Workloads
While GPUs are celebrated for their high performance in AI training, they are not always necessary for AI inference. CPUs consume significantly less power than top-end GPUs and can be a more sustainable, cost-effective option; switching to CPU-based inference can cut energy costs and carbon emissions significantly. Companies like Oracle have optimized AI performance on CPUs, running models like Llama 3 efficiently without excessive energy consumption. This transition not only lowers operational costs but also aligns with sustainability goals by minimizing environmental impact.
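The release doesn’t detail Oracle’s software stack, but as one concrete, commonly used way to run a Llama-family model entirely on CPU, the open-source llama.cpp runtime (via its llama-cpp-python bindings) works as sketched below; the model file path and thread count are placeholders.

```python
# CPU-only LLM inference with llama.cpp's Python bindings.
# pip install llama-cpp-python   (builds a CPU backend by default)
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder path to a quantized GGUF model
    n_ctx=2048,      # context window
    n_threads=16,    # set to the core count of the CPU server
)

out = llm("Summarize why CPU inference can cut data center power use:",
          max_tokens=128)
print(out["choices"][0]["text"])
```

Quantized (e.g., 4-bit) weights keep memory-bandwidth demands within reach of a many-core server CPU, which is a large part of why CPU-only inference can be practical and power-efficient for models of this size.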
Modern CPUs play a pivotal role in this transformation by offering high performance with significantly lower power requirements. Replacing outdated servers with new, energy-efficient CPU-based systems can drastically reduce overall energy usage in data centers while improving the performance and reliability of operations. These advanced CPUs are designed to handle AI inference tasks with optimal efficiency, ensuring that even demanding workloads run without excessive energy consumption.
Addressing AI’s energy challenges requires more than purchasing carbon credits. By refreshing aging servers with cloud-native processors, right-sizing inference compute, and switching to CPUs for suitable AI workloads, companies can significantly reduce their AI carbon footprint. These strategies not only improve operational efficiency and sustainability but also offer tremendous cost savings.
Contact:
Alexa Korkos
Ampere Computing
press@amperecomputing.com
View original content to download multimedia: https://www.prnewswire.com/news-releases/beyond-carbon-credits-how-cpu-innovation-is-combating-ais-energy-challenges-302257942.html
SOURCE Ampere