How Much is Enough? Fueling AI’s Energy Appetite
James Flanagan
HST401-A
Generative AI demands significant electricity, driving a rapid increase in global data center power consumption. The International Energy Agency reports that data center electricity consumption will rise from 460 terawatt-hours in 2022 to anywhere between 620 and 1,050 terawatt-hours by 2026. To put this in perspective, Russia's total electricity consumption in 2021 was 996 terawatt-hours. Generative AI is the primary driver of this surge.
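For a sense of how fast that is, here is a back-of-envelope sketch in Python using only the IEA figures above; the compound growth rate is my own derivation, not a number from the report:

# Implied annual growth in data center electricity use, from the IEA
# figures cited above (460 TWh in 2022; 620-1,050 TWh projected by 2026).
base_twh = 460
low_twh, high_twh = 620, 1050          # low and high 2026 projections
years = 2026 - 2022
low_rate = (low_twh / base_twh) ** (1 / years) - 1
high_rate = (high_twh / base_twh) ** (1 / years) - 1
print(f"Implied annual growth: {low_rate:.1%} to {high_rate:.1%}")
# -> Implied annual growth: 7.7% to 22.9%

Even the low-end projection implies consumption growing by more than a third in just four years.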
Google has been experimenting with AI and now features generative AI responses on about 15% of its searches. This integration has substantially increased Google's power consumption: with roughly 9 billion searches per day, Google processes over a billion AI requests daily. And that is just the start; OpenAI, Microsoft, and Meta run their own generative AI models, which likely handle a similar volume of requests. This becomes problematic when you look at the power consumption of AI requests. An article published by Goldman Sachs Research found that one ChatGPT request uses 2.9 watt-hours of electricity, compared to the 0.3 watt-hours used for a Google search, almost ten times the energy per request. AI requests consume more power than standard searches because of the complex calculations they require, which are handled by GPUs, chips designed for high-speed processing of large datasets.
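To see what those per-request figures mean at Google's scale, here is a rough sketch; it naively assumes each of the 9 billion daily searches costs either the standard or the AI figure, so it is an illustration of scale rather than an estimate:

# Scale check: 9 billion daily searches at the cited per-request figures
# (0.3 Wh per standard search, 2.9 Wh per ChatGPT-style request).
# Illustrative only; assumes every search uses one figure or the other.
daily_searches = 9e9
standard_wh, ai_wh = 0.3, 2.9
standard_gwh = daily_searches * standard_wh / 1e9   # Wh -> GWh
ai_gwh = daily_searches * ai_wh / 1e9
extra_twh_per_year = (ai_gwh - standard_gwh) * 365 / 1000
print(f"Standard search: {standard_gwh:.1f} GWh/day")         # 2.7 GWh/day
print(f"AI-style search: {ai_gwh:.1f} GWh/day")               # 26.1 GWh/day
print(f"Extra energy per year: {extra_twh_per_year:.1f} TWh") # ~8.5 TWh

On these numbers, running every search through a generative model would add several terawatt-hours a year for Google alone.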
Over the past decade, data centers have had to expand to keep up with growing internet usage. Despite this massive growth, their electricity consumption stayed roughly flat: as usage increased, hardware efficiency improved in step, offsetting the rising power demand until recently. Paradoxically, that efficiency now encourages higher power consumption for AI. In "How much electricity does AI consume?", Alex de Vries, a PhD candidate at VU Amsterdam who has extensive experience studying the energy usage of GPUs, states, "[Increased hardware efficiency] creates a natural incentive for people to just keep adding more computational resources, and as soon as models or hardware becomes more efficient, people will make those models even bigger than before." Efficiency gains ironically push AI developers to use more resources, because each gain lets them build larger data centers to support bigger models. No realistic efficiency gain can keep pace with the growth in usage, so total energy demand keeps rising.
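De Vries's point is easy to see with numbers. The figures below are hypothetical, chosen only to illustrate the rebound arithmetic, not drawn from any source:

# Hypothetical rebound effect: hardware doubles in efficiency, but the
# cheaper compute invites workloads to triple. Numbers are illustrative.
efficiency_gain = 2.0     # assumed: 2x more work per watt
workload_growth = 3.0     # assumed: demand triples in response
relative_energy = workload_growth / efficiency_gain
print(f"Total energy vs. baseline: {relative_energy:.1f}x")  # 1.5x: up, not down

Unless efficiency improves faster than demand grows, total consumption still rises, which is exactly the pattern de Vries describes.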
Nvidia, the leading designer of GPUs for AI, recently released the H100, which can do roughly three times the work of its previous processor, the A100. Though the H100 consumes 75% more power than the A100, it is preferred because it delivers far more performance per watt. It simply makes sense for data centers to use the power-hungry card, because they get better value out of it. In the Forbes article "AI Power Consumption: Rapidly Becoming Mission-Critical", Beth Kindig, the CEO and Lead Tech Analyst for the I/O Fund, says, "Nvidia and other industry executives have laid out a path for GPU clusters in data centers to scale from tens of thousands of GPUs per cluster to the hundred-thousand plus range, even up to the millions of GPUs by 2027 and beyond." The goal is not to reduce power consumption but to maximize performance to run bigger AI models. More efficient GPUs simply encourage operators to build bigger data centers, because each efficiency gain justifies consuming even more power. A 75% jump in power consumption is no longer a big deal when the performance gains are this large.
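The trade-off is easy to quantify. The sketch below combines the ratios above (3x the work, 75% more power) with Nvidia's published TDPs of roughly 400 W for an A100 and 700 W for an H100 SXM; the TDP values are an added assumption here, though they are consistent with the 75% figure:

# Performance per watt, H100 vs. A100, plus what Kindig's projected
# cluster sizes imply for GPU power draw alone (no cooling or networking).
a100_watts, h100_watts = 400, 700        # assumed TDPs, consistent with +75%
perf_ratio = 3.0                         # H100 does ~3x the work of an A100
power_ratio = h100_watts / a100_watts    # 1.75x the power
print(f"Performance per watt: {perf_ratio / power_ratio:.2f}x better")  # ~1.71x
for gpus in (10_000, 100_000, 1_000_000):
    print(f"{gpus:>9,} H100s -> {gpus * h100_watts / 1e6:,.0f} MW of GPU power")
# 1,000,000 H100s -> 700 MW, comparable to a mid-sized power plant

At the million-GPU scale Kindig describes, a single cluster's GPUs alone would draw on the order of a power plant's entire output.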
As skeptical as I personally am of AI as it stands right now, it has value and real applications that justify its high energy demand. Still, the current growth rate is alarming. As data centers expand to support AI workloads, there are concerns about where their energy comes from. Data centers must run all day, every day, year-round, so they most often get their power from fossil fuels, which provide consistent, dependable output, unlike intermittent renewables such as solar and wind. Making AI dependent on fossil fuels is unsustainable: it drives emissions and global warming, and we are likely to run out of fossil fuels within the next century. If AI's energy demand keeps growing at its current rate, it will remain dependent on unsustainable fossil fuels unless something changes. Until sustainable energy can reliably meet AI's growing needs, prioritizing efficiency in AI is essential to its future.
Sources
"How much electricity does AI consume?" The Verge. https://www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption
"AI already uses as much energy as a small country. It's only the beginning." Vox. https://www.vox.com/climate/2024/3/28/24111721/climate-ai-tech-energy-demand-rising
"AI Power Consumption: Rapidly Becoming Mission-Critical." Forbes.