AI Power Use and Efficiency

Artificial Intelligence (AI) systems are becoming increasingly complex and specialized, driving demand for more powerful analytical tools tailored to specific needs [5]. As machine learning models have grown more advanced and far more numerous, the computational power required to develop them has doubled every five to six months since about 2010 [5]. That investment means AI models can now reliably recognize language and images, transcribe audio into analyzable data, power chatbots, and automate simple tasks [5].
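
To make that growth rate concrete, a quick back-of-envelope calculation (a sketch, not taken from the article's sources) shows how a five-to-six-month doubling time compounds over a decade:

```python
# Back-of-envelope sketch: how a 5-6 month doubling time in
# training compute compounds over a decade.

def growth_factor(years: float, doubling_months: float) -> float:
    """Total growth if compute doubles every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

for months in (5, 6):
    print(f"doubling every {months} months -> "
          f"{growth_factor(10, months):.3g}x over 10 years")
```

At that pace, ten years of progress means roughly a million- to ten-million-fold increase in compute per model, which is why efficiency has become a first-order concern.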

However, the rise of AI has driven ever-increasing demand for energy, raising concern that the costs may outweigh the technology's promised benefits [6]. Substantial efforts are underway to improve the energy efficiency of modern AI algorithms, but before energy consumption can be optimized it must be quantified, down to the underlying power (and water) draw [6]. AI's energy consumption could grow far larger still: models have trended steadily upward in size for years, and bigger models require more energy for both training and inference [7]. On the other hand, companies are applying proven data center efficiency methods to make these systems do more with less energy, which has so far dampened the upward trend in energy costs [7] and kept these facilities far more efficient per watt than other building types.
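
As a minimal illustration of the quantification step described above, the energy for a training run can be estimated as average IT power times runtime, scaled by the facility's power usage effectiveness (PUE). Every figure below is a hypothetical placeholder, not a measurement from the article:

```python
# Minimal sketch of quantifying a training run's energy draw.
# All numbers here are hypothetical placeholders.

def training_energy_kwh(avg_power_kw: float, hours: float, pue: float) -> float:
    """Facility-level energy: IT load times runtime, scaled by PUE."""
    return avg_power_kw * hours * pue

# e.g. 512 GPUs at ~0.4 kW each, running two weeks in a PUE-1.2 facility
it_load_kw = 512 * 0.4
print(f"{training_energy_kwh(it_load_kw, hours=14 * 24, pue=1.2):,.0f} kWh")
```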

The energy sector is taking early steps to harness AI to boost efficiency and accelerate innovation [1]. The technology is uniquely placed to support the growth of smart grids and to handle the massive quantities of data they generate [1]. Smart meters send several thousand times more data points to utilities than their analogue predecessors [1], and dedicated AI systems that sift through this data to optimize power output have already delivered gains. New devices for monitoring grid power flows funnel more than an order of magnitude more data to operators than the technologies they replace [1]. And the global fleet of wind turbines is estimated to produce more than 400 billion data points per year [1], documenting that the turbines deliver sustainable, reliable power throughout their lifecycles. This volume is a key reason energy firms see AI as an increasingly critical resource [1].
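
For a sense of scale, the 400-billion-point figure cited above [1] works out to a continuous stream of well over ten thousand data points every second, around the clock; a quick sketch of the arithmetic:

```python
# Scale check on grid telemetry: 400 billion data points per year
# from the global wind fleet [1], expressed as a per-second rate.

points_per_year = 400e9
seconds_per_year = 365 * 24 * 3600
print(f"{points_per_year / seconds_per_year:,.0f} data points per second")
# -> roughly 12,700 points per second, around the clock
```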

Cryptocurrency Mining Power Use and Efficiency

Bitcoin mining is a power-hungry process that relies on heavy computation to verify transactions [8], making mining centers some of the least efficient computing facilities operating today. Bitcoin is estimated to consume electricity at an annualized rate of 127 terawatt-hours (TWh), exceeding the entire annual electricity consumption of Norway [6]. Bitcoin uses 707 kilowatt-hours (kWh) of electricity per transaction, 11 times that of Ethereum [6], and the figure is growing.
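
As a rough consistency check on those two cited figures [6], dividing the annualized consumption by the per-transaction energy gives the network's implied transaction throughput; a sketch:

```python
# Rough consistency check on the two cited figures: annualized network
# consumption (127 TWh [6]) over per-transaction energy (707 kWh [6]).

annual_twh = 127
kwh_per_tx = 707

tx_per_year = annual_twh * 1e9 / kwh_per_tx   # 1 TWh = 1e9 kWh
print(f"implied throughput: {tx_per_year / 365:,.0f} transactions per day")
```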

Cryptocurrency mining in general is energy-intensive: a rig with three GPUs can consume 1,000 watts of power or more while running [5]. Crypto mining businesses can concentrate hundreds or even thousands of rigs in one location wherever power is cheap [5], regardless of how that power is produced or whether the market is regulated. A mining center in Kazakhstan is equipped to run 50,000 mining rigs, and a mining farm in China pays a monthly electricity bill of more than $1 million while mining fewer than 750 bitcoins a month [5]; the payback shrinks every year.
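
A back-of-envelope reading of those figures [5], assuming each rig draws about 1 kW per the three-GPU figure (real rigs vary widely), gives the rough scale of a large farm's demand and its electricity cost per coin:

```python
# Back-of-envelope on the two farms cited above [5].
# Assumes ~1 kW per rig; actual rigs vary widely.

rigs = 50_000
kw_per_rig = 1.0
print(f"Kazakhstan farm demand: ~{rigs * kw_per_rig / 1000:,.0f} MW")

monthly_bill_usd = 1_000_000
btc_per_month = 750
print(f"electricity cost: >${monthly_bill_usd / btc_per_month:,.0f} per bitcoin")
```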

As of August 2022, Bitcoin alone is estimated to account for 60-77% of global crypto-asset electricity usage [11], because it relies on the far more energy-intensive proof-of-work (PoW) protocol while many other chains use proof of stake (PoS). According to a White House report, Bitcoin mining consumed 50 billion kilowatt-hours of electricity in 2022, underscoring the scale of its energy usage [11].
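
The energy gap between PoW and PoS comes down to brute force: PoW miners must try hashes until one clears a difficulty target, and each added bit of difficulty doubles the expected number of attempts. A toy illustration of the general mechanism (not Bitcoin's actual parameters):

```python
# Toy proof-of-work loop showing why PoW energy scales with difficulty:
# each extra bit of difficulty doubles the expected hash attempts.
import hashlib

def mine(header: bytes, difficulty_bits: int) -> tuple[int, int]:
    """Return (nonce, attempts) for the first hash under the target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, nonce + 1
        nonce += 1

for bits in (8, 12, 16):
    _, attempts = mine(b"block-header", bits)
    print(f"{bits} difficulty bits: {attempts:,} hash attempts")
```

Bitcoin's real difficulty is dozens of bits higher, which is why mining migrated to specialized ASICs and why network power draw tracks difficulty. PoS replaces this race entirely, selecting validators by stake rather than by hashes burned.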

Comparison

Both AI and cryptocurrency mining require substantial amounts of energy, but their consumption patterns differ. AI's energy use is tied to the complexity and size of its models, which have grown in both scale and efficiency over time [7]. Cryptocurrency mining's energy use, by contrast, is tied to transaction volume and the computational difficulty of mining [5].

While both technologies offer significant benefits, Bitcoin's energy consumption patterns raise concerns about the cryptocurrency's sustainability and environmental impact. It is crucial to continue researching and implementing strategies to improve the energy efficiency of both [6][1], as each will keep growing year over year.

The adoption of structured, quantitative assessment policies for AI workload energy consumption is an indispensable step toward sustainable AI development cycles [2]. Optimizing energy efficiency will require considering all aspects of AI development, including the hardware, models, and data [2], as well as the infrastructure supporting this growth.
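
A minimal sketch of what such workload-level accounting could look like in practice, assuming an NVIDIA system where the `nvidia-smi` utility is on the PATH (the sampling loop itself is a hypothetical illustration, not a standard tool):

```python
# Sketch of workload-level energy accounting: sample GPU power draw
# via nvidia-smi and integrate over time. Assumes NVIDIA GPUs.
import subprocess
import time

def sample_gpu_watts() -> float:
    """Sum instantaneous power draw (W) across all visible GPUs."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"], text=True)
    return sum(float(line) for line in out.splitlines() if line.strip())

def log_energy(duration_s: int, interval_s: int = 5) -> float:
    """Rectangle-rule integration: watts * interval, summed, as kWh."""
    joules = 0.0
    for _ in range(duration_s // interval_s):
        joules += sample_gpu_watts() * interval_s
        time.sleep(interval_s)
    return joules / 3.6e6   # joules -> kWh

print(f"GPU energy over 5 min: {log_energy(300):.4f} kWh")
```

Logging of this kind captures only the IT load; a full accounting would also apply the facility's PUE, as in the earlier estimate.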

Going forward, the only likely way Bitcoin will be able to compete with AI on efficiency is by switching its protocol to PoS to match its peers. Even then, mining operations are unlikely to ever keep pace with AI-optimized data centers on compute, facility, or infrastructure efficiency.

References

1. Bitcoin Mining: How Much Electricity It Takes and Why People Are ... - CNET

2. Why Does Bitcoin Use So Much Energy? - Forbes Advisor

3. Tracking electricity consumption from U.S. cryptocurrency mining ...

4. Why AI and energy are the new power couple - Analysis - IEA

5. Bitcoin consumes 'more electricity than Argentina' - BBC

6. 60+ Bitcoin Mining and Energy Statistics Updated for 2024 - Techopedia

7. Reporting electricity consumption is essential for sustainable AI - Nature

8. How much electricity do AI generators consume? - The Verge

9. How AI Can Optimize Energy Efficiency and Reduce Carbon Emissions

10. How Much Power Does Crypto Use? The Government Wants to Know

11. Bitcoin Uses More Electricity Than Many Countries. How Is That Possible ...

© 2022, Green Data Center Guide. All Rights Reserved.