Artificial intelligence could consume more energy than Bitcoin mining by the end of 2025
Artificial intelligence could soon surpass Bitcoin mining in electricity consumption, according to a new study in the journal Joule. By the end of 2025, data centers built for AI could account for almost half of all electricity consumed by data centers worldwide, The Verge reports, citing research by Alex de Vries-Gao of the Institute for Environmental Studies at Vrije Universiteit Amsterdam.
De Vries-Gao, known for his research on cryptocurrency energy consumption, estimates that AI already accounts for up to 20% of data center electricity use. Because tech companies do not publish direct figures, he worked backwards from the hardware: he analyzed production volumes of specialized AI chips, drawing on TSMC's manufacturing data for NVIDIA and AMD processors, and combined them with the chips' power consumption specifications. By this estimate, AI's electricity needs will grow to 23 GW – the equivalent of the consumption of all UK data centers.
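The arithmetic behind such a bottom-up estimate is simple: multiply estimated chip shipments by each chip's power draw, then adjust for average utilization and data center overhead. The sketch below illustrates the idea only; every figure in it is an invented placeholder, not data from the study.

```python
# Minimal sketch of a bottom-up demand estimate in the spirit of the
# shipments-times-power-draw approach. All numbers are hypothetical.

# (chip, assumed units shipped, assumed rated board power in watts)
shipments = [
    ("accelerator_a", 3_000_000, 700),  # hypothetical NVIDIA-class GPU
    ("accelerator_b", 500_000, 750),    # hypothetical AMD-class GPU
]

UTILIZATION = 0.65  # assumed average load factor
PUE = 1.2           # assumed overhead (power usage effectiveness)

# Aggregate rated power of the installed base, in watts
total_w = sum(units * watts for _, units, watts in shipments)

# Continuous demand after utilization and facility overhead, in gigawatts
demand_gw = total_w * UTILIZATION * PUE / 1e9

print(f"Estimated continuous AI power demand: {demand_gw:.1f} GW")
```

With these placeholder inputs the script prints about 1.9 GW; the real estimate depends entirely on the shipment and power figures fed in, which is why production data from chipmakers matters so much.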
That is almost double the current draw of the Bitcoin mining network (about 10 GW). The surging demand is already forcing U.S. utilities to plan new gas-fired power plants and even to restart shelved nuclear projects. Experts warn that such abrupt jumps in load could destabilize grids and slow the transition to renewable energy, echoing the situation around large-scale crypto mining farms.
Consulting firm ICF likewise predicts that by 2030, electricity demand from U.S. data centers will grow by 25%, with AI and conventional cloud services accounting for the bulk of that growth.
Accurate accounting of AI's energy consumption is complicated by a lack of transparency from big tech companies, whose reports do not break out the share of emissions or electricity use attributable to AI. "It is unacceptably difficult to get even a rough figure," says de Vries-Gao. He calls for mandatory disclosure of this data so that governments can set informed energy and climate policy.
Some startups, such as DeepSeek, claim that their models need an order of magnitude less energy than commercial counterparts. Even so, there is a risk of the Jevons paradox: greater efficiency can encourage even wider use of AI and, in the end, drive total energy consumption up.
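The Jevons paradox is easy to see with toy numbers: if energy per query falls tenfold but cheaper inference drives usage up twentyfold, total consumption still doubles. A minimal illustration, with made-up figures:

```python
# Toy illustration of the Jevons paradox. All numbers are invented.
energy_per_query_wh = 3.0    # hypothetical baseline energy per query (Wh)
queries_per_day = 1_000_000  # hypothetical baseline usage

# Baseline daily consumption, in MWh
baseline_mwh = energy_per_query_wh * queries_per_day / 1e6

# After a 10x efficiency gain that triggers 20x more usage
efficient_mwh = (energy_per_query_wh / 10) * (queries_per_day * 20) / 1e6

print(f"baseline: {baseline_mwh:.1f} MWh/day")                 # 3.0
print(f"10x efficiency, 20x usage: {efficient_mwh:.1f} MWh/day")  # 6.0
```

Despite the tenfold efficiency gain, daily consumption in this toy scenario doubles, which is exactly the dynamic researchers fear for cheap, efficient AI models.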