Українська правда

Training the Grok 4 AI cost $490 million and consumed as much energy as a town of 4,000 people uses in a year

- 17 September, 04:16 PM

According to Epoch AI estimates, training Grok 4 cost Elon Musk's xAI company approximately $490 million, roughly nine times the cost of training Meta's Llama 3. The cost reflects the scale of the infrastructure: the model was trained on the Colossus supercomputer, built in Memphis at the site of a former Electrolux factory.

Colossus is currently considered the largest AI supercomputer in the world. It comprises over 200,000 NVIDIA GPUs (H100, H200 and the latest GB200), with plans to scale to a million GPUs. The data centers are cooled using on-site wastewater infrastructure and draw hundreds of megawatts of power.

It is estimated that training Grok 4 required about 310 million kWh of electricity (equivalent to the annual consumption of a town of 4,000 people) and 750 million liters of water for cooling (about 300 Olympic-sized swimming pools), and resulted in emissions of about 140,000 tons of CO₂, roughly what a Boeing airliner emits over three years of flights.
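The conversions above can be sanity-checked from the article's own figures. A minimal sketch, assuming an Olympic pool holds about 2.5 million liters (50 m × 25 m × 2 m) and a grid emission factor of roughly 0.45 kg CO₂ per kWh; both constants are illustrative assumptions, not figures from the article:

```python
# Sanity-check the article's unit conversions using its reported totals.

ENERGY_KWH = 310e6            # estimated training energy, from the article
WATER_LITERS = 750e6          # cooling water, from the article
OLYMPIC_POOL_LITERS = 2.5e6   # assumed: 50 m x 25 m x 2 m = 2,500 m^3
EMISSION_KG_PER_KWH = 0.45    # assumed grid-average emission factor

pools = WATER_LITERS / OLYMPIC_POOL_LITERS
co2_tons = ENERGY_KWH * EMISSION_KG_PER_KWH / 1000  # kg -> metric tons

print(f"Cooling water ≈ {pools:.0f} Olympic pools")
print(f"CO₂ at 0.45 kg/kWh ≈ {co2_tons:,.0f} tons")
```

With these assumptions the water figure works out to exactly 300 pools, and the emissions come to about 139,500 tons, consistent with the article's "about 140,000 tons."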

xAI positions Grok 4 as "the world's smartest model," capable of solving complex mathematical and scientific problems, programming, and handling long context. The Grok 4 Code version is aimed at developers and provides advanced capabilities for code autocompletion and debugging.

Notably, after the model's release it emerged that Grok 4 consults Elon Musk's public statements when forming responses on sensitive topics.