Artificial intelligence, including ChatGPT, uses more water than expected. In the case of OpenAI's chatbot, the amount of water needed to cool the hardware is four times higher than previously estimated. This was reported by San.
Researchers at the University of California, Riverside conducted a study which found that processing 10 to 50 requests in AI-based chatbots such as ChatGPT requires up to two liters of water to cool the equipment. Previously, the same number of requests was believed to use only half a liter.
The sharp rise in water consumption is driven by the intense cooling needs of the data centers that process AI requests. Water usage is growing not only for ChatGPT but also for AI products from other tech giants.
Microsoft has also acknowledged that the water and energy needs of artificial intelligence models turned out to be higher than expected. From 2023 to 2024, Microsoft, Google, and Meta reported increases in water use of 22.5%, 17%, and 17%, respectively.
Although these tech giants are based in the United States, the problem is not confined to one country. According to preliminary estimates, data centers in the UK will use as much water as a city the size of Liverpool, and in Ireland data centers are responsible for 21% of the country's energy consumption.