Why did training GPT-3, the model behind ChatGPT, consume 700,000 liters of water?
New research reveals that the water consumed during GPT-3's training was enough to manufacture 370 BMW cars or 320 Tesla electric vehicles.
A team of researchers at the University of California, Riverside and the University of Texas at Arlington studied how water is used in training large AI models such as OpenAI's ChatGPT and Google's Bard, Interesting Engineering reported on April 12.
Data centers require huge amounts of water for cooling. (Photo: iStock).
When calculating the amount of water used by AI, the team distinguished between 'withdrawal' and 'consumption.' Withdrawal refers to taking water from a river, lake, or other source, while consumption refers to the water lost through evaporation when it is used in data centers. The new study focused primarily on consumption. The team said AI's massive water footprint should be tackled as part of the collective effort to address global water challenges.
The average data center uses about 3.8 liters of water per kWh, new research reveals. And not just any water will do. To prevent corrosion or bacterial growth—which can happen with seawater—data centers draw their water from pristine freshwater sources. Fresh water is also needed to control humidity in the room.
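The 3.8 liters-per-kWh figure lends itself to a quick back-of-envelope estimate. The sketch below applies it to a hypothetical energy figure; the 100,000 kWh workload is an illustrative assumption, not a number from the article.

```python
# Back-of-envelope estimate of direct cooling-water consumption,
# using the study's figure of ~3.8 liters of water per kWh.
LITERS_PER_KWH = 3.8  # average data-center water use cited in the research


def cooling_water_liters(energy_kwh: float) -> float:
    """Estimate liters of cooling water consumed for a given energy use."""
    return energy_kwh * LITERS_PER_KWH


# Hypothetical example: a workload drawing 100,000 kWh of electricity.
print(cooling_water_liters(100_000))  # 380000.0 liters
```

Note that this covers only direct cooling water; the 'indirect external water consumption' from electricity generation, discussed below, would come on top of it.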
Data centers are also responsible for 'indirect external water consumption,' the scientists said, specifically the water used to generate the huge amounts of electricity that power the centers.
According to the study, large data centers used for AI training require large amounts of water for cooling, which in ChatGPT's case could fill the cooling towers of a nuclear power plant. GPT-3's training process alone required 700,000 liters of water, enough to produce 370 BMWs or 320 Teslas.
According to the new research, a typical user interaction with ChatGPT is equivalent to pouring a large bottle of water out on the ground. ChatGPT, which followed GPT-3, would need to 'drink' a 500 ml bottle of water to complete a basic conversation of about 25 to 50 user questions.
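The 500 ml-per-conversation figure implies a rough per-question cost, which the sketch below works out from the article's 25-to-50-question range:

```python
# Rough per-question water estimate from the study's figure:
# ~500 ml consumed over a conversation of 25 to 50 questions.
BOTTLE_ML = 500


def ml_per_question(num_questions: int) -> float:
    """Average milliliters of water per question for a conversation."""
    return BOTTLE_ML / num_questions


print(ml_per_question(50))  # 10.0 ml per question (long conversation)
print(ml_per_question(25))  # 20.0 ml per question (short conversation)
```

So each question costs on the order of 10 to 20 ml of water, roughly a few sips.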
Microsoft, which has partnered with OpenAI for years and spent billions of dollars building supercomputers to train AI, said its latest supercomputer contains 10,000 graphics cards and more than 285,000 processor cores. The machine will require massive, powerful cooling equipment.
Experts worry that with the popularity of chatbots, the amount of water they consume could harm water supplies, especially given past droughts and future environmental instability.
The new study also estimates the water used to train an AI model, assuming the training takes place at Microsoft's state-of-the-art data center in the US; water use could triple if the training occurred in a less energy-efficient facility. With newer models such as the recently released GPT-4 relying on more data than their predecessors, the team predicts that water demand will continue to rise.