All those AI chatbots need a lot of water to stay afloat

When it comes to ChatGPT, the ‘T’ stands for ‘thirsty’
OpenAI’s ChatGPT popularized generative AI with its easy-to-use chatbot interface, and within a year, rival AIs like Google Bard and Microsoft Bing found a home in everything from Workspace apps to Google Chrome. Computing responses and training large language models like OpenAI’s GPT-3.5, which powers the free version of ChatGPT, can be quite expensive and resource-intensive. The chatbot is also available as an Android app now, making AI more accessible. However, all this computation draws significant amounts of power and generates a lot of heat as a byproduct, and your favorite AI might be guzzling a lot more water than you imagined.
Microsoft invested its first billion in OpenAI back in 2019, two years before ChatGPT took the world by storm. The tech giant’s data centers were used to train GPT-4 on enormous amounts of human-generated content. In 2020, Microsoft revealed that the supercomputer built for this task used 285,000 conventional CPU cores and 10,000 graphics processors, which are critical for AI workloads. Understandably, such hardware running at full tilt generates far more heat than your average Android phone, which stays cool by dissipating heat through its back panel. Carrying that heat away at scale usually requires water or other liquid coolants pumped through the facility in mammoth proportions.
A recent Fortune report on the environmental impact of AI chatbots mentions that Microsoft’s use of water for cooling data centers spiked 34% year-on-year between 2021 and 2022 (via Windows Central). It now stands at a whopping 1.7 billion gallons, or as much as 2,500 Olympic swimming pools. Recently, Microsoft vice chair and president Brad Smith revealed that GPT-4 was “made” at the company’s data center in Iowa, cooled by water from the Raccoon and Des Moines rivers. Microsoft claims it uses water only when the temperature soars past 85°F, but even then, the company pumped 11.5 million gallons of water into the cluster a few months before GPT-4 was finished, amounting to around 6% of the entire district’s water consumption.
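That Olympic-pool comparison is easy to sanity-check with some back-of-the-envelope math. The short sketch below assumes the commonly cited figure of roughly 660,000 US gallons (about 2.5 million liters) for a 50-meter Olympic pool; that pool volume is an assumption for illustration, not a number from the report itself.

```python
# Back-of-the-envelope check of the Olympic-pool comparison above.
# The pool volume is an assumed, commonly cited approximation.
GALLONS_PER_OLYMPIC_POOL = 660_000        # ~2.5 million liters
MICROSOFT_WATER_GALLONS = 1_700_000_000   # reported 2022 cooling-water use

pools = MICROSOFT_WATER_GALLONS / GALLONS_PER_OLYMPIC_POOL
print(f"~{pools:,.0f} Olympic pools")     # roughly 2,600, in line with the ~2,500 cited
```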
Understandably, the Iowa city is apprehensive about housing any more Microsoft projects until the company commits to reducing its peak water consumption from current levels, and that’s despite Microsoft’s significant tax contributions to the city. A University of California researcher estimates that, on a per-use basis, OpenAI’s chatbot consumes 16 ounces of water (around 500 ml) for every five to 50 queries users ask the AI. That figure also accounts for the water used to generate power for the data center, and the wide range comes down to seasonal temperatures affecting cooling requirements and the location of the Microsoft facility handling the request.
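For a rough sense of what that means per individual query, the illustrative conversion below simply divides the reported 16-ounce figure across the five-to-50-query range; the milliliter conversion is ours, not from the study.

```python
# Rough per-query water estimate derived from the 16 oz per 5-50 queries figure above.
OUNCES_PER_BATCH = 16      # ~500 ml of water
ML_PER_OUNCE = 29.57       # milliliters in one US fluid ounce

for queries in (5, 50):
    ml_per_query = OUNCES_PER_BATCH * ML_PER_OUNCE / queries
    print(f"{queries} queries -> ~{ml_per_query:.0f} ml of water per query")
# Prints roughly 95 ml per query at the low end and ~9 ml at the high end.
```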
For instance, the Iowa facility is among the coolest-running, while the Arizona data center uses much more water for similar output simply because ambient temperatures there are higher. However, the reckless use of natural resources by multinationals is also accelerating climate change. Despite the initial secrecy surrounding the company’s environmental impact, annual reports and a pledge to be carbon negative and water positive by 2030 have helped Microsoft come clean. We just hope the company also spreads awareness about the environmental impact of AI tools that are mostly free to use.
Chandraveer is a mechanical design engineer with a passion for all things Android including devices, launchers, theming, apps, and photography. When he isn’t typing away on his mechanical keyboard’s heavy linear switches, he enjoys discovering new music, improving his keyboard, and rowing through his hatchback’s gears on twisty roads. 
