. . . Data centers need water to cool their servers, a task that is harder in hotter states like Texas. They could cool with air conditioning instead, but energy is generally a more expensive commodity than water.
When operational, Stargate will use enough energy to power 750,000 homes. To sustain that demand, OpenAI is building its own natural gas power plant. The emergence of these mega data centers that require their own power plants has become another concern for experts on water resources. . . .
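For scale, here is a quick back-of-envelope conversion of the "750,000 homes" figure into continuous power demand. The ~10,500 kWh/year household average is my assumption (roughly the EIA ballpark), not a number from the article:

```python
# Back-of-the-envelope scale check. Assumed: the average US household uses
# roughly 10,500 kWh/year, i.e. about 1.2 kW of continuous demand.
homes = 750_000
avg_home_kw = 10_500 / 8_760          # annual kWh -> average kW per home
total_gw = homes * avg_home_kw / 1e6  # kW -> GW
print(f"~{total_gw:.1f} GW of continuous demand")  # -> ~0.9 GW
```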
The average midsized data center uses 300,000 gallons of water a day, roughly the usage of a thousand homes. Larger data centers might use 4.5 million gallons a day, depending on their type of water cooling system. Austin has 47 such data centers, while the Dallas-Fort Worth area hosts the most in Texas at 189. . . .
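A quick sanity check on the "thousand homes" comparison, assuming the commonly cited ~300 gallons per day for an average US household (the article does not give the per-home figure):

```python
# Checking the article's comparison against an assumed household figure.
midsize_dc_gpd = 300_000   # gallons/day, midsized data center (from the article)
large_dc_gpd = 4_500_000   # gallons/day, large data center (from the article)
home_gpd = 300             # gallons/day per household (assumed, commonly cited)

print(f"midsized ~ {midsize_dc_gpd / home_gpd:,.0f} homes")  # -> 1,000 homes
print(f"large    ~ {large_dc_gpd / home_gpd:,.0f} homes")    # -> 15,000 homes
```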
In a white paper to be released this month, HARC estimates that data centers in Texas will consume 49 billion gallons of water in 2025. They also project that by 2030 that number could rise to 399 billion gallons, or 6.6% of total water use in Texas. . . .
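Working backward from the article's own numbers, the projection implies roughly an eightfold increase in five years and a statewide total near six trillion gallons in 2030:

```python
# Implied figures behind HARC's projection (inputs from the article).
dc_2025_bgal = 49       # billion gallons, 2025 estimate
dc_2030_bgal = 399      # billion gallons, 2030 projection
share_2030 = 0.066      # 6.6% of total Texas water use in 2030

growth = dc_2030_bgal / dc_2025_bgal
texas_total_tgal = dc_2030_bgal / share_2030 / 1_000  # billions -> trillions
print(f"~{growth:.1f}x growth in five years")                     # -> ~8.1x
print(f"implies ~{texas_total_tgal:.1f} trillion gallons total")  # -> ~6.0 trillion
```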
Texas AI data center water usage: Amid worsening drought conditions in Texas, the state’s residents face an unexpected water dilemma: while communities are urged to conserve every drop, even cutting back on simple showers, massive new AI data centers are quietly using millions of gallons a day to keep their operations running, according to a report.
That was my first thought. What are they doing with the hot water?
I’m thinking they could store that heat in salt domes and reuse the cooled water in their chillers. Then the stored heat could be used for district heating in the winter.
I don’t know why, but when water is used as an industrial coolant, industry likes to cool the water back down again each cycle, resulting in steam loss.
Here are some old-school cooling towers
(people think these are only for nuclear plants, but they are not)
All data centers reuse their water, and there are various ways to cool the water back down before sending it through the loop again. All of them result in some evaporation.
So not only do they have to replace that water, but the remaining water also becomes dense with calcium and eventually has to be replaced. (I think; maybe it can be diluted to a point, but eventually it’s just too filled with calcium.)
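That intuition matches the standard cooling-tower water balance: makeup water covers both evaporation and "blowdown", the water deliberately discarded so dissolved minerals don’t concentrate past what the loop tolerates. A minimal sketch with illustrative numbers (the evaporation rate and cycle counts are assumptions, not figures from this thread):

```python
# Rough cooling-tower water balance using the standard relations.
# Evaporation removes pure water, so minerals concentrate in the loop;
# "cycles of concentration" (C) is the loop concentration vs. makeup water.

def makeup_water(evaporation_gpd: float, cycles: float) -> tuple[float, float]:
    """Return (blowdown, total makeup) in gallons/day, ignoring drift losses."""
    blowdown = evaporation_gpd / (cycles - 1)   # classic B = E / (C - 1)
    return blowdown, evaporation_gpd + blowdown # makeup M = E + B

for c in (2, 4, 6):
    b, m = makeup_water(evaporation_gpd=100_000, cycles=c)
    print(f"C={c}: blowdown {b:,.0f} gal/day, makeup {m:,.0f} gal/day")
# Higher cycles save water, but hard (calcium-rich) water caps how high C can go.
```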
I’ll try to find the article I read on this a while ago. It’s kind of crazy how much heat these centers create.
A key reason for the high water consumption is limited water reuse in cooling. During the cooling process, part of the freshwater evaporates, and the remaining water becomes wastewater. Wastewater is often contaminated with dust, chemicals and minerals, which hamper the efficiency of the cooling process if it is circulated back. Consequently, data centres are often unable to reuse wastewater to the fullest extent.
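One way to see why contaminated or hard makeup water limits reuse: the achievable cycles of concentration is roughly the loop’s tolerable mineral level divided by the feed water’s level, so harder feed means more blowdown per gallon evaporated. A hedged illustration; the 1,200 ppm tolerance and feed hardness values are assumed for the example, not taken from the article:

```python
# How makeup-water quality caps reuse (illustrative numbers only).
# At steady state, loop concentration ~= C x feed concentration, so the
# max cycles of concentration is the scaling threshold over the feed level.
loop_hardness_limit_ppm = 1_200      # assumed scaling threshold for the loop
for feed_ppm in (100, 200, 400):     # soft -> hard makeup water
    max_cycles = loop_hardness_limit_ppm / feed_ppm
    print(f"feed {feed_ppm} ppm -> max ~{max_cycles:.0f} cycles of concentration")
# Harder feed water forces more frequent blowdown, i.e., less effective reuse.
```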