Here are a range of estimates of AI data center water use for California, based mostly on simple fundamental physics of converting energy use to water use for cooling. I did these calculations and then, perhaps appropriately, checked and explored these estimates using four AI models
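The "simple fundamental physics" mentioned above can be sketched roughly as follows. This is a minimal back-of-the-envelope sketch, not the article's actual model: the latent-heat figure and the assumption that all waste heat is rejected by evaporation are mine.

```python
# Rough sketch: liters of water evaporated per kWh of data center load,
# ASSUMING all waste heat is rejected via evaporative cooling
# (ignores blowdown, cycles of concentration, dry cooling, etc.).
MJ_PER_KWH = 3.6               # 1 kWh = 3.6 MJ (exact)
LATENT_HEAT_MJ_PER_KG = 2.45   # latent heat of vaporization near 25 C (assumed)

def liters_evaporated_per_kwh() -> float:
    # 1 kg of water is about 1 liter
    return MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG

print(f"{liters_evaporated_per_kwh():.2f} L/kWh")  # ~1.47 L/kWh
```

Under these assumptions you get roughly 1.5 liters evaporated per kWh; real facilities vary widely depending on cooling design and climate.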
He guessed, then asked the sycophant machine to tell him he is a smart and clever boy.
To illustrate his point he used a nightmare-inducing AI generated image with talking headless birds and water flowing uphill into both ends of a tube.
Did OP expect this not to be downvoted here?
Exactly. How dare someone post anything that threatens Fedi’s visceral, quasi-religious hatred of … neural networks
NNs suck. Unless you’re a God programmer, following a non-trivial NN along to its output is bonkers. It might as well be spaghetti code, written in Latin, in a black box.
Can’t imagine why people don’t love having their work used to produce slop competition - how superstitious and irrational! /s
This is not Reddit. Downvotes do not matter.
True, but downvoted posts get even fewer views.
I’m downvoting because even a common HVAC tech could discredit this author into obscurity. This entire article is laced with assumptions, probably as a result of improper AI use.
Edit: As a datacenter solutions architect I can also tell you this article is garbage, because everything comes down to BTUs. The higher the electrical use, the higher the heat output, resulting in the equivalent tons of cooling needed to negate it. Go look up ‘BTU calculator’ if you think I’m full of shit.
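The electrical-load-to-cooling-tonnage arithmetic described above can be sketched like this. The conversion factors are the standard ones; the function name and the assumption that essentially all electrical power ends up as heat are mine.

```python
BTU_PER_HR_PER_KW = 3412.14    # 1 kW of electrical load ~= 3412 BTU/hr of heat
BTU_PER_HR_PER_TON = 12_000    # 1 ton of cooling = 12,000 BTU/hr (definition)

def tons_of_cooling(electrical_kw: float) -> float:
    """Cooling tonnage needed to negate a given electrical load,
    assuming essentially all electrical power is dissipated as heat."""
    return electrical_kw * BTU_PER_HR_PER_KW / BTU_PER_HR_PER_TON

print(f"{tons_of_cooling(1000):.0f} tons")  # 1 MW -> ~284 tons of cooling
```

Note this tells you how much heat must be rejected, not how much water that rejection consumes; that depends on whether the cooling is evaporative, dry, or a hybrid.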
Yes, it uses less than the public thinks, but it still uses too much. 1 ounce is too much. AI can fuck right off. It is not entitled to our world.
Slurp gobble.