AI’s a weird industry. So far almost no one is making any money, certainly not the major Western AI companies, Anthropic and OpenAI. Every query costs more than the revenue it generates. The primary beneficiary has been NVidia: they’re making money hand over fist, and suppliers of data centers and power have big customers in AI. But AI itself doesn’t make money. (Not Western AI, anyway. Deepseek, which is 20 to 30 times cheaper, probably is.)

The energy required for Western AI is huge, and it’s mostly dirty energy. AI mostly requires 24/7 power, which largely rules out intermittent renewables like wind and solar. It needs nuclear or carbon-intensive sources like coal and natural gas turbines. MIT did a massive dig into this in March.

The researchers were clear that adoption of AI and the accelerated server technologies that power it has been the primary force causing electricity demand from data centers to skyrocket after remaining stagnant for over a decade. Between 2024 and 2028, the share of US electricity going to data centers may triple, from its current 4.4% to 12%.
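
To put that tripling in rough absolute terms, here’s a back-of-envelope sketch. The total US consumption figure is my own assumption for illustration, not a number from the MIT piece:

```python
# Back-of-envelope: what tripling the data-center share of US electricity means.
# ASSUMPTION: total US electricity consumption of roughly 4,000 TWh/year (illustrative).
US_TOTAL_TWH = 4_000

share_2024 = 0.044  # 4.4% of US electricity going to data centers (figure from the article)
share_2028 = 0.12   # projected 12% share by 2028 (figure from the article)

print(f"2024: ~{US_TOTAL_TWH * share_2024:.0f} TWh/year")                        # ~176 TWh
print(f"2028: ~{US_TOTAL_TWH * share_2028:.0f} TWh/year")                        # ~480 TWh
print(f"Added demand: ~{US_TOTAL_TWH * (share_2028 - share_2024):.0f} TWh/year") # ~304 TWh
```

Roughly 300 extra terawatt-hours a year is on the order of a whole mid-sized industrialized country’s annual electricity consumption.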

AI companies are also planning multi-gigawatt constructions abroad, including in Malaysia, which is becoming Southeast Asia’s data center hub. In May OpenAI announced a plan to support data-center buildouts abroad as part of a bid to “spread democratic AI.” Companies are taking a scattershot approach to getting there—inking deals for new nuclear plants, firing up old ones, and striking massive deals with utility companies.

Nature came up with this chart. As they note, it’s a lower bound, because if it were too high, AI companies would have said so.

AI is a lot more energy-intensive than traditional methods. For example, compare an AI query to a Google search. (Granted, Google search sucks, but that’s because Google wants it to suck.)

It’s long been noted that one of the biggest problems with climate change is that we can expect it to reduce the amount of fresh water available. AI gobbles that up:

AI is also thirsty for water. ChatGPT gulps roughly a 16-ounce bottle in as few as 10 queries, calculates Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside, and his colleagues.
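
A quick sketch of what Ren’s figure works out to per query, and at scale. The daily query volume below is my own illustrative assumption, not a number from his work:

```python
# Per-query water use implied by "a 16-ounce bottle in as few as 10 queries."
BOTTLE_ML = 473           # 16 US fluid ounces is roughly 473 ml
QUERIES_PER_BOTTLE = 10

per_query_ml = BOTTLE_ML / QUERIES_PER_BOTTLE  # ~47 ml per query

# ASSUMPTION: ~1 billion queries per day, purely for a sense of scale.
daily_queries = 1_000_000_000
daily_litres = per_query_ml * daily_queries / 1_000  # ml -> litres

print(f"~{per_query_ml:.0f} ml per query")
print(f"~{daily_litres / 1e6:.0f} million litres per day at 1 billion queries/day")
```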

But here’s the kicker:

ChatGPT 5 power consumption could be as much as eight times higher than GPT 4 — research institute estimates medium-sized GPT-5 response can consume up to 40 watt-hours of electricity

Whoa! That kind of puts paid to assumptions of a 10% annual rise and the like. New models’ energy use doesn’t look like it’s scaling linearly.
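
Some rough arithmetic on what those numbers imply. The 40 Wh and eight-times figures come from the report quoted above; the per-search comparison and the daily query volume are my own assumptions, purely for scale:

```python
# What "up to 40 Wh per medium-sized GPT-5 response" implies.
GPT5_WH = 40                 # upper-end estimate quoted above
GPT4_WH = GPT5_WH / 8        # if GPT-5 is up to 8x GPT-4, GPT-4 is ~5 Wh per response
SEARCH_WH = 0.3              # ASSUMPTION: commonly cited figure for one web search

print(f"GPT-4: ~{GPT4_WH:.0f} Wh per response")
print(f"GPT-5 at the high end: ~{GPT5_WH / SEARCH_WH:.0f}x a plain web search")

# ASSUMPTION: ~1 billion such responses per day, purely for scale.
daily_responses = 1_000_000_000
yearly_twh = GPT5_WH * daily_responses * 365 / 1e12  # Wh -> TWh

print(f"~{yearly_twh:.0f} TWh per year at that rate")  # ~15 TWh/year
```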

We have a climate change problem already: lots of extreme weather, disrupted rainfall patterns, and massive wildfires. The permafrost is bubbling and releasing methane, and Arctic temperatures are absurd (hitting 30 degrees Celsius in some cases).

Now if this tech were truly transformative, if it made everything so much better, maybe it would be worth it. But so far, with a few exceptions (mostly running through millions of combinations to assist research), it seems like it’s better search, automatic image generation, a great way for students to cheat, and maybe a way to make programming faster. (There’s some dispute about that; one study found it made coders slower.) So far agents are duds, unable even to run a vending machine.

On the downside, even AI boosters claim it’s likely to put vast numbers of people out of work if it does work, wiping out entire fields of employment, including SFX, illustration, art, writing, customer service, and perhaps most entry-level jobs. We’re told AI carries a small but existential risk of wiping out humanity. It gobbles water and energy and causes pollution.

What, exactly, are we expecting to get from AI (other than NVidia making profits) that is worth these costs? Does it make sense to be rushing forward this fast, and in this way? Deepseek has shown AI doesn’t have to use so many resources, but Western AI companies are doing the opposite of reducing their resource draw. Eight times as much energy? How much more energy will GPT-6 use?

It seems like we’re unable to control our tech at all. This used to be the killer argument: “well, there’s no controlling it, so why even try?”

But China’s AI uses way less energy. Apparently China can control it, and we can’t? So it’s not about “can’t”; it’s about “won’t.” Using fewer resources would mean less money sloshing around making various tech bros rich, I guess, and we can’t have that.

And all this for an industry in which the primary actors, OpenAI and Anthropic, aren’t even making money.

Perhaps we could be using these resources in a better way? China is spending its money on producing three-quarters of the world’s renewable energy and on ramping up nuclear power. Its carbon emissions are actually down. Its economy is growing far faster than ours. It has almost completely moved over to electric cars, it has high-speed trains, and its space program is going gangbusters. All this while rents have fallen by over a third in the past five years.

You don’t have to be an AI skeptic to think “maybe this is a misallocation of resources?” Is it really going to change everything so much that it “makes America great again”? Is Western AI so much better than Chinese AI that it will make that difference, even if AI is as big a deal as its greatest boosters say?

Maybe the US and Europe should be concentrating on more than just AI? Not letting China continue to march ahead in almost every field while they bet almost all the marbles on one big project in which they barely have a lead anyway?

I don’t want to overstate this issue. The amount of energy and water used doesn’t come close to, say, expected increases in air conditioning. (Though if increases in draw keep ramping up the way GPT-5’s has, we’ll see. And the more energy we use, the more air conditioning we need, thanks to a fairly obvious feedback loop.) But still, what are we getting for it?

Just some things to think about.

***

If you’ve read this far, and you read a lot of this site’s articles, you might wish to Subscribe or donate. The site has over 3,500 posts, and both the site and Ian take money to run.