You clearly don't even believe what you're writing. You keep pushing back against the claim that AI isn't (present tense) a large chunk of data centre power usage, and then try to counter it by pointing at growth trends. Either you (A) understand that AI *isn't* where most datacentre power is used and are pointing at trends as a distraction, or (B) you don't understand the difference between a current share and a growth trend.
Furthermore, your own citations undercut your claims.
From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything
No, it didn't. Datacentre usage started spiking in 2017, long before AI was any sort of meaningful "thing". GPT-2 came out in 2019, and that's a mere 1.5B-parameter model (you can train it today for about $10). Even with the much worse hardware and software performance of that era, the energy needed to train it and the other AI models of the time was negligible compared to other server loads. Server power usage was spiking, and AI had nothing to do with it. It was spiking because of Bitcoin.
Yes, AI has been on an exponential growth trend. You love to fast-forward trends into some questionable future, while forgetting that rewinding an exponential trend makes it rapidly diminish toward irrelevance. AI power consumption TODAY is about 40 TWh/yr, vs. Bitcoin alone at 155-172 TWh per year. Since AI is growing faster than Bitcoin, that already extreme ratio in favour of Bitcoin (and other crypto) becomes far more dramatic the further back you go. And it wasn't just crypto pushing the post-2017 data centre growth trend, mind you; general cloud services alone have been exploding in demand since then.
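To make the "rewind an exponential" point concrete, here's a quick back-of-envelope sketch in Python. The growth rates are illustrative assumptions I'm plugging in, not measured figures; the only point is that dividing out a faster growth rate shrinks that curve much more quickly as you go back in time:

    # Back-of-envelope sketch (my own toy numbers, not the report's):
    # back-project today's consumption under ASSUMED growth rates.
    ai_today_twh = 40        # rough AI power use today, from the figures above
    btc_today_twh = 160      # midpoint of the 155-172 TWh/yr Bitcoin range above
    ai_growth = 2.0          # illustrative assumption: AI use roughly doubles per year
    btc_growth = 1.2         # illustrative assumption: Bitcoin grows ~20% per year

    for years_back in range(8):
        ai = ai_today_twh / ai_growth ** years_back
        btc = btc_today_twh / btc_growth ** years_back
        print(f"{years_back} yrs ago: AI ~{ai:5.1f} TWh, Bitcoin ~{btc:6.1f} TWh, "
              f"ratio ~{btc / ai:.0f}x in Bitcoin's favour")

Run it and the ratio in Bitcoin's favour balloons every year you rewind, which is exactly why "AI caused the 2017 spike" doesn't hold up.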
If you don't like their terminology, let's say "consumer AI" to sidestep the technical debate over what is or isn't a generative model.
But that's always how things work. Consumer purchases drive the advancement of new tech, and that new tech then gets applied more broadly. EVs didn't create the market for the li-ion batteries that powered them - consumers buying laptops and cell phones and toys with li-ion batteries created that market, and EVs took advantage of it (now they drive it, but that's not how it got established). You simply cannot divorce the usages; they're inextricably linked.
Nor can you simply write off consumer usages, because, as I noted, they're driving productivity. And productivity includes the development of software and systems that are more efficient than the ones they replace. Just the other day Slashdot was abuzz over an AI agent that caused a stir when it submitted a patch to matplotlib and then wrote a blog post after the patch was rejected because its author wasn't human. Do you remember what that patch did? It improved performance by ~30% by switching to contiguous memory ops. This is happening all over the economy (just usually with far less drama than that).
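For anyone who hasn't looked at that story: I'm not reproducing the actual matplotlib change here, but the general idea behind "switching to contiguous memory ops" is easy to demonstrate with a toy NumPy sketch (my own example, not the patch itself):

    import numpy as np
    import timeit

    a = np.random.rand(4000, 4000)                   # C-contiguous 2D array
    col_strided = a[:, 0]                            # column view: elements ~32 KB apart in memory
    col_contig = np.ascontiguousarray(col_strided)   # same values, packed back-to-back

    t_strided = timeit.timeit(col_strided.sum, number=10_000)
    t_contig = timeit.timeit(col_contig.sum, number=10_000)
    print(f"strided column sum: {t_strided:.3f}s, contiguous copy: {t_contig:.3f}s")

Same data, same arithmetic; the contiguous version is faster simply because it plays nicely with CPU caches. That's the kind of mundane, unglamorous efficiency work that adds up across the economy.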
Your response begins not by engaging the substance of the report, but by questioning whether I read it and by attacking the credibility of the authors.
Key words: "Began with". Yes, the authors' background absolutely does need to be mentioned; they're not even close to an unbiased source that should just be taken at face value. But my post only began with that; it then went into great detail about the content of their "report".