Comment Re:Kurzweils Singularity. (Score 5, Informative) 140

Life is WAY better after the industrial revolution than it was before it.

People have this fantasy image of what life used to be like, thinking of picturesque farms, craftsmen tinkering in workshops, clean air, etc. The Middle Ages were filthy: you worked backbreaking labour for long hours, commonly in highly risky environments; even the simplest necessities cost a large portion of your earnings; you lived in a hovel; and you died of preventable diseases at an average age of ~35 (a number admittedly dragged down by the fact that a quarter of children didn't survive their first year).

If it takes people of similar social status to you weeks of labour to produce the fibre for a set of clothes, spin it into yarn, dye it, weave it, and sew it, then guess what? It takes those same weeks of your labour, plus taxes and profit, to afford that set of clothes (and you'd better believe the upper classes were squeezing every ounce of profit they could out of the lower classes back then). Decreasing the amount of human labour needed to produce things is an immensely good thing. Furthermore, where did that freed-up labour go? Into science, into medicine, into the arts, and so on, further improving people's quality of life.

And if your response is "But greater production is more polluting!" - I'm sorry, do you have any understanding of how *miserably* polluted cities in the middle ages were? Where coal smoke poured out with no pollution controls, sewage ran straight into rivers that people collected water from and bathed in, where people extensively used things like arsenic and mercury and lead and asbestos, etc etc? The freed-up labour brought about by the industrial revolution allowed us to *learn* and to *fix problems*.

Comment Re:No it isn't (Score 2) 140

Which of those things hit 800 million users in 17 months?
Which of those things hit such high annual recurring revenue rates so fast?
Which of those saw the cost of using the tech decline by 99% over two years?

Heck, most of them aren't even new technologies, just new products, often just the latest version of older, already-commonly-used products.

And regarding that last one: it must be stressed that for the "cost of using the tech" per million tokens to decline 99% over two years, you're also talking about a similar order-of-magnitude reduction in power consumption per million tokens, since the two main costs are hardware and electricity.
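Back-of-envelope of that relationship, with made-up illustrative numbers (nothing here comes from the article; the electricity share in particular is an assumption):

# If serving cost is dominated by hardware + electricity, a 99% drop in cost
# per million tokens implies roughly a 99% drop in the hardware-hours and kWh
# burned per million tokens. Illustrative numbers only.
cost_then = 10.00                 # assumed $ per million tokens two years ago
cost_now = cost_then * 0.01       # 99% decline

electricity_share = 0.25          # assumed share of serving cost; varies by deployment
energy_cost_then = cost_then * electricity_share
energy_cost_now = cost_now * electricity_share

print(f"Energy cost per Mtok: ${energy_cost_then:.2f} -> ${energy_cost_now:.4f}")
print(f"Reduction factor: {energy_cost_then / energy_cost_now:.0f}x")
# If $/kWh is unchanged, kWh per million tokens falls by the same ~100x,
# i.e. the same order of magnitude as the cost decline.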

Comment Re:Dig Baby Dig (Score 4, Informative) 140

You're looking at three months of very noisy data and drawing some pretty dramatic conclusions from said minimal data.

Winter demand is heavily dependent on weather. You're mainly seeing the impacts of weather on demand.

https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2F2024%25E2%2580%259325_North_American_winter

"The 2024–25 North American winter was considerably colder then the previous winter season, and much more wintry across the North American continent, signified by several rounds of bitterly cold temperatures occurring."

Comment Re:Sshhhhhh. We're not supposed to say that. (Score 2) 95

Or, you can, you know, not fall for clickbait. This is one of those...

Ultimately, we found that the common understanding of AI’s energy consumption is full of holes.

"Everyone Else Is Wrong And I Am Right" articles, which starts out with....

The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

without bothering to mention that AI is only a small percentage of data centre power consumption (Bitcoin alone is an order of magnitude higher), and....

In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.

What a retcon. AI was *nothing* until the early 2020s. Yet datacentre power consumption did start skyrocketing in 2017 - having nothing whatsoever to do with AI. Bitcoin was the big driver.

At that point, AI alone could consume as much electricity annually as 22% of all US households.

Let's convert this from meaningless hype numbers to actual numbers. First off, notice the fast one they just pulled: comparing global AI usage to just the US, and just households. US households use about 1500 TWh of the world's ~24400 TWh/yr of electricity, or about 6%. 22% of 6% is ~1.3% of electricity (330 TWh/yr). Electricity is about 20% of global energy, so in this scenario AI would be about 0.3% of global energy.
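The same arithmetic spelled out, using the rough TWh figures above (a sketch, not precise statistics):

# Convert "22% of US households" into a share of global energy.
us_household_electricity_twh = 1500   # ~US residential electricity per year
global_electricity_twh = 24400        # ~world electricity generation per year
electricity_share_of_energy = 0.20    # electricity as a rough share of global energy

ai_twh = 0.22 * us_household_electricity_twh            # "22% of US households"
share_of_global_electricity = ai_twh / global_electricity_twh
share_of_global_energy = share_of_global_electricity * electricity_share_of_energy

print(f"AI consumption in this scenario: {ai_twh:.0f} TWh/yr")                 # ~330
print(f"Share of global electricity:     {share_of_global_electricity:.1%}")   # ~1.4%
print(f"Share of global energy:          {share_of_global_energy:.2%}")        # ~0.3%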

And that's taking their extreme numbers at face value for now (they predict an order of magnitude of growth from today's AI consumption), and ignoring that even a single AI application could, on its own, entirely offset the emissions of all AI combined.

Let's look at the premises behind their argument for this 0.3% of global energy usage (oh, I'm sorry, let's revert to the scary framing: "22% OF US HOUSEHOLDS!"):

* It's almost all inference, so that simplifies everything to usage growth

* But usage growth is offset by the fact that AI efficiency is simultaneously improving faster than Moore's Law on three separate axes that multiply with each other: hardware, inference, and models. You can now get what used to require the insanely expensive, server-and-power-hungry GPT-4 (1.5T parameters) from a model small enough to run on a cell phone, one that, run on efficient modern servers, finishes its output in a flash. So for consumption to rise as predicted, you have to assume not just one order of magnitude of inference growth (from more people using AI), but many orders of magnitude (see the rough sketch after this list).

  * You can try to Jevons at least part of that away by assuming that people will always want the latest, greatest, most powerful models for their tasks, rather than putting the efficiency gains toward lower costs. But will they? I mean, to some extent, sure. LRMs deal with a lot more tokens than non-LRMs, AI video is just starting to take off, etc. But at the same time, today's LRMs reason in token space, whereas future ones will probably work in latent space, which is vastly more efficient. To be clear, I'm sure Jevons will eat a lot of the gains - but all of them? I'm not so sure about that.

  * You need the hardware to actually consume this power. They're predicting that, three years from now, there will be an order of magnitude more AI hardware out there than all the AI servers deployed to this point combined. Is the production capacity for that huge an increase in AI silicon actually in the works? I don't see it.

  * You need someone to actually pay for it. In power alone, they're talking ~$33B per year, which reflects ~$100B/yr in total costs. This isn't funded. Buried in Trump's "$500B" Stargate announcement is that there's actually only funding for $100B... total... over half a decade. The sort of costs speculated about in this article require that paying demand materialize. If you believe these numbers, then you also have to believe that AI will be wildly successful, producing economic output worth at least hundreds of billions of dollars per year (a fraction of which would go to paying for the AI itself) - in other words, a second industrial revolution.
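A rough sketch of the two bits of arithmetic above - how the efficiency axes compound, and what the projected consumption would cost in electricity alone. The per-axis gains and the $0.10/kWh price are assumptions for illustration, not measured values:

# 1) Efficiency gains compound multiplicatively across independent axes.
#    Illustrative per-year improvement factors (assumed, not measured):
hardware_gain = 1.5     # perf/W of accelerators
inference_gain = 2.0    # serving-stack tricks: batching, quantization, caching
model_gain = 3.0        # smaller models matching older, larger ones

per_year = hardware_gain * inference_gain * model_gain
years = 3
total_gain = per_year ** years
print(f"Combined efficiency gain over {years} years: ~{total_gain:.0f}x")
# For total consumption to still grow 10x despite that, usage has to grow by
# roughly 10x times the efficiency gain:
print(f"Required inference growth for 10x consumption: ~{10 * total_gain:.0f}x")

# 2) What the projected consumption costs in electricity alone.
ai_twh = 330                      # the scenario's ~330 TWh/yr from above
price_per_kwh = 0.10              # assumed industrial electricity price, $/kWh
power_bill = ai_twh * 1e9 * price_per_kwh   # TWh -> kWh
print(f"Electricity bill: ~${power_bill / 1e9:.0f}B per year")   # ~$33B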

To recap: to buy these numbers, you have to believe that AI will be immensely successful, kickstart a second industrial revolution, and that Jevons' paradox will eat essentially all of the efficiency gains. You also have to entirely write off any efficiency gains elsewhere caused by AI programs that would offset AI's 0.3%-of-global-energy footprint. And you have to believe that production of AI servers - all of the chip fabs, etc. - will grow the total number of servers by an order of magnitude in three years.

Do you actually believe all of these premises? Then sure, buy into this article's numbers. One certainly can accept these premises. But do you? I always find it odd how many people think nobody wants AI and it's a massive bubble just waiting to collapse, yet simultaneously buy into the most extreme energy projections. These two positions are mutually exclusive. Pick one.

Submission + - Signal declares war on Microsoft Recall with screenshot blocking on Windows 11 (betanews.com)

BrianFagioli writes: Signal has officially had enough, folks. You see, the privacy-first messaging app is going on the offensive, declaring war on Microsoft’s invasive Recall feature by enabling a new “Screen security” setting by default on Windows 11. This move is designed to block Microsoft’s AI-powered screenshot tool from capturing your private chats.

If you aren’t aware, Recall was first unveiled a year ago as part of Microsoft’s Copilot+ PC push. The feature quietly took screenshots of everything happening on your computer, every few seconds, storing them in a searchable timeline. Microsoft claimed it would help users “remember” what they’ve done. Critics called it creepy. Security experts called it dangerous. The backlash was so fierce that Microsoft pulled the feature before launch.

But now, in a move nobody asked for, Recall is sadly back. And thankfully, Signal isn’t waiting around this time. The team has activated a Windows 11-specific DRM flag that completely blacks out Signal’s chat window when a screenshot is attempted. If you’ve ever tried to screen grab a streaming movie, you’ll know the result: nothing but black.
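For anyone wondering what such a "DRM flag" looks like at the OS level: Windows exposes a per-window display-affinity setting that screenshot and capture paths respect. The snippet below is a guess at the general mechanism, sketched in Python with ctypes; it is not Signal's actual code (Signal Desktop would presumably set the equivalent through its own windowing layer).

# Minimal Windows-only sketch: exclude a window from screen capture via the
# Win32 display-affinity API. Works only on windows owned by the calling process.
import ctypes

user32 = ctypes.windll.user32

WDA_NONE = 0x00000000                # normal capture behaviour
WDA_EXCLUDEFROMCAPTURE = 0x00000011  # window appears black in captures (Win10 2004+)

hwnd = user32.GetForegroundWindow()  # demo only: whatever window is focused right now

if user32.SetWindowDisplayAffinity(hwnd, WDA_EXCLUDEFROMCAPTURE):
    print("Window excluded from screenshots and screen recordings.")
else:
    print("SetWindowDisplayAffinity failed (window not owned by this process?)")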

Comment Re:multiple CS experts have told me (Score 1) 260

I could readily point you to quite a few biologists studying the brain who use LLMs extensively as comparison points in their research (and spiking ANNs are the gold standard model in the field, though they're not as computationally efficient), but thankfully we have TheMiddleRoad on Slashdot to tell us what's what.

And for the last time, either read the actual paper or shut up.

Comment Re:This is nonsensical. (Score 1) 178

Spinning power does that quite well.

Nuclear is not "spinning power": because its capital costs are so high and its operating costs so low, it runs at basically 100% all the time except when it's down for maintenance. It can't "spin up" because it's either already up, or down for maintenance. If you want it down most of the time instead, you have to multiply its already insanely high costs severalfold.
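A minimal sketch of that cost multiplication, with assumed round numbers rather than real plant figures; the point is just that cost per delivered MWh scales roughly inversely with capacity factor when fixed costs dominate:

# Rough levelized-cost-style arithmetic with assumed round numbers.
fixed_costs_per_mw_year = 600_000   # assumed $/MW-yr: annualized capital + staffing
variable_cost_per_mwh = 10          # assumed $/MWh: fuel and other marginal costs
hours_per_year = 8760

def cost_per_mwh(capacity_factor):
    mwh_delivered = capacity_factor * hours_per_year   # per MW of capacity
    return fixed_costs_per_mw_year / mwh_delivered + variable_cost_per_mwh

for cf in (0.90, 0.45, 0.20):
    print(f"capacity factor {cf:.0%}: ~${cost_per_mwh(cf):.0f}/MWh")
# ~$86/MWh at 90% becomes ~$162 at 45% and ~$352 at 20%:
# running a nuke as a part-time peaker multiplies its cost severalfold.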

As for dispatchable power for peaking, renewables are quite capable of doing that,

No. For god's fucking sake. You CANNOT DISPATCH WIND AND SOLAR. They're on when they're on, and off when they're off. When they're on, you never want to have them disconnected, because they're giving you free power. And you don't have shortages when they're on. You have shortages when they're off. And you cannot make them go on, because you can't make the sun start shining or the wind start blowing.

The shortages that require peaking are when the sun isn't shining and the wind isn't blowing. You cannot use wind and solar for that. For god's fucking sake.

Comment Re:This is nonsensical. (Score 1) 178

Peaking is, by definition, dispatchable power, which can be brought online precisely when you need it to fill in gaps, and shut off when you don't. The gaps under discussion are the ones created by the lack of available wind and solar power at those times.

Stop calling nondispatchable highly-variable power "peaking". It's just plain wrong, and it just makes you look profoundly ignorant on the topic you're insisting on discussing.

Comment Re:This is nonsensical. (Score 2) 178

However, you can run the nuke as base load for grid stability and peak with renewables.

Renewables (wind and solar) are literally the opposite of peaking. They are entirely nondispatchable.

And I'll repeat: renewables do not need base load. Renewables provide incredibly cheap, but unstable, power. They need peaking. Which is not nuclear.
