
Comment Re:Kurzweils Singularity. (Score 5, Informative) 157

Life is WAY better after the industrial revolution than it was before it.

People have this fantasy image of what life used to be like, thinking of picturesque farms, craftsmen tinkering in workshops, clean air, etc. The Middle Ages were filth: you did backbreaking labour for long hours of the day, often in highly dangerous conditions; even the simplest necessities cost a large portion of your income; you lived in a hovel; and you died of preventable diseases at an average age of ~35 (a number admittedly dragged down by the fact that a quarter of children didn't even survive their first year).

If it takes people of similar social status to you weeks of labour to produce the fibre for a set of clothes, spin it into yarn, dye it, weave it, and sew it, then guess what? It takes weeks of your labour, plus taxes and profit, to afford that set of clothes (and you'd better believe the upper classes were squeezing every ounce of profit from the lower class they could back then). Decreasing the amount of human labour needed to produce things is an immensely good thing. And where did that freed-up labour go? Into science, into medicine, into the arts, and so on, further improving people's quality of life.

And if your response is "But greater production is more polluting!" - I'm sorry, do you have any understanding of how *miserably* polluted cities in the Middle Ages were? Cities where coal smoke poured out with no pollution controls, where sewage ran straight into rivers that people drew water from and bathed in, where people extensively used arsenic, mercury, lead, asbestos, and so on? The freed-up labour brought about by the industrial revolution allowed us to *learn* and to *fix problems*.

Comment Re:No it isn't (Score 2) 157

Which of those things hit 800 million users in 17 months?
Which of those things hit such high annual recurring revenue rates so fast?
Which of those saw the cost of using the tech decline by 99% over two years?

Heck, most of them aren't even new technologies, just new products, often just the latest version of older, already-commonly-used products.

And regarding that last one: it must be stressed that for the "cost of using the tech" per million tokens to decline 99% over two years, you're also talking about a similar order of reduction in power consumption per million tokens, since the two main costs are hardware and electricity.
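
A toy example to make that reasoning concrete - the prices and the 30% electricity share below are my own illustrative assumptions, not figures from any source:

    # Illustrative only: the prices and the 30% electricity share are assumptions.
    old_cost_per_mtok = 10.00   # $ per million tokens two years ago (assumed)
    new_cost_per_mtok = 0.10    # $ per million tokens today, i.e. a 99% decline
    electricity_share = 0.30    # assumed fraction of the cost that is electricity

    old_power_cost = old_cost_per_mtok * electricity_share  # $3.00 of power per Mtok
    new_power_cost = new_cost_per_mtok * electricity_share  # $0.03 of power per Mtok

    # At roughly constant electricity prices, energy per million tokens must have
    # fallen by about the same factor as its cost.
    print(new_power_cost / old_power_cost)  # ~0.01, i.e. ~99% less energy per Mtok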

Comment Re:Dig Baby Dig (Score 4, Informative) 157

You're looking at three months of very noisy data and drawing some pretty dramatic conclusions from said minimal data.

Winter demand is heavily dependent on weather. You're mainly seeing the impacts of weather on demand.

https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2F2024%25E2%2580%259325_North_American_winter

"The 2024–25 North American winter was considerably colder then the previous winter season, and much more wintry across the North American continent, signified by several rounds of bitterly cold temperatures occurring."

Comment Re:Absolutely (Score 1) 46

Seen Youtube lately? I just watched a video on how to make nitroglycerin. Stuff like this has been available for over a decade.

Back in the days when home solar systems still mostly used lead-acid batteries - which in some cases of degradation could be repaired, at least partially, if you had some good, strong, reasonably pure sulfuric acid - I watched a YouTube video on how to make it (from Epsom salts by electrolysis, using a flowerpot and some carbon rods from old large dry cells).

For months afterward YouTube "suggested" I'd be interested in videos from a bunch of Islamic religious leaders. (This while people were wondering how Islamic terrorists were using the Internet to recruit among high-school out-group nerds.)

Software - AI and otherwise - often creates unintended consequences. B-)

Comment Re:Sshhhhhh. We're not supposed to say that. (Score 2) 95

Or, you can, you know, not fall for clickbait. This is one of those...

Ultimately, we found that the common understanding of AI’s energy consumption is full of holes.

"Everyone Else Is Wrong And I Am Right" articles, which starts out with....

The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

without bothering to mention that AI is only a small percentage of data centre power consumption (Bitcoin alone is an order of magnitude higher), and....

In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.

What a retcon. AI was *nothing* until the early 2020s. Yes, datacentre power consumption did start skyrocketing in 2017 - but that had nothing whatsoever to do with AI. Bitcoin was the big driver.

At that point, AI alone could consume as much electricity annually as 22% of all US households.

Let's convert this from meaningless hype numbers into actual numbers. First off, notice the fast one they just pulled - sliding from global AI usage to just the US, and just households. US households use about 1500 TWh of the world's ~24400 TWh/yr of electricity, or about 6%. 22% of 6% is ~1.3% of electricity (330 TWh/yr). Electricity is about 20% of global energy, so in this scenario AI would be ~0.3% of global energy.
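
Here's that back-of-the-envelope arithmetic spelled out, using the same rough figures quoted above (1500 TWh, 24400 TWh, 20%):

    # Back-of-the-envelope check, using the rough figures cited above.
    us_household_twh = 1500            # approx. US residential electricity use, TWh/yr
    world_electricity_twh = 24400      # approx. world electricity generation, TWh/yr
    electricity_share_of_energy = 0.20 # electricity as a fraction of global energy use

    ai_twh = 0.22 * us_household_twh   # the "22% of all US households" claim
    share_of_electricity = ai_twh / world_electricity_twh
    share_of_energy = share_of_electricity * electricity_share_of_energy

    print(f"{ai_twh:.0f} TWh/yr")        # ~330 TWh/yr
    print(f"{share_of_electricity:.2%}") # ~1.35% of world electricity
    print(f"{share_of_energy:.2%}")      # ~0.27% of global energy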

We're taking their extreme numbers at face value for now (they predict an order of magnitude of growth from today's AI consumption), and ignoring that even a single AI application could entirely offset the emissions of all AI combined.

Let's look first at the premises behind this 0.3%-of-global-energy scenario they're arguing for (oh, I'm sorry, let's revert to scary numbers: "22% OF US HOUSEHOLDS!"):

* It's almost all inference, so that simplifies everything to usage growth

* But usage growth is offset by the fact that AI efficiency is simultaneously improving faster than Moore's Law on three separate axes (hardware, inference, and models), which are multiplicative with each other. You can now get what used to require the insanely expensive, server-and-power-hungry GPT-4 (1.5T parameters) from a model small enough to run on a cell phone, which, run on efficient modern servers, finishes its output in a flash. So to hit their numbers you have to assume not just one order of magnitude of inference growth (from more people using AI), but many orders of magnitude of inference growth.

  * You can try to Jevons at least part of that away by assuming that people will always want the latest, greatest, most powerful models for their tasks rather than putting the efficiency gains toward lower costs. But will they? I mean, to some extent, sure. LRMs deal with a lot more tokens than non-LRMs, AI video is just starting to take off, etc. But at the same time, today's LRMs work in token space, while in the future they'll probably work in latent space, which is vastly more efficient. To be clear, I'm sure Jevons will eat a lot of the gains - but all of them? I'm not so sure about that.

  * You need the hardware to actually consume this power. They're predicting that, three years from now, there will be an order of magnitude more hardware out there than all the AI servers deployed to this point combined. Is the production capacity for that huge an increase in AI silicon actually in the works? I don't see it.

  * You need someone to actually pay for it. In power alone, they're talking ~$33B per year, which reflects something like ~$100B/yr in total costs. This isn't funded. Buried in Trump's "$500B" Stargate announcement is that there's actually only funding for $100B... total... over half a decade. The sort of costs speculated about in this article require that paying demand materialize. If you believe these numbers, then you also have to believe that AI will be wildly successful, producing economic output worth at least hundreds of billions of dollars per year (a fraction of which would go to paying for the AI itself), aka a second industrial revolution.
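
As a rough sanity check on where those dollar figures plausibly come from (the $0.10/kWh electricity price and the one-third power share below are my own assumptions for illustration, not numbers from the article):

    # Rough sanity check; the electricity price and power share are assumed.
    ai_twh = 330                  # the ~330 TWh/yr scenario from above
    price_per_kwh = 0.10          # assumed average electricity price, $/kWh
    power_cost = ai_twh * 1e9 * price_per_kwh   # TWh -> kWh -> dollars

    power_share_of_total = 0.33   # assumed share of power in total operating cost
    total_cost = power_cost / power_share_of_total

    print(f"power: ~${power_cost / 1e9:.0f}B/yr")  # ~$33B/yr
    print(f"total: ~${total_cost / 1e9:.0f}B/yr")  # ~$100B/yr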

To recap: to buy these numbers, you have to believe that AI will be immensely successful and kickstart a second industrial revolution, and that Jevons' paradox will eat essentially all of the efficiency gains. You also have to entirely write off any efficiency gains caused by AI programs that would offset its 0.3%-of-global-energy footprint. And you have to believe that the production rate for AI servers - all of the chip fabs, etc. - will grow the total number of servers by an order of magnitude in three years.

Do you actually believe all of these premises? Then sure, buy into this article's numbers. One certainly can accept these premises - but do you? I always find it odd how many people think nobody wants AI and that it's a massive bubble waiting to collapse, yet simultaneously buy into the most extreme energy projections. These two positions are mutually exclusive. Pick one.

Comment Re:multiple CS experts have told me (Score 1) 261

I could readily point you to quite a few biologists studying the brain who use LLMs extensively as comparison points in their research (and spiking ANNs are the gold standard model in the field, though they're not as computationally efficient), but thankfully we have TheMiddleRoad on Slashdot to tell us what's what.

And for the last time, either read the actual paper or shut up.

Comment Re:This is nonsensical. (Score 1) 178

Spinning power does that quite well.

Nuclear is not "spinning power", because it runs at basically 100% all the time except when down for maintenance: its capital costs are very high while its operating costs are low. It can't "spin up", because it's either already up or down for maintenance. If you want to have it down most of the time, you have to multiply its already insanely high costs severalfold.

As for dispatchable power for peaking, renewables are quite capable of doing that,

No. For god's fucking sake. You CANNOT DISPATCH WIND AND SOLAR. They're on when they're on, and off when they're off. When they're on, you never want to have them disconnected, because they're giving you free power. And you don't have shortages when they're on. You have shortages when they're off. And you cannot make them go on, because you can't make the sun start shining or the wind start blowing.

The shortages that require peaking are when the sun isn't shining and the wind isn't blowing. You cannot use wind and solar for that. For god's fucking sake.

Comment Re: no (Score 1) 74

Yeah ok.

I do not know what world you live in, but I have never seen a Linux desktop at work in my 30 years in the workforce. I have seen some iPads coming in for things like warehouse work.

MDM like Intune or Jamf is great for locking things down and rolling out apps on devices like tablets and even Windows desktops.

Until Excel, QuickBooks, AutoCAD, and every other piece of business software in existence gets ported, Linux is not an option.

Comment Re:They did WSL totally backward. (Score 2) 74

It's crazy that people still think Windows is like DOS-based Windows ME/98 and that things have not progressed in a quarter century.

If Windows is so bad and insecure, then why does corporate America use and trust it to secure their data and run their apps?

Linux is not an option for 97% of people as their first OS. I used to use Linux 25 years ago. Today I want to get work done, run games, and have something just work. No Nvidia Wayland issues. Hardware-accelerated smooth scrolling and anti-aliased fonts. Chrome goes blip blip blip on Linux when I scroll up and down. Multi-monitor support is even worse. Don't get me started on the insecurity and horrors of Xorg.

Before I get accused of being an MS fanboy and modded -1 to infinity, I want to say I chose this username back in 2000, when I was young and an MS hater like the rest of you. I grew up.

I hate all operating systems now, including Windows, BTW ... but for different reasons :-D ... since I am older and middle-aged now.

Linux is great and useful for dev and cloud stuff. Windows is great for multi-monitor setups and boring Win32 business apps. Android/iOS are for content, with smooth scrolling, fluid animations, and decent fonts, like we're actually past 2007. I do not want Linux as a host OS or a desktop, or to spend every weekend troubleshooting my own system trying to get a Proton port of a Steam game working.

WSL is amazing and gets the job done. Without it I would have no tools at work. We must use Windows on our desktops.

Comment Re:This is nonsensical. (Score 1) 178

Peaking is, by definition, dispatchable power, which can be produced precisely when you need it to fill in gaps and shut off when you don't. The gaps under discussion here are created by the lack of available wind and solar power at those times.

Stop calling nondispatchable highly-variable power "peaking". It's just plain wrong, and it just makes you look profoundly ignorant on the topic you're insisting on discussing.
