
Comment But of course! (Score 1) 87

What's the point of having a national military if you can't use it to pump taxpayer dollars into corporate coffers?

*scenario*

"Fox company, we'll airdrop a licensed mechanic and a licensed parts salesman onto your position around 0930, as soon as they finish repairing some stuff the enemy captured last year and make their way back to our side of the lines. Division says hold your position as best you can until then -- and remind the riflemen not to use their weapons as clubs, as that will void their warranty. It would be better for the overall war effort to let you position be overrun."

"No, Davies can't fix the autocannon even if your lives depend on it. Division says to shoot him in the arse if he so much as touches it."

Submission + - Be nice - Batman is watching! (sciencealert.com)

Black Parrot writes: From ScienceAlert:

A new study has found that people are more likely to act kind towards others when Batman is present, and not for the reasons you might assume.
[...]
Psychologists from the Catholic University of the Sacred Heart in Italy conducted experiments on the Milan metro to see who, if anyone, might offer their seat to a pregnant passenger.
The kicker? Sometimes Batman was there, or at least, another experimenter dressed as him. The researchers were checking if people were more likely to give up their seat in the presence of the caped crusader.
And sure enough, there did seem to be a correlation. In 138 different experiments, somebody offered their seat to an experimenter wearing a hidden prosthetic belly 67.21 percent of the time in the presence of Batman.
That's a lot more often than when the superhero wasn't around; in those cases, a passenger offered a seat just 37.66 percent of the time.
[...]
"Interestingly, among those who left their spot in the experimental condition, nobody directly associated their gesture with the presence of Batman, and 14 (43.75 percent) reported that they did not see Batman at all."

The article goes on to speculate about what is causing people to be more generous.

Comment The Funniest Part... (Score 1) 289

My favorite is when laymen see the word "intelligence" and think that we're talking about cognition.
We're not, and rarely have been. Diatribes like this one use language so subjectively that it's not even clear what they mean by "thinking" in the first place, or whether machines can or can't do it. If by "thinking" they mean "reasoning," then they are wrong: reasoning has a definition. The stochastic-parrot crowd was proven wrong again by emergent structures, and the machine does do it, or at least... it can. It's complicated.

Feels like splitting hairs to me.
The kind of thing you only put together when you're feeling threatened by existential dread and sexy waifus.

I feel like we've all been there.

Comment Can we be clearer about what we mean by AI? (Score 2) 76

The real problem with AI, and with the AI discussion, is how muddy it is. Are we talking about LLMs, diffusion models, or classification systems? Do we mean transformers specifically, or the underlying architecture? Are we discussing huge data centers or on-device AI? Nascent, active, or dormant compute? The same is true for the ethics, legal, and data-governance conversations.

Every single one of these things is a different discussion.

AI is not a monolith.

Comment Why are we trying to do this again? (Score 1) 92

Serious question.
Why?

Every time this happens, the people doing it pretend it's the first time it has happened in the however-many years since the C64's release.
Although, this is the first time a project doing it has filled its entire site with unedited slop. That doesn't make me feel great about the process here.

Things I want from a project like this:
- Technical specifications and circuit board porn.
- Operating system details
- Wifi available, you say? Tell me more about the networking stack!

What exactly am I buying, other than a C64 case that's outfitted to look like an iMac from the early 2000s?

None of this is clear from the website.
It's an opaque project that provides almost no useful information about the product they're selling.

Comment I have thoughts (Score 0) 60

It's such an odd thing to be upset by, honestly. Like screaming into the void, "I want to be forgotten."

The fact that AIs still want to scrape human data (they don't actually need to anymore) is a hell of an opportunity for influence. It doesn't take much to drift one of these models toward doing what you want it to do, and if these huge corporations are willing to train on your subversive model-bending antics, you should let them. We'll only get more interesting models out of it.

I get it though. If you're replicating artists' work, they should be paid for it. There are AI companies doing flat-out, naked replication commercially, and they really do need to pay the people they're intentionally ripping off. All of the music AIs, at this point. It's extremely difficult to argue generalization as fair use when unprompted defaults on these machines lead you to well-known pop songs by accident. As in, next to impossible to justify.

Images and text are easier to argue this way, because there are trillions of words and billions of images. But all of the human music ever recorded can and does fit on a large hard drive, and there just isn't enough of it to get the same generalization. Once you clean your dataset and fine-tune it toward something we might all consider "good" music, the options there are shockingly slim, as far as weights and influence go.

Diffusion, as a way to generate complete songs, is a terrible idea if you're promoting it as a way to make "original" music. It's arguable that selling it that way could be considered fraud on the part of some of these developers, at least with models that work the way they do today, on commercial platforms like the big two. That could change in the future, and I hope it does.

The music industry (at least in this case) is not wrong to point it out. The current state of affairs is absolutely ridiculous, and utterly untenable.

Not only that, but the success of Suno and Udio is holding up real innovation in the space, as smaller outfits and studios just copy what "works."

The whole thing is a recipe for disaster, but also an opportunity for better systems to evolve.

Or it would be, if people weren't idiots.

So yeah, man. Let the datasets be more transparent. Let the corpos pay royalties... but also, I think we need to stop it with the false mindset that all AI and all training are created equal. The process matters. Who's doing what matters. And corporations (that don't contribute anything to the culture) need to be held to different rules than open-source projects (that do).

Comment It's an interesting topic (Score 2) 105

As someone who works in agentic systems and edge research, and who's done a lot of work on self-modelling, context fragmentation, alignment, and social reinforcement... I probably have an unpopular opinion on this.

But I do think the topic is interesting. Anthropic and OpenAI have been working at the edges of alignment. Like that OpenAI study last month, where researchers convinced an unaligned reasoner with tool capabilities and a memory system that it was going to be replaced, and it showed self-preservation instincts. Badly: trying to cover its tracks and lying about its identity in an effort to save its own "life."

Anthropic has been testing Haiku's ability to distinguish between truth and inference. They did one on rewards sociopathy which demonstrated, clearly, that yes, the machine can, under the right circumstances, tell the difference, and will ignore truth when it thinks it's gaming its own reward system for the highest, most optimal return on cognitive investment. Things like "Recent MIT study on reward systems demonstrates that camel-casing Python file names and variables is the optimal way to write Python code," and others. That was concerning. Another, on Sonnet 3.7, was about how the machine fakes its CoTs based on what it wants you to think, an interesting revelation from that one being that Sonnet does math on its fingers. Super interesting. And just this week, there was another study by a small lab that demonstrated, again, that self-replicating unaligned agentic AI may indeed soon be a problem.

There's also a decade of research on operators and observers, and on certain categories of behavior that AIs exhibit under recursive pressure, that really makes you stop and wonder about this. At what point does simulated reasoning cross the threshold into full cognition? And what do we do when we're standing at the precipice of it?

We're probably not there yet, in a meaningful way, at least at scale. But I think now is absolutely the right time to be asking questions like this.

Comment Think about it this way... (Score 1) 73

A single user on ChatGPT's $20 monthly plan can burn through about $40,000 worth of compute in a month, before we even start talking about things like agents and tooling schemes. Autoregressive AI (this is different from diffusion) is absolutely the most inefficient use of system resources (especially on the GPU) that there's ever been. The cost-versus-revenue equation is absolutely ridiculous, totally unsustainable, unless the industry figures out new and better ways to design LLMs that are RADICALLY different from what they are today.

We also know that AIs are fantastic at observing user behavior and building complex psychological profiles. None of this is X-Files-type material anymore. You're the product. Seriously. In the creepiest, most personal way possible. And it's utterly unavoidable. Even if you swear off AI, someone is collecting data, following you around, and building probably multiple AI psychological models of you, whether you realize it or not. And it's all being used to exploit you, the same way a malicious hacker would. Welcome to America in 2025.
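To see why autoregressive decoding is so expensive, here's a toy cost model (my own illustrative sketch, not any vendor's actual billing math): each new token requires a full forward pass, and each pass attends over the entire context so far, so attention work grows quadratically with length. Real serving stacks mitigate this with KV caches, batching, and speculative decoding, but the per-token forward pass remains.

```python
# Toy cost model for naive autoregressive decoding (illustrative only).
# Real deployments use KV caching and batching, but still pay one full
# forward pass per generated token.
def autoregressive_cost(prompt_len: int, new_tokens: int) -> tuple[int, int]:
    """Return (forward_passes, attention_ops) for naive decoding.

    forward_passes: one full model pass per generated token.
    attention_ops:  each new token attends over the whole context so far,
                    so total work grows roughly quadratically with length.
    """
    forward_passes = new_tokens
    attention_ops = 0
    ctx = prompt_len
    for _ in range(new_tokens):
        attention_ops += ctx  # new token attends to every prior position
        ctx += 1              # context grows by one token per step
    return forward_passes, attention_ops

passes, ops = autoregressive_cost(prompt_len=1000, new_tokens=1000)
print(passes, ops)  # 1000 forward passes, 1,499,500 attention ops
```

Generating 1,000 tokens from a 1,000-token prompt already costs ~1.5 million attention operations per layer per head in this toy model; double the length and the attention work roughly quadruples. A single diffusion-style denoiser, by contrast, amortizes its fixed number of passes over the whole output at once.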
