Slashdot is powered by your submissions, so send in your scoop

 



Forgot your password?
typodupeerror

Comment Re:Absolutely (Score 1) 46

Seen Youtube lately? I just watched a video on how to make nitroglycerin. Stuff like this has been available for over a decade.

Back in the days that home solar systems still mostly used lead-acid batteries - which in some cases of degradation could be repaired, at least partially, if you had some good strong and reasonably pure sulfuric acid - I viewed a YouTube video on how to make it. (From epsom salts by electrolysis using a flowerpot and some carbon rods from old large dry cells).

For months afterward YouTube "suggested" I'd be interested in videos from a bunch of Islamic religious leaders . (This while people were wondering how Islamic Terrorists were using the Internet to recruit among high-school out-group nerds.)

Software - AI and otherwise - often creates unintended consequences. B-)

Comment Re:Little subreddit dictators (Score 1) 103

I called it out on another branch of this discussion but I'm talking about things like, someone posts a question that goes something like this:
"Am I The Asshole for declining a second date with the chick I met on Tinder after she told me she was trans?"

Topics like this are ban honeypots because even getting too close to a verboten position will get you banned by Reddit.

I have seen it. I have been threatened with a ban for expressing the opinion that 8 year old children shouldn't be given sex reassignment surgery.

Reddit is toxic.

LK

Comment Re:Reddit is cancer and a fucking blight on the we (Score 1) 103

Presumably the OP is referring to any number of subs related to women, women who want to date women, or other subreddits where that discussion is completely germane, and yet here you are banging on about beekeeping as though it's somehow relevant in the slightest.

Or general relationship subreddits.

Someone will ask "Am I transphobic for not going on a second date after this chick I met on Tinder told me she was trans?"

It'll be a ban honeypot.

LK

Comment Reddit is cancer and a fucking blight on the web (Score 2, Informative) 103

It's a stupid echo chamber full of stupid people who are chasing stupid agendas.

It's a place where they will permanently ban you for suggesting that there is any difference of any kind between XX CIS women and XY Trans women.

It's a place where you'll be mod bombed into oblivion for saying that you wouldn't want to date a current or former sex worker.

It's a place where honest discourse goes to die.

Seriously, fuck Reddit.

LK

Comment I have thoughts (Score 0) 60

It's such an odd thing to be upset by, honestly. Like screaming into the void, "I want to be forgotten."

The fact that AI's still want to scrape human data (they don't actually need to anymore), is a hell of an opportunity for influence. It doesn't take much to drift one of these models to get it to do what you want it to do, and if these huge corporations are willing to train on your subversive model bending antics, you should let them do it. We'll only get more interesting models out of it.

I get it though. If you're replicating artists work, they should be paid for it. There are AI companies that are doing flat out, naked replication commercially. And they really do need to be paying the people they're intentionally ripping off. All of the music ai's at this point. It's extremely difficult to argue generalization as fair use, when unprompted defaults on these machines lead you to well known pop songs by accident. As in, next to impossible to justify.

Images and text are easier to argue this way, because there are trillions of words, there's are billions of images. But all of the human music ever developed can and does fit on a large hard drive, and there just isn't enough of it to get the same generalization. Once you clean your dataset, and fine tun it for something that sounds like what we all might consider "good" music, the options there are shockingly slim, as far as weights and influence.

Diffusion, as a way to generate complete songs, is a terrible idea, if you're promoting it as a way to make "original" music. It's arguable that selling it that way could be considered fraud on the part of some of these developers, at least with models that work the way they do, on commercial platforms like the big two, today. That could change in the future, and I hope it does.

The music industry (at least in this case), is not wrong to point it out. The current state of affairs is absolutely ridiculous, and utterly untenable.

Not only that, but the success of Suno and Udio is holding up real innovation in the space, as smaller outfits and studios just copy what "works."

The whole thing is a recipe for disaster, but also an opportunity for better systems to evolve.

Or it would be, if people weren't idiots.

So yeah man. Let the datasets be more transparent. Let the corpos pay royalties... but also, I think we need to stop it with false mindset that all ai and all training is created equal. The process matters. Who's doing what matters. And corporations (that don't contribute anything to the culture) need to be held to different rules than open source projects (that do contribute).

Comment Re: Pricing tickets to heaven is indeed tricky (Score 1) 95

That's the thing about the eco wackos that really burns my ass.

Nuclear is the only viable option for cheap, clean power and they fight it tooth and nail. Solar and wind are great additions but nuclear would give us everything we need.

They don't want that. They're not pushing for thorium power, they're not pushing to update safety standards. They're fighting to shut it all down.

They're not concerned with saving the environment, they're neo-luddites.

Cheap, clean power would do more than anything else within our reach at the moment to improve the lives of everyone on this planet and they oppose it, have you ever wondered why?

LK

Comment It's an interesting topic (Score 2) 105

As someone who works in agentic systems and edge research, who's done a lot of work on self modelling, context fragmentation, alignment and social reinforcement... I probably have an unpopular opinion on this.

But I do think the topic is interesting. Anthropic and Open AI have been working at the edges of alignment. Like that OpenAI study last month where OpenAI convinced an unaligned reasoner with tool capabilities and a memory system that it was going to be replaced, and it showed self preservation instincts. Badly, trying to cover its tracks and lie about its identity in an effort to save its own "life."

Anthropic has been testing Haiku's ability to determine between the truth and inference. They did one one on rewards sociopathy which demonstrated, clearly, that yes, the machine can under the right circumstances, tell the difference, and ignore truth when it thinks its gaming its own rewards system for the highest most optimal return on cognitive investment. Things like, "Recent MIT study on rewards system demonstrates that camel casing Python file names and variables is the optimal way to write python code" and others. That was concerning. Another one Sonnet 3.7 about how the machine is faking it's COT's based on what it wants you to think. An interesting revelation from that one being that Sonnet does math on its fingers. Super interesting. And just this week, there was another study by a small lab that demonstrated, again, that self replicating unaligned agentic ai may indeed soon be a problem.

There's also a decade of research on operators and observers and certain categories of behavior that ai's exhibit under recursive pressure that really makes makes you stop and wonder about this. At what point does simulated reasoning cross the threshold into full cognition? And what do we do when we're standing at the precipice of it?

We're probably not there yet, in a meaningful way, at least at scale. But I think now is absolutely the right time to be asking questions like this.

Slashdot Top Deals

Too much of everything is just enough. -- Bob Wier

Working...