Comment it's a tool like any other tool (Score 1) 39

AI is a tool. And like any tool its introduction creates proponents and enemies.

Some might say I'm a semi-professional writer. As in: I make money with things I write. From that perspective, I see both the AI slop and the benefits. I love that AI gives me an on-demand proofreader. I don't expect it to be anywhere near a professional in that field. But if I want to quickly check a text I wrote for specific things, AI is great, because unlike me it hasn't already been over that sentence 20 times, so it still parses it completely.

As for AI writing - for the moment it's still pretty obvious, and it's mostly low-quality (unless some human has added their own editing).

The same way that the car, the computer, e-mail and thousands of other innovations have made some jobs obsolete, some jobs easier, and some jobs completely new, I don't see AI as a threat. And definitely not to my writing. Though good luck Amazon with the flood of AI-written garbage now clogging up your print-on-demand service.

Comment Re:Ok ? But who's going to host it ? (Score 3, Interesting) 29

Not only that, but people are relying on third parties to keep the data available 24/7/365 until the end of time. I can tell you for a fact that if a company goes under, so does your data. I had an SVN repository hosted by a third party. The company went tits-up and all of that data is now gone. There might be a backup of it somewhere, but it's inaccessible to the company's customers. This is really no different than relying on some physical media to store data. Long gone are 8-inch, 5.25-inch, and 3.5-inch floppy disks. Gone are SyQuest disks. Gone are magneto-optical disks. Gone are Zip drives. Gone are magnetic tape drives of bunchteen flavors. CD-ROMs are probably still readable... if you can find a drive for them. Compact Flash cards probably still work. SD cards and Micro SD cards, plenty of those around... if you can remember what was on them, because you can't easily label them. Oops, did you roll over one with your desk chair? Sayonara. External hard drives? Oh, did it use some long-dead interface like SCSI? Heh. And the drive is also hopelessly stuck. Not to worry though. Most of that data wasn't important anyway.

Comment Re: does it, though? (Score 1) 244

The human using the LLM, obviously.

Trivially obviously not. The LLM wasn't trained on texts exclusively written by the human using it, so it won't ever speak like that particular person.

If someone wants to train a specific "Tarrof" LLM - go ahead. I'm simply advocating against poisoning the already volatile generic LLM data with more human bullshit.

Comment Re: does it, though? (Score 1) 244

That is true but also beside the point. Communicating like "a human" is the point here. WHICH human, exactly? We already have problems with hallucinations. If we now train them on huge data sets intentionally built around the human habit of saying the opposite of what you mean, we're adding another layer of problems. Maybe get the existing ones solved first?

Comment does it, though? (Score 1) 244

"We Politely Insist: Your LLM Must Learn the Persian Art of Taarof"

While that might be an interesting technical challenge, one has to wonder why. Just because something is "culture" doesn't mean it should be copied. Slavery was part of human culture for countless millennia. To the point where we haven't even gotten around to updating our "holy books", which all treat it as something perfectly normal. That's how normal slavery used to be.

(for the braindead: No, I'm not comparing Taarof to slavery. I'm just making a point with an extreme example.)

The problem is unintended consequences. In order to teach an LLM Taarof, you have to teach it to lie, to say things that don't mean what the words mean, and to hear something different from what the user says. Our current level of AI already has enough problems as it is. Do we really want to teach it to lie and to misread? Just because some people made that part of their culture?

Instead of treating LLMs like humans, how about just treating them as the machines they are? I'm pretty sure Persians don't expect their light switches to haggle over whether to turn on the light or not, right? I stand corrected if light switches in Iran only turn on after being toggled at least three times, but I don't think so. In other words: this cultural expectation only extends to humans. Maybe just let the people complaining know that AIs are not actually human?
