Comment Re:People hate GenAI because it *doesn't* suck... (Score 0) 58
These sorts of theatrics exactly.
...not because it does.
It's all about gatekeeping the skill, time, and budget floor and propping up the wall between "producers" and "consumers". I worked hard to get where I am, therefore it shouldn't be made easier. I worked hard to make $20/hour, therefore we shouldn't raise the minimum wage. Etc.
The reason there aren't legions and legions of programmers protesting AI on Twitter is that programmers are accustomed to change and we've learned to embrace it. Yet every time a technology changes how art is made (cameras, digital painting, 3D rendering, even pre-made pigments), a group of artists flips their shit and says the new technology is going to kill creativity and ruin art as we know it. Then, fifty years later, all of the things those people insisted were "not art" are in museums and art history books.
I hope a company in China gets ahold of the database and trains a good music generator AI on it and releases it for free.
Smart devices have been spying on you for years now. Don't buy a fridge that spies on you.
...who for some reason deserve to be protected from automation more than fucking literally everybody else, apparently.
AI doesn't "siphon up and launder" code any more than your brain does. Both *can* memorize (which is why sometimes comedians can accidentally steal a joke), but both are also learning from patterns.
AI isn't going to make people stop contributing to open source.
a.) There's little interest in interrogating the downsides of generative AI, such as the environmental impact, the data theft impact, and the treatment and exploitation of data workers.
That's all the press ever fucking talks about, to the point where you've got people who use the cloud for everything bitching about AI like the rest of their cloud use isn't impacting the environment. Also, analyzing data isn't theft.
b.) There's little interest in considering the extent to which, by incorporating generative AI into our teaching, we end up supporting a handful of companies that are burning billions in a vain attempt to each achieve performance that is a scintilla better than everyone else's.
People need to learn about and use open source AI. There are plenty of very good options.
c.) There's little interest in thinking about what's going to happen when the LLM companies decide that they have plateaued, that there's no more money to burn/spend, and a bunch of them fold—but we've perturbed education to such an extent that our students can no longer function without their AI helpers.
Oh, and if all those companies crap out, open source AI is still going to exist. Those models won't magically vanish either.
Did you just pick a random person to say that to?
...probably aren't going to do their research, and will be willing to buy a shittier version for a higher price.
This camera is a fashion accessory for shallow people.
It's a good way of laying off the people who are good enough at what they do that they can find other jobs.
Found the office building real estate investor. Or the sociopath from upper management.
There have always been some critics like that, yes, but it's a lot more universal now. It wasn't nearly this bad in the 90s and 00s.
In the last fifteen years, critics have leaned more and more heavily into telling people what they ought to like, as opposed to how likely they are to like something.
Movies like the Mario movie that are enjoyable and escapist tend to get panned by the critics. Conversely, a movie like The Last Jedi, which turned a formerly enjoyable, escapist series on its head (the 8th part of a 9-part series isn't the time to do that), has a critic rating of 95% versus an audience rating in the 40s, because the critics straight up don't care whether the movie actually makes sense in the context of the previous 7 entries (8 counting Rogue One).
Critics were, as far as I can remember, generally against the release of the Snyder cut of Justice League, and also against the rework of Sonic's appearance in the Sonic movies (now widely considered to be a very wise decision by the studio).
It used to be that critics would catch the occasional good, intelligent movie that parts of the audience didn't really get, but recently what they like doesn't seem to have any correlation with what general audiences will like *or* whether a movie is intelligent (The Last Jedi was quite stupid but critics glazed it anyway).
So yeah, it's not so much that Rotten Tomatoes has brought in too many random critics, it's that today's literary "elite" enjoy fart-sniffing more than they enjoy actual entertainment.
In this kind of situation, it's smart to disallow its use until an evidence-based decision can be made about whether it actually works and whether it performs at the level of human therapists. Any AI used for this purpose should have to go through an approval process, because not all AIs are created equal.
From my own experience:
> Was it mostly boilerplate?
A good bit of it was, but speeding that up without having to dig through a bunch of templates saves a ton of time.
> How much of the generated code was correct?
A lot of it, but I generally only use it to complete a few lines of code at once, and do the high level thinking on my own. If you use AI that way, it's a great productivity tool.
> How much of the Copilot code makes it into production?
A lot of it.
I've got a bad feeling about this.