Probably didn't help that in the craziness of post-pandemic recovery a fair number of software developers managed to swing 100% raises when the flood of covid recovery cash went into play...
It takes many days of coding till 4AM to become a great developer.
No, you absolutely don't need to break your lifestyle in unhealthy ways to become a great developer. Yes, you need a bit more grounded experience than what universities offer, but that doesn't mean absurd lifestyle choices.
I remember back in the day "real developers" bragging about how much Jolt Cola they needed, as if drinking that much Jolt was proof they must be good... I had hoped we had gotten over that mindset...
I think a part of it is that many of those programming hires never really made any sense in the first place.
A great chunk of the programming hires were made with no good idea of how to actually utilize those people. It was more performative for the sake of investors and clients than a good use of people's time. So you end up with horrible bureaucracies of developers that in aggregate never got anywhere real, but to some extent that's OK, because they didn't have any real idea in the first place and they can still brag about their headcount to the people who will give them money.
Now the performative thing to do is say AI over and over again. Bad news for the humans, but probably neutral for many of these companies that weren't really doing that much useful stuff with the people anyway.
To the extent things have substantially improved, it's been about convenient access to the models and better prompt stuffing to give them a better shot. The models themselves are pretty underwhelming, particularly given how much has been put into trying to make them better.
The net result is that it takes me less time to get LLM-generated content because I don't have to manually feed it as much context anymore, but the suggestions are still pretty garbage once I get there.
LLMs are good at generating a few lines of code sometimes, but you have to watch them like a hawk. The automatic prompt stuffing has been pretty good at reducing the effort to get to that result, which would otherwise have been more trouble than it's worth.
They can tear through the sort of material you'd find in tutorials and courses much more effectively, but they're far less useful for real coding problems.
I thought I had a vibe-coding scenario last week that the LLM could handle, but it botched it horribly. I did use its mess of a suggestion as reference material to guide my reading of the docs as I went about doing it by hand (though still using LLM code completion and tiny little prompts).
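For anyone wondering what "prompt stuffing" means in practice, here's a rough sketch of the idea: the tooling quietly gathers nearby project files into the prompt so the model sees more than just the line you're on. Everything below is illustrative only; gather_context, complete, and suggest are made-up names, not any particular tool's API, and the actual model call is left as a stub.

# Rough sketch of automatic "prompt stuffing". All names here are
# hypothetical and the model call is a placeholder, not a real client.
from pathlib import Path

def gather_context(repo_root: str, current_file: str, budget: int = 8000) -> str:
    """Collect the current file plus sibling .py files, trimmed to a character budget."""
    chunks = [Path(current_file).read_text()]
    for path in sorted(Path(repo_root).glob("*.py")):
        if path != Path(current_file):
            chunks.append(f"# --- {path.name} ---\n{path.read_text()}")
    return "\n\n".join(chunks)[:budget]

def complete(prompt: str) -> str:
    """Placeholder for whatever model endpoint you actually use."""
    raise NotImplementedError("wire up a real model client here")

def suggest(repo_root: str, current_file: str, instruction: str) -> str:
    """Assemble the stuffed prompt and ask for a few lines of code."""
    prompt = (
        "You are completing code in this project.\n\n"
        + gather_context(repo_root, current_file)
        + f"\n\nTask: {instruction}\nReply with a few lines of code only."
    )
    return complete(prompt)  # still needs the hawk-eyed review mentioned above

The point is just that the context gathering happens automatically now, which is what saves the time; the quality of what comes back is a separate question.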
AI is a tool. And like any tool its introduction creates proponents and enemies.
Some might say I'm a semi-professional writer. As in: I make money with things I write. From that perspective, I see both the AI slop and the benefits. I love that AI gives me an on-demand proofreader. I don't expect it to be anywhere near as good as a professional in that field. But if I want to quickly check a text I wrote for specific things, AI is great, because unlike me it hasn't been over that sentence 20 times already, so it still parses it completely.
As for AI writing - for the moment it's still pretty obvious, and it's mostly low-quality (unless some human has added their own editing).
The same way that the car, the computer, e-mail and thousands of other innovations have made some jobs obsolete, some jobs easier, and some jobs completely new, I don't see AI as a threat. And definitely not to my writing. Though good luck Amazon with the flood of AI-written garbage now clogging up your print-on-demand service.
YouTube says the AI aims to "deepen your listening experience".
Right.
Yes, I guess it will. By hopefully making people switch away from YT Music en masse.
The human using the LLM, obviously.
Trivially obviously not. The LLM wasn't trained on texts exclusively written by the human using it, so it won't ever speak like that particular person.
If someone wants to train a specific "Taarof" LLM - go ahead. I'm simply advocating against poisoning the already volatile generic LLM data with more human bullshit.
the IDE never gets tired of reminding you about the missing semicolon, the image AI never gets tired of highlighting relevant parts of a... colon?
Radiologists spend just 36% of their time interpreting images
I think this translates to a lot of the 'AI should replace them' arguments people make.
I'm a "software developer". There are certainly some code-heavy times, but there are days where I don't 'code' at all, and days like today where I've only messed with around 6 lines of code. A minority of my job is taken up with tasks that are even hypothetically in the discussion for AI replacement. It just so happens the LLMs tend to suck at most of my niche as well, but even if they were spot-on for prose-to-code, that would still only cut out a smaller fraction of my job than an outsider would guess...
That is true, but also beside the point. Communicating like "a human" is the point here. WHICH human, exactly? We already have problems with hallucinations. If we now train them on huge data sets intentionally built around the human habit of saying the opposite of what you mean, we're adding another layer of problems. Maybe get the other ones solved first?
Of course. This is a forum, not a scientific conference, so I'm not speaking in the most precise language.
It doesn't change the point.
Note that I refer to Windows Mobile, before Windows Phone 7. I consider Windows Phone 7 their first vaguely credible attempt at a mobile-centric UI, and then Windows 8 the consequence of trying to throw desktop/laptop under the bus for the sake of trying to popularize their take on mobile UI. Admittedly, I was never interested in bothering to give Windows Phone 7+ a chance, but others I knew at least made me think it was a credibly usable multi-touch UI for handhelds.
If there was a possible strategy for Microsoft to get into the mobile game, this would have been it.
Their first pass failed to really optimize for mobile at all, so you had mobile devices with clunky interfaces.
Then, when they finally saw that a more targeted UI for mobile was needed, they went the other way, screwing up the desktop by trying to make it look like their vision of a mobile OS, all while the phones still couldn't drive monitors, so there wasn't really any 'synergy' between the platforms despite throwing the desktop experience under the bus.
Now I've seen Samsung and Motorola phones drive desktops, but good luck figuring out which ones actually support DisplayPort alt mode on the USB-C port... However, it's not *too* much of a loss, because they both have utterly shitty window managers with no option to swap them out for anything vaguely more capable, despite a plethora of options in the space.
I think Android has at least gotten *some* of the message with respect to applications, carrying over from the ChromeOS support for Linux applications; however, it was sad that even as a pure Linux person who uses desktop Linux without a hint of Windows, I actually thought using Linux under ChromeOS was even worse than WSL.
Now if, by some miracle, I can have an Android phone with DisplayPort output that will let me run Plasma desktop in a normal way, they can take my money so fast. Of course, realistically speaking, they'll have like 3 or 4 people excited for that and wonder why they wasted money pandering to us...
While it may certainly reduce any sympathy, 'losing money' is still an apt term.
If I intercepted one of your paychecks, I think you'd fairly say that you 'lost money', despite having, presumably, some savings.
Now if you are a billionaire bemoaning losing a few thousand, I'm not going to be terribly moved by your plight, but I would still permit the phrase 'lost money'.
But the phrasing should be about losing money, not "losing" cars.
"Live or die, I'll make a million." -- Reebus Kneebus, before his jump to the center of the earth, Firesign Theater