It depends.
If someone is in active recovery, they may know exactly what they can handle at that point. For example, someone with sexual abuse in their history knows they don't need to see a depiction of someone else's. That's not going to help them face and overcome their own trauma; more likely it will set them back for the day. If they're going into an R-rated movie, the trigger warning is right there in the rating system. That's not the case for all media.
Or, it might be that on some days they can handle it, and some days they can't. I don't always want to hear about religious trauma because I already have that T-shirt, but on other days and in other contexts I might be curious.
I appreciate trigger warnings though I don't need them anymore.
LLM-based AI can do some pretty impressive things. It *seems* to answer questions with remarkable accuracy, and it instantly produces code in response to often ridiculously vague input queries:
"Write me an app to track ant farms in Vietnam"
And what do you know? You get something that seems surprisingly useful!
Except that it's all an illusion.
I'm an experienced software developer (25 years now), and I focus on information lifecycle apps targeting workgroups and enterprises (organizations of 50+ people). As I write this, about 20,000 people are concurrently using an app I created.
Over the past year or so, I've been trying to deeply integrate AI into my workflow. It's there when I write code in VSCode, it's there when I write sysadmin/shell code, and it's there when I'm refactoring.
The more I use it, and the "better" it gets, the more frustrating I find it. It's only somewhat useful in the area where most coding projects fail: debugging.
No matter how it seems, LLM-based AI doesn't *understand* anything. It's just ever-more-clever trickery based on word prediction. As such, it serves only as another abstraction that still must be understood and reviewed by a real person with actual understanding, or the result is untrustworthy, unstable, and insecure "vibe code" that is largely worthless outside of securing VC funding. Which is, perhaps, the thing AI does best: helping unprepared people get VC funding.
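To be concrete about what "word prediction" means, here's a toy sketch. This is a made-up bigram model for illustration only, nothing like a real transformer, but the generation loop is the same shape: predict the most likely next word, append it, repeat:

    from collections import Counter, defaultdict

    def train(text):
        """Count which word follows which -- a toy bigram 'model'."""
        model = defaultdict(Counter)
        words = text.split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
        return model

    def generate(model, start, max_tokens=10):
        """Greedy generation: always append the most likely next word."""
        out = [start]
        for _ in range(max_tokens):
            followers = model.get(out[-1])
            if not followers:
                break  # never saw this word in training; nothing to predict
            out.append(followers.most_common(1)[0][0])
        return " ".join(out)

    corpus = "the cat sat on the mat and the cat ran off the mat"
    print(generate(train(corpus), "the"))
    # -> "the cat sat on the cat sat on the cat sat"

The toy version loops forever on "the cat sat on" because all it knows is which word tends to follow which. Real LLMs are enormously better at the prediction, but the prediction is still all there is.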
You still need real people to get code you can live with, depend on, and grow with.
Who is "we"?
Why "should" a video game not be "killed"? Why "should" a company have to keep a game going into perpetuity? Ditto operating systems.
That's nice for Enterprise people. What about the average user? How does one get it? How much does it cost?...
If you sell diamonds, you cannot expect to have many customers. But a diamond is a diamond even if there are no customers. -- Swami Prabhupada