Comment Re:Beowulf cluster? (Score 1) 71
Can you imagine AI designing a beowulf cluster to
... ?
$ nohup paperclip_d &
The only solution is to prevent companies from collecting and maintaining this level of information.
Yes, this has been said by many people, in many countries, over many decades. In some cases this sensitive data is kept for years after a customer ceases doing business.
In this case I shake my head at the laziness of the security team: credential management across Sec/Sys/Dev/Ops/CM is pretty standard, but it does need a company culture that cares, or one ends up with the main admin password pair being "admin/nimda" because so many scripts would break otherwise.
Putting in regulations, as you say, is the only way to add some swiss cheese, but the beancounters and PHBs won't like the cost, so nothing will be done. And Marketing? They drool at the profits, so the boardroom push to "grab all the data, monetise the living jimmies out of them" is real too.
"* The prophecy becomes self-fulfilling through material concentration — as resources flow towards AGI development, alternative approaches to AI starve..."
For the love of COBOL, if you are going to write a diatribe about the evils of AGI, don't use ChatGPT to do your homework.
That said, however, the new site is horrible: it's impossible to find genuinely needed information like the aforementioned synoptic charts, or even observations for the past three days. Important information like HF MUFs (space weather) isn't even linked any more. The radar is inaccurate and a Gen-Z technicolour yawn.
Plus of course many people who use it are older, and change is harder.
They'd be extra stupid if they didn't know.
Indeed. But I think they don't realise it's plonked onto Google/Bing/Ducky/Ecosia/Kagi searches for the whole world, and likely assume it's something nobody will ever see unless the (incorrectly assumed) private link is disclosed to them.
Secondly, ChatGPT always wants to pull you into a conversation; it always wants you to keep interacting. After answering your question there's *always* a follow-up "would you like me to..."
When I did use the hallucination machine - before the uncanny valley effect really became unbearable - I tried adding "Please do not suffix any engagement queries". It sticks for a while, but like asking it not to use emdashes, it's an instruction that needs regular reinforcing as the context grows. So you end up pasting your whole list - "Please do not (*) suffix any engagement queries (*) apologise nor say you will do better (*) use emdashes (*) use the word 'testament' (*) use purple prose (*) blow smoke up my arse... (etc)" - every few prompts so it stays reinforced.
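For what it's worth, the re-pasting can be semi-automated. A minimal sketch, assuming you keep the list in a plain text file and just glue it in front of each prompt (the filename and list contents below are purely my own examples, not anything ChatGPT-specific):

```shell
# Keep the "behave yourself" list in a file and prepend it to every prompt.
# /tmp/prefs.txt and the rules in it are made-up examples.
cat > /tmp/prefs.txt <<'EOF'
Please do not:
* suffix any engagement queries
* apologise nor say you will do better
* use emdashes
EOF

prompt="Summarise this log file."

# Combine preferences + actual question into one paste-ready blob.
full=$(cat /tmp/prefs.txt; printf '\n%s\n' "$prompt")
printf '%s\n' "$full"
```

Then pipe or paste $full into whatever chat box you're stuck with; at least the list is in one place instead of your clipboard history.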
And finally, ChatGPT remembers everything. I've recently discovered that it remembers things even after you delete your projects and conversations *and* tell it to forget everything.
It has a RAG-type memory that extracts 'facts' from your chats and searches through them on every prompt. Before I gave up on the fawn-machine I did find a section where you can edit those. Another one for the reinforced preprompt list: "Please do not use memory unless I specifically ask". But of course that also keeps dropping off, so you do need to paste the entire preferences list every few prompts, or SmithersGPT trims it from its context and falls back to the corporate rules.
How many of those 47,000 chats had users who were actually aware that other people can read their shit?
Probably none. Although it is mentioned that chats can become public in some cases, it's not really made clear that they are also easily searchable by absolutely anyone. Add to this that most people have memories shorter than a goldfish's, so by the time they finally share their little personal echo chamber with the fawning SmithersGPT they've forgotten what personal info they put early into the thread, and that all the preprompting and "memories" pulled from RAG-like storage will be infecting their shared thread as well.
But this may also be a sign of the current times. People are too used to sharing everything, from what they ate for breakfast to the size and texture of their bowel movements. So it may well be that they simply don't care about, nor understand, the danger in it.
What features were they trying to even add?
Agentic clippy needs to be able to use all the features of your PC, not just access all your data.
On top of this, a lot of automated tasks can rely on stored passwords: backups, scripts, crons and the like. If he reset passwords, that could have knocked a lot of system housekeeping out of the picture too, and depending on how much shell-script spaghetti has grown over the years, the breakage can be tough to find.
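To illustrate, here's a minimal sketch of the kind of cron job that breaks silently after a password reset: a script reading a credential stashed in a file years ago. Everything here (PASSFILE, the credential, the commented-out dump command) is made up for illustration, and the block writes its own demo password so it runs standalone:

```shell
#!/bin/sh
# Hypothetical nightly backup job that depends on a stored password.
PASSFILE="${PASSFILE:-/tmp/demo_backup_pass}"

# Simulate the credential some admin stored years ago:
echo "hunter2" > "$PASSFILE"

if [ ! -r "$PASSFILE" ]; then
    echo "backup: cannot read $PASSFILE - was the password rotated?" >&2
    exit 1
fi

PASS=$(cat "$PASSFILE")
# The real dump would go here, e.g. something like:
#   mysqldump -u backup -p"$PASS" mydb > /backups/mydb.sql
echo "backup: would run dump with stored credential (${#PASS} chars)"
# After a reset, this script still "succeeds" locally - the auth failure
# only shows up in the dump tool's output, buried in cron mail nobody reads.
```

Multiply that by years of accumulated spaghetti and you see why nobody wants to touch the admin password.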
As long as it comes wrapped in a cuddly Teddy Bear shell, these parents seemingly lose all sense of safety.
"A mind is a terrible thing to have leaking out your ears." -- The League of Sadistic Telepaths