Comment Re:Fear is the appropriate response. (Score 1) 68
The problem with hallucinations is that they are typically not just a little off, but completely wrong.
You are mistaken. It cannot be fixed, and there is _mathematical_ proof of that. As soon as you "fix" it with actual fact-checking, an LLM loses all its power.
No, it cannot. "Fact checking" is not something machines can do at this time, with some very limited exceptions.
Why do people keep pushing this BS?
The problem is that this leaves evidence. Remember that it used to be the "Big 5"?
As Sergey and Larry learned, there is no better way to mark an asshole.
Careful about drinking your own Kool-Aid. AMD's data center market share has risen from roughly zero to 5-10%, with revenue growing 57% year-over-year.
It is cheaper and they will probably still charge an arm and a leg.
The hallucination problem _cannot_ be fixed. It is a fundamental part of the mathematical model. Getting it fixed is about as possible as making water not wet under standard conditions.
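To make that concrete, here is a minimal toy sketch (made-up logits and vocabulary, not any real model): an LLM samples its next token from a softmax distribution, and softmax assigns nonzero probability to every token, so a fluent-but-false continuation is always reachable.

    import math, random

    # Hypothetical next-token scores after "The capital of Australia is"
    # (a real model has ~100k tokens; the principle is the same).
    logits = {"Canberra": 4.0, "Sydney": 3.2, "Melbourne": 2.1, "Paris": -1.0}

    def softmax(scores):
        m = max(scores.values())
        exps = {t: math.exp(s - m) for t, s in scores.items()}
        z = sum(exps.values())
        return {t: e / z for t, e in exps.items()}

    probs = softmax(logits)
    # Every token, including the wrong ones, gets probability > 0:
    for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        print(f"{tok}: {p:.3f}")

    # Sampling will therefore eventually emit "Sydney" (fluent, wrong).
    random.seed(0)
    tokens, weights = zip(*probs.items())
    print(random.choices(tokens, weights=weights, k=5))

Zeroing out the wrong tokens would require an external fact oracle, which, per the above, machines do not have.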
You mean like KPMG and the others usually do via human consultants?
The real problem is that this artificial moron will not know how to hide the criminal stuff.
Quantum annealers cannot scale to useful sizes for this application. Hence "stunts". Actual QCs may or may not scale, but quantum annealers will never even catch up to conventional computers for this application.
Quantum annealers are useless for factorization outside of meaningless stunts.
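For anyone wondering what these stunts look like: the demos encode factoring as minimizing (N - p*q)^2 over binary variables (a QUBO-style objective). Here is a toy classical enumeration of that objective (illustrative numbers only; real demos embed it on annealer hardware), which already shows why it does not scale: the search space is 2^(2*BITS).

    from itertools import product

    N = 15      # the classic demo number
    BITS = 3    # bits per factor; demos stay this small for a reason

    best = None
    for pbits in product([0, 1], repeat=BITS):
        for qbits in product([0, 1], repeat=BITS):
            p = sum(b << i for i, b in enumerate(pbits))
            q = sum(b << i for i, b in enumerate(qbits))
            cost = (N - p * q) ** 2
            if best is None or cost < best[0]:
                best = (cost, p, q)

    print(best)  # (0, 3, 5): works for 15, hopeless for RSA-sized N

An RSA-sized modulus would need on the order of thousands of tightly coupled logical variables; nothing close to that embeds on current annealer hardware, hence "stunts".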
Exactly.
Good point. Angry, aggressive, out for revenge and not thinking about the consequences for himself at all. Essentially an intellectual child.
The asshole here is YOU. I get that you do not like to be called out for your crap. I will do it anyway.
"I've got some amyls. We could either party later or, like, start his heart." -- "Cheech and Chong's Next Movie"