Comment Re:AGI... (Score 1) 110
I've made no assumptions. That's the difference between me and you religious whack-jobs.
I can see why you post AC... How pathetic.
Do you honestly believe, with more money than RELIGION being poured into it, that the human species won't figure out how to replicate a close facsimile to consciousness?
We've been at it for thousands of years. We don't even know what questions to ask.
Let me guess... I just need to have more FAITH, right? LOL! You religious nuts are all the same.
Sigh... I'm not making a semantic argument. That particular Dijkstra quote, therefore, is not relevant.
You're a bit out of your depth here. Maybe you should stick to silly science fiction.
Are they not doing music anymore for each release?
See? We knew you were completely full of shit.
Here's a clue for you: If you can't support your bullshit claims, don't make them in the first place.
I was pushing back on gweihir's assertion that the human brain is optimized for intelligence;
He never made that claim:
At the most, the human brain can do human intelligence, which typically is not impressive at all. But we do not even know whether the brain can even do that, as it does seem to be rather strongly underpowered for what smart humans can do.
He did claim that "the human brain is the most powerful computing mechanism physically possible", though I suspect that's a stronger claim than he intended to make. In any case, he does not say or imply anything that can be construed as the human brain being "optimized for intelligence".
You're giving him far, far, too much credit:
Poor health reduces GDP by more than healthcare spending offsets. "Making prudent investments in global health will not only dramatically improve people's quality of life, it's a $12 trillion economic opportunity, according to new research"
Such as?
Don't bother replying. We already know you're full of shit.
Again, AGI is silly science fiction nonsense. It's no more dangerous than the monsters in your closet or any other imaginary threat.
Don't worry. Reality is more than enough to stop the fictional bad guys from developing their fictional computer programs.
Do the AI doomers actually think people will listen?
It doesn't matter. AGI and ASI are silly science fiction nonsense. You might as well be worried about Godzilla attacks and moon monsters.
AGI... Will reach ASI likely on its own
Sure. And the zombie hordes will keep the alien invaders at bay while we fight off the demon invasion.
Don't confuse bad science fiction for reality. AGI does not exist. Current LLMs are very obviously not a step along the path to their development. Don't be absurd.
Imagine a Beowulf Cluster of Trump clones
That could be trivially simulated on an Apple II with 4K of RAM.
Recreating a human brain with transistors would be possible
There is no evidence that suggests that it is possible to recreate a human brain with transistors.
You've confused your science fiction fantasy for reality. Replace 'transistors' with 'clockwork' and you'll, hopefully, see how ridiculous you sound.
But is he wrong?
Absolutely. No question.
If we assume the current hominid brain is
Baseless assumptions are why he and you are, without question, completely and unequivocally wrong.
We simply don't know enough about the problem to make any meaningful statements. We don't even know what questions to ask.
Prediction is very difficult, especially of the future. - Niels Bohr