There are three ``flavors'' of the Singularity (link). Accelerating Change, advocated by Kurzweil: exponential technological growth eventually yields technology powerful enough for full-brain emulation. Event Horizon, advocated by Vernor Vinge: we are likely to either significantly augment human intelligence or create a new one, and all bets are off for the future after that (``if you could predict a Grandmaster's moves in chess, you would be at least as smart as the Grandmaster''). The third type is Intelligence Explosion, advocated by Eliezer Yudkowsky: once we create a recursively self-modifying intelligence, one of the things we can expect it to do is improve its own intelligence, entering a positive feedback cycle that goes FOOM, ``like a chain of nuclear fissions gone critical.''
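The positive feedback cycle in the third flavor can be sketched as a toy model. This is an illustration only; the update rule and growth rate are made-up assumptions, not a real claim about AI dynamics:

```python
# Toy model of recursive self-improvement. The assumption (hypothetical,
# purely for illustration) is that each round of improvement is
# proportional to the system's current intelligence.

def self_improvement(intelligence, rate, steps):
    """Iterate the feedback loop: smarter -> better at getting smarter."""
    history = [intelligence]
    for _ in range(steps):
        intelligence += rate * intelligence  # improvement scales with intelligence
        history.append(intelligence)
    return history

trajectory = self_improvement(intelligence=1.0, rate=0.5, steps=10)
# The feedback makes growth exponential: each entry is 1.5x the last,
# so the final value is 1.5 ** 10.
print(trajectory[-1])
```

Under these assumptions the trajectory is a plain geometric series; the ``FOOM'' intuition is just that any rule where improvement compounds on itself leaves linear progress behind very quickly.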
For the third type, basing the AI on the human brain isn't required or even desired. Rather, work is being done on a mathematical model of how intelligence works and how it can improve itself in a stable way. Yes, the human brain is fairly complex, but complexity is not an argument. In 1000 years, do these critics really believe no one will have figured out the brain? Look at what science has given us in just the last 400 years! Over a 1000-year horizon, only a catastrophic global extinction or collapse would make me consider the possibility of never reaching a point of intelligence improvement. At the very least, speed looks like low-hanging fruit: neurons fire at most a few hundred times per second, while transistors switch billions of times per second, so a mind emulated on faster hardware could run enormously faster than a biological one.
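The speed gap is worth a back-of-the-envelope calculation. The numbers below are rough illustrative assumptions (real firing rates vary, and comparing a neuron's firing rate to a transistor's clock is a crude analogy at best), but they show the orders of magnitude involved:

```python
# Crude speed comparison between biological and electronic switching.
# Both figures are loose assumptions for illustration, not measurements.

neuron_rate_hz = 200          # rough upper bound on sustained neural firing
transistor_clock_hz = 2e9     # a modest 2 GHz processor

# Naive serial speedup if one "step" of thought scaled with switching speed.
speedup = transistor_clock_hz / neuron_rate_hz
print(f"Naive serial speedup factor: {speedup:.0e}")
```

Even this naive ratio comes out to ten million, which is the intuition behind expecting emulated minds to think much faster than biological ones, whatever the exact constants turn out to be.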
No One Knows What Science Doesn't Know. Just because mainstream scientists haven't figured out consciousness yet doesn't mean it's an inherently, irreducibly complex phenomenon. Either it's an illusion, and a poorly defined one at that (I can't find any agreement on what consciousness itself even is), or it's something physical that exists in the structure of our brains (but not, say, in the brains of dolphins?). As I already said, complexity is not an argument. Go back two hundred years and take the now well-established fundamental laws of quantum mechanics with you. You'll get remarks like ``That's not true'' or ``Many-Worlds? That's too complex! Reality works with a single global world!'' 200 years ago Einstein hadn't yet thrown out our intuitive notions of time and space; people would think you were crazy. And beyond QM, I believe I mentioned in a previous memo that back when Vitalism was popular, its advocates claimed the phenomenon of life was forever beyond the reach of science, so complex was it. Look where we are now.
Back to the third flavor of the Singularity: programming a Friendly AI. All that is missing is a specific piece of knowledge; the hardware we have now would be sufficient to run the software. For all we know, someone could already have figured it out in a bunker, screwed up the Friendliness part, and within a week the AI will have solved the protein folding problem, ordered some DNA synthesis through internet services (which apparently have around 72-hour turnaround times), and suddenly it will have full-blown molecular nanotechnology to use for whatever its purpose is. (Say, transforming all available matter into smiley faces, because its human programmers told it to value ``happy humans'', where ``happy'' was interpreted as ``smiling''.)
This Horgan guy amazes me with his vehemence. As I see it, we should have fewer string theorists trying to be the next Einstein and more smart people working on the Friendly AI problem. Intelligence is more powerful than any technology intelligence has yet created. Our own intelligence is a hack, thrown together by evolution; I think we can do better.
As a parting remark, he tries to claim belief in the Singularity is on par with religious belief. Right, just like Atheism is! I can take a joke, such as ``Rapture of the Nerds'', but people then seem to use the joke as an actual argument. It's not one, and the comparison really isn't there at all. The Singularity movement has rational, very smart people behind it, unlike religious leaders who ask for ignorance. The Singularity is entirely natural: no supernatural deities or phenomena have to be invoked. The Singularity isn't certain, and any believer should be worried about the existential risks that could very well strike before or after it (anything from nuclear or nanotechnological war to a bad AI design). The Singularity is human-caused; we're not waiting around for some external being to do it for us. And unlike the Rapture, a successful Singularity isn't one that will benefit just the nerds who believed in it. The Rapture puts forth the idea that only good Christians benefit, the rest of us be damned; the Singularity is not a revenge fantasy, it's an idea for improving the human condition. There are no religious ceremonies attached (just as there are none attached to being an Atheist). Lastly, only those who haven't read up on the subject believe we'll create an AI that thinks remarkably like a human, the way characterizations of God always come out looking human. No: mind design space is huge, and we're likely to get a very alien mind. The important part is plucking out of that design space a mind that is friendly toward humans, and, as a valid criticism of the Singularity people, not enough of them are focusing on this problem. Greater intelligence does not automatically translate into greater compassion.
Posted on 2010-05-17 by Jach