
"The Singularity Is Near"

My seventh book of the year (I'm still catching up from a couple of months ago) was Ray Kurzweil's The Singularity Is Near: When Humans Transcend Biology.

I've heard it said that anyone in high technology has to read this book -- that Kurzweil's arguments now come up so often in discussion that to be literate in our field, one has to be conversant with them. I tend to go along with that theory. Kurzweil makes dramatic claims about the future of technology and backs them up with 500 pages of charts and citations. We can't afford not to read what he has to say, debate it, and think about its implications for our future.

The key idea underlying the impending Singularity is that the pace of change of our human-created technology is accelerating and its powers are expanding at an exponential pace...

This book will argue... that within several decades information-based technologies will encompass all human knowledge and proficiency, ultimately including the pattern-recognition powers, problem-solving skills, and emotional and moral intelligence of the human brain itself...

The Singularity will represent the culmination of the merger of our biological thinking and existence with our technology, resulting in a world that is still human but that transcends our biological roots. There will be no distinction, post-Singularity, between human and machine or between physical and virtual reality.

Is what Kurzweil is saying true? I don't know. The statistics he cites in the book on exponential growth -- in microprocessor cost and performance, DNA sequencing cost, the decrease in size of mechanical devices, resolution and speed of brain scanning, and many more -- are undeniable. The question is, where are those trends leading us? Will a machine pass the Turing test by 2029? Once intelligent, will machine intelligence increase exponentially? Will humans augment their biological intelligence with machine intelligence? Kurzweil believes all this will happen, and has a schedule for it, based on extrapolating the exponential growth curves he cites.
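To make the flavor of that extrapolation concrete, here is a minimal sketch (not Kurzweil's own model) of how projecting an exponential trend works: given an assumed doubling time for compute price-performance and an assumed target, the crossover year falls out of simple arithmetic. The starting point, doubling time, and target below are illustrative placeholders, not figures from the book.

from math import log2

# Illustrative assumptions only -- not Kurzweil's actual numbers.
start_year = 2005              # year of the baseline measurement
ops_per_dollar_now = 1e9       # assumed compute purchasable per dollar at the baseline
doubling_time_years = 1.5      # assumed doubling time for price-performance
target_ops_per_dollar = 1e16   # assumed threshold (e.g., "brain-scale" compute per dollar)

# Exponential extrapolation: count the doublings needed to reach the target,
# then convert doublings into calendar years.
doublings_needed = log2(target_ops_per_dollar / ops_per_dollar_now)
crossover_year = start_year + doublings_needed * doubling_time_years

print(f"{doublings_needed:.1f} doublings -> crossover around {crossover_year:.0f}")

The point of the sketch is only that small changes in the assumed doubling time or target shift the projected date by years, which is exactly where the debate over Kurzweil's schedule lives.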

If I had to guess, I'd say that Kurzweil is on the right track, but his dates might be off. He believes that once we have low-cost computers with raw processing power equal to that of the human brain, and with a deep understanding of the brain's "architecture" in hand thanks to neuroscience advances, it won't take long for human-level intelligence to develop in machines. My hunch is that it will take longer than he thinks. For one thing, software development is much less predictable than hardware development. For another, even with the necessary hardware and software at our disposal, we will have to teach our would-be intelligent machines about the world. That process could turn out to be time-consuming. It might be that, at first, the only way to effectively bring about a human-equivalent intelligence will be to create a physical entity and allow it to explore and experience the world around it, just as we do with human children. This process alone could take years, and we might get it wrong many times before we get it right.

But agree or disagree with him, Kurzweil can't simply be dismissed. He makes a comprehensive case for his beliefs, and if his forecasts come to pass -- on whatever schedule -- they will change our world more profoundly than anything since the development of language and tool-making.
