I first stumbled upon Ray Kurzweil’s website some years ago. It immediately turned me off with titles like “Fantastic Voyage: Live Long Enough to Live Forever.” It sounded like one of those “How to be happy and get everything you want” books, which rarely accomplish anything but big bucks for the author and disappointment for the suckers who buy them.
I’ve read The Singularity Is Near now, and my opinion about the book is mixed. I don’t remember what made me look it up in the first place — but I certainly don’t regret reading it.
Kurzweil argues that evolution is an exponential process, and that each new paradigm opens the door to faster development of the next. As evidence, he shows logarithmic plots such as this:
If this were plotted linearly, most of the “interesting” events (like Homo sapiens, cities, and the Internet) would be crammed into a small chunk of recent time, compared to the eons it took for earlier life to evolve.
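The effect of the logarithmic axis is easy to demonstrate with toy numbers. The event dates below are purely illustrative stand-ins for paradigm shifts, not the book’s actual data: each one happens roughly a tenth as long ago as the previous.

```python
import math

# Hypothetical "years before present" of paradigm shifts -- illustrative
# numbers only, each event ~10x more recent than the one before it.
events = [4e9, 4e8, 4e7, 4e6, 4e5, 4e4, 4e3]

# On a linear axis, everything after the third event occupies a tiny
# sliver of the total range:
linear_span = max(events) - min(events)
recent_span = events[2] - min(events)
print(f"recent events fill {recent_span / linear_span:.1%} of a linear axis")

# On a logarithmic axis, the same events come out evenly spaced:
log_gaps = [math.log10(a) - math.log10(b) for a, b in zip(events, events[1:])]
print("log-scale gaps:", [round(g, 2) for g in log_gaps])
```

This is exactly why the log plot looks like a straight line: exponentially accelerating events have constant spacing in log time.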
For Kurzweil, slow biological evolution is not the end of things. Technological evolution, as seen today, is following the same pattern of exponential growth, and as one paradigm reaches the point where it can no longer be improved (vacuum tubes), a new one emerges to provide cheaper, faster, smaller, safer, and more energy-efficient solutions (transistors). The author lists countless examples of the same sort, and expresses optimism that the eventual end of Moore’s Law will not stop us, since we will always come up with something new and better. (This applies not only to computer hardware, but also to the resolution of brain scanning, the speed and cost of genome sequencing, and so on.)
The author goes on to approximate the computational capacity of the human brain, and by extrapolating current trends, he estimates that machines will reach that point sometime in the 2020s or ’30s. Kurzweil believes that there is nothing magical about the human mind, no “divine spirit,” and nothing stopping us from creating machine brains some day. He further postulates that biological intelligence will merge with machine intelligence, and that humankind will thus transcend biology. Once we learn how to build machines that emulate human intelligence, they will be no less “human” than a biological human, he states.
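The extrapolation itself is simple back-of-the-envelope arithmetic. The figures below (operations per second for brain emulation, the current hardware level, the doubling time) are illustrative assumptions of mine, not Kurzweil’s exact numbers:

```python
import math

# Back-of-the-envelope extrapolation in the spirit of the book's argument.
# All three figures are assumptions for illustration only.
brain_cps = 1e16        # assumed ops/sec needed to emulate a human brain
current_cps = 1e11      # assumed ops/sec per $1000 of hardware, base year
doubling_years = 1.5    # assumed doubling time of price-performance

doublings_needed = math.log2(brain_cps / current_cps)
years = doublings_needed * doubling_years
print(f"{doublings_needed:.1f} doublings, roughly {years:.0f} years away")
```

The point is less the exact date than the shape of the argument: on an exponential curve, even a five-orders-of-magnitude gap closes in a couple of decades.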
The Singularity, then, is the point on the exponential curve beyond which growth appears to be almost vertical. In other words, things will change at rates unimaginable to our current selves. Machines will improve themselves and the improved machines will in turn improve themselves and so on. Wait and see :)
The book goes on by analyzing three parallel “revolutions” that are happening right now, shaping the course of science and technology, and opening the way for the Singularity in the following decades. They are genetics, nanotechnology, and strong (human-level) artificial intelligence. Here are some points that I thought were interesting:
- Human DNA, compressed, is about 100MB; uncompressed, about 800MB. The human brain has on the order of 100 billion neurons. The orders of magnitude are different enough to show that the brain is not hard-coded in the DNA — it is a self-organizing mechanism.
- Technology at the nanometer scale is not only possible in theory, but already used in many fields. Our biological cells are the ultimate proof that technology at this scale can be made to work.
- The AI winter is a myth. Digital cameras can detect faces, computers can recognize text and voice, and planes take off and land with minimal human supervision. We haven’t reached strong AI yet, but narrow artificial intelligence is all around us.
- The path to the Singularity promises many benefits (prolonged life, cures for cancer, an end to world hunger, expanded human intelligence). Along with these come equally potent dangers (nanobots reproducing uncontrollably, nano-engineered diseases, unfriendly AI). Kurzweil maintains that we (or the future “us”) will be capable of dealing with these perils. He offers this brilliant analogy: there are all kinds of dangers on the Internet, but few people would argue that the Internet should be taken down because of them. As far as Kurzweil is concerned, the Singularity is inevitable. Short of a 1984-like scenario where all progress is stopped, or our simulation being turned off, we are climbing an exponential ladder, and the best we can do is prepare ourselves for what’s coming next.
- As technology advances, fundamentalist voices will get louder and louder. Most people will not readily embrace food that’s grown in a lab, or artificial red blood cells, or nanobots in their brains. Fundamentalism comes from a desire to keep things as they are. The author claims that technology will ultimately prevail through the benefits it brings.
- All objects, ideas, and humans are essentially patterns of information, and information is ultimately the only valuable thing. Information is not lost as long as someone cares about it.
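The genome-versus-brain comparison in the first point above can be sanity-checked with quick arithmetic. The synapse count (~100 trillion connections) is my own added assumption; the review itself only mentions neuron and genome sizes:

```python
# Rough orders-of-magnitude check of the book's argument.
genome_bytes = 8e8     # ~800 MB uncompressed human genome
neurons = 1e11         # ~100 billion neurons
synapses = 1e14        # ~100 trillion connections (my assumption)

print(f"bytes per neuron:  {genome_bytes / neurons:.3f}")
print(f"bytes per synapse: {genome_bytes / synapses:.6f}")
# Far less than one byte per connection: the genome cannot spell out
# the wiring diagram, so the brain must largely self-organize.
```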
For the most part, I think Kurzweil’s points are valid. I have some qualms about his approach, which I will detail below, but for now I highly recommend the book. It is about 500 pages, with an additional 100 pages of notes. Most of the notes are references to other works, and they helped convince me that Kurzweil is not just some crazy guy who wants to upload his brain and live forever. Brilliant minds such as Richard Feynman, John von Neumann, and countless others whose names didn’t jump out at me have also grasped the concepts of exponentially accelerating change, and the promise of genetics, nanotechnology, and artificial intelligence.
The book made me aware that certain technologies were more real than I thought, and it helped me see the future in a different light. I am more glad than ever that I live today, and not in some other century. I think our generation is going to see some pretty amazing things in the decades to come.
Now for the criticism:
- I think the book could be a lot shorter. Some parts were repetitive, and I would have preferred a more focused approach. There were a lot of sections about current research projects, and I do not see the point in presenting so many of them in a book about the future, since research projects come and go all the time.
- Ray Kurzweil does a great job of explaining the exponential trends in evolution, but when he thinks about the future, his vision is blunted by the present. No one predicted the Internet, or any other world-changing technology. That’s why they’re so amazing in the first place! So why assume there won’t be more of them in times to come? I think exponential growth is a great model, but the only certain thing it tells us is that things will change, and fast.
- The book argues that machines exhibiting artificial intelligence will be no less human than a biological human. OK, but is that enough? In the past, we have been very good at bending ethics and humaneness to bring about great destruction (the Holocaust, the colonization of the Americas), so what about our machines?
- I’m not that excited about genetically modified food. I don’t know to what degree that is a result of anti-GM propaganda or romanticism for the “old” natural world. I want more proof that GM food is safe. Does that make me a fundamentalist? I’m also not too happy with the way intellectual property works nowadays, nor with the government violating privacy “in the interest of safety.” Overall Kurzweil is much more optimistic about these issues than I am.
What do you think? Is the Singularity the dream of a maniac or the destiny of the Universe? Is the world moving in the right direction or is it tumbling down to Hell? Is there a point in trying to predict and make sense of our future?
“Will robots inherit the earth? Yes, but they will be our children.” (Marvin Minsky)