- Three years ago, on January 13, 2011, we humans were dethroned by a computer on the quiz show "Jeopardy!". A year later, a computer was licensed to drive cars in Nevada after being judged safer than a human. What's next? Will computers eventually beat us at all tasks, developing superhuman intelligence?
I have little doubt that this can happen: our brains are a bunch of particles obeying the laws of physics, and there's no physical law precluding particles from being arranged in ways that can perform even more advanced computations. But will it happen anytime soon? Many experts are skeptical, while others such as Ray Kurzweil predict it will happen by 2030. What I think is quite clear is that if it happens, the effects will be explosive: as Irving Good realized in 1965, machines with superhuman intelligence could rapidly design even better machines. Vernor Vinge called the resulting intelligence explosion The Singularity, arguing that it was a point beyond which it was impossible for us to make reliable predictions.
Will there be a singularity within our lifetime? And is this something we should work for or against? On one hand, it could potentially solve most of our problems, even mortality. It could also open up space, the final frontier: unshackled by the limitations of our human bodies, such advanced life could rise up and eventually make much of our observable universe come alive. On the other hand, it could destroy life as we know it and everything we care about: there are ample doomsday scenarios that look nothing like the Terminator movies, but are far more terrifying.
I think it's fair to say that we're nowhere near consensus on either of these two questions, but that doesn't mean it's rational for us to do nothing about the issue. It could be the best or worst thing ever to happen to humanity, so if there's even a 1% chance that there'll be a singularity in our lifetime, I think a reasonable precaution would be to spend at least 1% of our GDP studying the issue and deciding what to do about it. Yet we largely ignore it.