Singing the Singularity
Mike Treder
2009-07-16 00:00:00

On October 3, 2009, the fourth annual Singularity Summit will convene, this time in New York City. Among the speakers featured in the two-day event are IEET fellows Ben Goertzel and Aubrey de Grey, along with Ray Kurzweil, Anders Sandberg, Robin Hanson, Eliezer Yudkowsky, Greg Benford, and many others.

So what's it all about?

Humming the Tune

Organizers of the Summit describe the Singularity as "an 'event horizon' in the predictability of human technological development beyond which present models of the future may cease to give reliable answers, following the creation of strong AI or the enhancement of human intelligence."

Our IEET Encyclopedia of Terms and Ideas offers this definition: "The Singularity is a theorized future point of discontinuity when events will accelerate at such a pace that normal unenhanced humans will be unable to predict or even understand the rapid changes occurring in the world around them." It is assumed, usually, that this point will be reached as a result of a coming "intelligence explosion," most likely driven by a powerful recursively self-improving AI.

In his book The Singularity is Near, inventor and futurist Ray Kurzweil says:

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future.

And, finally, the Summit organizers write: "While some regard the Singularity as a positive event and work to hasten its arrival, others view the Singularity as dangerous, undesirable, or unlikely. The most practical means for initiating the Singularity are debated, as are how, or whether, it can be influenced or avoided if dangerous."

Dubbed by some, disparagingly, as a "Rapture of the Nerds," the concept of the Technological Singularity is often derided as wish fulfillment for perpetually adolescent (and typically male) sci-fi dreamers.

Unfortunately, such dismissive criticisms, deserved or not, tend to draw attention away from the beneficial aspects of futuristic speculation.

We'll explore those benefits, but first let's consider the downsides of 'singularitarianism', defined by one of its leading advocates as the belief that "the Singularity is possible and desirable."

Hitting the Wrong Notes

Insisting on the possibility -- or, even more strongly, asserting the inevitability -- of an uncertain and debatable but incredibly momentous event leaves proponents vulnerable to the charge that they lack rigor and discipline in their thinking, that they have fallen prey to hopefulness and shed any semblance of healthy skepticism. If they cannot restrain themselves from heartily endorsing an unprovable proposition, then what credibility have they for other declarations or recommendations they might make?

This is, of course, a familiar dilemma for many futurists (especially transhumanists like myself) who choose to speculate about matters both unclear and controversial, and who can't do what we do without risking criticism and even mockery. The fact that these areas are still hazy should not restrain us from trying to shine a light ahead into the mist, but at the same time an emphasis on clear thinking and scientific rationality is greatly advisable.

More troubling is the suggestion by some (all?) singularitarians that the outcome they seek is not only possible but desirable. Given the substantial amount of uncertainty -- which they themselves admit -- surrounding the nature and impacts of such an occurrence, it seems imprudent to stamp the Singularity as unquestionably "a good thing." Worse yet, some who proudly say they're working to bring about the Singularity have the temerity to proclaim that they alone hold the keys to making it a "friendly" event. Besides sounding childishly naive, such a claim also invokes the specter of technocracy: if only all the big issues of this world were left to the few really smart people to solve, everything would turn out fine. It's implied, moreover, that meddling democrats and pesky government regulation will only slow things down and might even prevent the smart singularitarians from saving the day.

In view of all the damage done by the self-labeled "best and brightest" over the last few centuries, it's surprising that this kind of doctrine still attracts adherents.

Those who promote the idea that a Technological Singularity is not only possible and desirable but that its advent can be hastened through our efforts must be aware of the obvious parallels between their own beliefs and those of Christian Millenarians. And in addition to courting ridicule, this "bring it on" position tempts an assumption that whatever problems we have today will be fixed soon enough, so why worry about them now? Poverty, disease, malnutrition, war, pollution, climate change -- all these will be simple matters for the superintelligence that we (or, rather, they) are about to create.

Finally, the proposal that a Singularity can be managed for "friendliness" seems hopelessly hubristic. Prometheus did indeed steal fire from the gods and we have used it in countless beneficial ways for humanity, but we have also wielded its power to kill millions of people down through the centuries and continuing today. Nearly every significant technological advance has been dual-use, providing both opportunities and threats. We have not yet learned how to tame ourselves and our impulses well enough to prevent horrible things from happening with the aid of technology, so why should we assume that we will do so in the near future?

Getting Back in Key

Having said all that, it's important for me to state that: 1) I do think it's reasonable to expect some sort of intelligence-driven discontinuity, probably before the end of this century; 2) we are well-advised to learn as much as we can about this possibility; and 3) we should seek whatever ways there might be to influence events toward a mostly positive outcome. I differ from the singularitarians in my skepticism about our chances of keeping complete control over what will happen, and I certainly try to separate myself from those who offer a blithe assurance that since future technologies will easily solve our big problems, we can comfortably ignore them today.

If, however, futurists are able to restrain themselves from unbridled techno-optimism (as well as from cynical techno-pessimism), if they can maintain a healthy skepticism toward unsubstantiated claims, if they can promote and demonstrate reliance on scientific rationality, if they will assiduously resist the siren call of technocracy, if they understand and emphasize the differences between their own work and the fantasies of religious rapturists, if they strive for humility and an honest recognition of their own human limitations, and most of all, if they will recognize that nothing is certain and that working to relieve suffering today is every bit as important as chasing the promises of future technological potential, then discussion of a Singularity and its pros and cons is a worthwhile effort.