Singularitarianism

Singularitarianism is a moral philosophy based upon the belief that a technological Singularity, a theoretical future point reached during a period of accelerating change following the creation of a superintelligence, is possible; it advocates deliberate action to bring such an entity into existence and to ensure its safety.

While many futurists and transhumanists speculate on the possibility and nature of this technological development, Singularitarians believe it is not only possible, but desirable if, and only if, guided safely. Accordingly, some dedicate their lives to acting in ways they believe will contribute to its safe implementation.

The term “Singularitarian” was originally defined by Extropian Mark Plus in 1991 to mean “one who believes the concept of a Singularity.” This term has since been redefined to mean “Singularity activist” or “friend of the Singularity”; that is, one who acts so as to bring about the Singularity.
Ray Kurzweil, author of The Singularity Is Near, defines a Singularitarian as someone “who understands the Singularity and who has reflected on its implications for his or her own life.”

In his 2000 essay, “Singularitarian Principles”, Eliezer Yudkowsky writes that there are four qualities that define a Singularitarian:

* A Singularitarian believes that the Singularity is possible and desirable.
* A Singularitarian actually works to bring about the Singularity.
* A Singularitarian views the Singularity as an entirely secular, non-mystical process, not the culmination of any form of religious prophecy or destiny.
* A Singularitarian believes the Singularity should benefit the entire world, and should not be a means to benefit any specific individual or group.

Whereas transhumanism is sometimes signified by the symbol H+, Singularitarianism can be denoted by S^.

Singularitarianism is presently a small movement; many who believe a technological Singularity is possible do not adopt Singularitarianism as a moral philosophy.

In June 2000 Eliezer Yudkowsky, Brian Atkins and Sabine Atkins founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI. Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.

With the support of NASA, Google and a broad range of technology thought leaders and entrepreneurs, Kurzweil’s Singularity University is scheduled to open in June 2009 at the NASA Research Park in Silicon Valley with the goal of preparing the next generation of leaders to address the challenges of accelerating change.

Many mainstream critics ridicule the Singularity as “the Rapture for nerds” and dismiss Singularitarianism as a pseudoreligion of fringe science. Singularitarians point out the many differences between the idea of the Rapture and Singularitarianism.

IEET Links:
Waiting for the Great Leap Forward
Millennial Tendencies in Responses to Apocalyptic Threats
Libertopian Doublethink on the Singularity

Sources:
Wikipedia on Singularitarianism
Black Belt Bayesian: Rapture of the Nerds, Not
