Singularity

The Singularity is a theoretical future point, occurring during a period of accelerating change sometime after the creation of a superintelligence.

I. J. Good first wrote of an “intelligence explosion,” suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.

Vernor Vinge later called this event “the Singularity,” drawing an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would follow an intelligence explosion. In the 1980s, Vinge popularized the Singularity in lectures, essays, and science fiction. More recently, prominent technologists such as Bill Joy, co-founder of Sun Microsystems, have voiced concern over the potential dangers of Vinge’s Singularity.

Futurist Ray Kurzweil argues that the inevitability of a technological Singularity is implied by a long-term pattern of accelerating change that generalizes Moore’s Law to technologies predating the integrated circuit, and which he argues will continue to other technologies not yet invented.

The term “Singularity” has come to mean many different things to different people. Michael Anissimov has noted that, given 50 different people, one will likely find 50 different ideas of what the Singularity is, and that the ideas of exponential growth, radical life extension, mind uploading, the feasibility of artificial general intelligence, and transhuman intelligence have become conflated. Anissimov himself writes that the Singularity is just about smarter-than-human intelligence.


Accelerating change
Some Singularity proponents argue for its inevitability by extrapolating past trends, especially the shortening gaps between successive improvements in technology.

Ray Kurzweil’s analysis of history concludes that technological progress follows a pattern of exponential growth, following what he calls the Law of Accelerating Returns. He generalizes Moore’s Law, which describes the geometric growth of integrated-circuit complexity, to include technologies from far before the integrated circuit.

Whenever technology approaches a barrier, Kurzweil writes, new technologies will cross it. He predicts that paradigm shifts will become increasingly common, leading to “technological change so rapid and profound it represents a rupture in the fabric of human history.” Kurzweil believes the Singularity will occur before the end of the 21st century, setting the date at 2045. His prediction differs from Vinge’s in that he foresees a gradual ascent to the Singularity, rather than Vinge’s rapidly self-improving superhuman intelligence. Either way, the conclusion is that an artificial intelligence capable of improving on its own design is also faced with a Singularity.
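The kind of extrapolation described above can be sketched with a fixed-period doubling model. This is purely illustrative and not Kurzweil's own data: the two-year doubling period and the 2025 baseline are assumptions chosen for arithmetic convenience.

```python
# Toy fixed-rate doubling model in the spirit of Moore's Law
# (the 2-year doubling period and the 2025 baseline are assumptions).

def doublings(start_year, end_year, doubling_period=2.0):
    """Number of capability doublings between two years at a fixed rate."""
    return (end_year - start_year) / doubling_period

def relative_capability(start_year, end_year, doubling_period=2.0):
    """Capability at end_year relative to start_year, given steady doubling."""
    return 2 ** doublings(start_year, end_year, doubling_period)

growth = relative_capability(2025, 2045)
# 10 doublings over 20 years -> a factor of 1024
```

Note that a fixed doubling period actually understates Kurzweil's claim: the Law of Accelerating Returns holds that the doubling periods themselves shrink over time, so growth is faster than any single fixed-rate exponential.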

IEET Links:
Hughes interviewed on Singularity and AI
Hughes on Regulating AI at the Singularity Summit
George Dvorsky’s take on the Singularity
Aubrey de Grey’s talk on Longevity and the Singularity

See also:
Singularitarianism

Sources:
The Word “Singularity” Has Lost All Meaning
The Singularity is Just About Smarter than Human Intelligence
Wikipedia on the Technological Singularity

Category:Encyclopedia
