Richard Loosemore Topics




The Maverick Nanny with a Dopamine Drip: Debunking Fallacies in the Theory of AI Motivation

by Richard Loosemore

My goal in this article is to demolish the AI Doomsday scenarios that are being heavily publicized by the Machine Intelligence Research Institute, the Future of Humanity Institute, and others, and which have now found their way into the farthest corners of the popular press. These doomsday scenarios are logically incoherent at such a fundamental level that they can be dismissed as extremely implausible: they require the AI to be so unstable that it could never reach the level of intelligence at which it would become dangerous. On a more constructive and optimistic note, I will argue that even if someone did try to build the kind of unstable AI system that might lead to one of the doomsday behaviors, the system itself would immediately detect the offending logical contradiction in its design and spontaneously self-modify to make itself safe.



The Fallacy of Dumb Superintelligence

by Richard Loosemore

This is what a New Yorker article has to say on the subject of “Moral Machines”: “An all-powerful computer that was programmed to maximize human pleasure, for example, might consign us all to an intravenous dopamine drip.”



Why an Intelligence Explosion is Probable

by Richard Loosemore

(Co-authored with IEET Fellow Ben Goertzel) There is currently no good reason to believe that once a human-level AGI capable of understanding its own design is achieved, an intelligence explosion will fail to ensue. A thousand years of new science and technology could arrive in one year. An intelligence explosion of such magnitude would bring us into a domain that our current science, technology, and conceptual framework are not equipped to deal with, so prediction beyond this stage is best done once the intelligence explosion has already progressed significantly.




The Lifeboat Foundation: A stealth attack on scientists?

by Richard Loosemore

It turns out that the Lifeboat Foundation (and this is a direct quote from its founder, Eric Klien) is “a Trojan Horse” that is (here I interpret the rest of what Klien says) designed to hoodwink the people recruited to be its members.




Don’t let the bastards get you from behind!

by Richard Loosemore

One day when I was a young teenager, living out in the countryside in the south of England, a dear old guy I knew drove past me when I was on a long solitary walk. He recognized me and pulled over to ask if I wanted a ride down to the village.


The IEET is a 501(c)(3) non-profit, tax-exempt organization registered in the State of Connecticut in the United States.

Contact: Executive Director, Dr. James J. Hughes,
Williams 119, Trinity College, 300 Summit St., Hartford CT 06106 USA 
Email: director @ ieet.org     phone: 860-297-2376