It’s 2010 — our 2010 — and an artificial intelligence is one of the most powerful entities on Earth. It manages trillions of dollars in resources, governments shape their policies according to its reactions, and, while some people revere it as literally incapable of error and others despise it as a catastrophic tyrant, everybody is keenly aware of its existence and power.
Humans are animals that build tools to enhance their physiology. It is the use of tools that helped expand the human brain into a larger, more complex system than that of early hominids. “Tools and bigger brains mark the beginning of a distinctly human line of evolution.” (Kelly 2010, 22) According to Jared Diamond, early hominids lacked innovation: “Neanderthal tools show no variation in either time or space to suggest that most human of characteristics, innovation.” (Diamond 2006, 44) What will we do with nanotechnology and AGI?
Doug Rushkoff was interviewed by progressive journalist Laura Flanders for Grit TV about his new book Program or Be Programmed: “We need to take control of the new computer networking tools all around us, argues author and thinker Douglas Rushkoff, or else we’ll wind up at the mercy of those who do take control. That’s part of the argument Rushkoff makes in his new book, Program or Be Programmed, out now from our friends at OR Books. With some basic computer and programming literacy, Rushkoff notes, we can take control of our lives, create value for ourselves, and perhaps let the big institutions that think they control us, from banks to media moguls, just wither away.”
Peter Dickens has penned a provocative article in the Monthly Review, “The Humanization of the Cosmos—To What End?” Dickens approaches the subject of space colonization from a decidedly leftist perspective, and wonders how the process can unfold without the exploitation of humans and the environment.
I am writing this after having responded to a respected friend, a bioethicist with whom I am connected via Facebook. In his photo albums, he has a picture of a protected area for dogs in Thailand. This got me thinking.
Every generation has had legends of a coming downfall. Whether you call it The End Times, Armageddon, Apocalypse, Doomsday, Ragnarok, The Population Bomb… we’ve long been fascinated by prophecies of devastation and doom.
Scientist and best-selling novelist David Brin explores the concepts and facts behind end-of-the-world tales, and how modern civilization can start limiting the risk.
In science fiction, when humanity is faced with existential crises, we turn to great minds attached to great hearts. While we aren’t under alien attack or facing sentient machines, our world has its own share of problems. Human cognitive enhancement might just be the solution from which all other solutions are born; or maybe it brings too many risks of its own.
Slate magazine and New America Foundation are holding a seminar on the biology and policy implications of radical life extension today, with help from the IEET’s Sean Hays and with IEET Fellow Aubrey de Grey as a speaker.
How old is too old? Some scientists think the body has a metabolic stop-sign at about age 122; others think that through new technologies, genetics, and robotics we can expand our longevity to a quarter millennium. And one man, IEET Fellow Aubrey de Grey, thinks immortality is possible: that the first human who will reach 1000 years of age has already been born.
But with great age, our assumptions about life, family, work, taxes, government, health, sex… our humanness… would change. Are you ready for the long life?
Click here to listen to an interview featuring Aubrey de Grey and Joel Garreau.
What if America lost its knack for making things? IEET Fellow David Brin’s new graphic novel Tinkerers is set in the year 2024, and combines art with history and tech to explore where the U.S. went wrong.
There have been monsters in fiction ever since there was any fiction at all. They are — always — scary, and sometimes attractive. But in recent years they have also begun to be something else, something never seen before: they are our colleagues.
Jeffrey Toobin talks with Tim Wu, a professor at Columbia Law School and the author of The Master Switch: The Rise and Fall of Information Empires, about how forms of communication, from the telephone to the Internet, are eventually controlled by monopolies; the battle between Apple and Google; and the future of information technology.
The quantified self movement is really starting to gain steam, mostly on account of a slew of new technologies and services that are making personalized metrics easier and more meaningful. It’s truly a case where the dream is coming true; in short order we will be able to track the most minute details of our body’s functioning, have that data analyzed, and be given a set of prescriptions to help us optimize our health based on a predetermined set of goals.
Dr. J. chats with Max More, founder of the Extropy Institute and one of the founders of contemporary transhumanism. They discuss the relationship of transhumanism and religion, virtue theory versus utilitarianism and the ethical and political underpinnings of the extropian worldview. Part 2 of 2. (Part 1 is here)
Dr. J. chats with neuroscientist William Church about his exploration of the relationship of religion and science, and his hope that the two can eventually be mutually enriching instead of antagonistic. Part 2 of 2. (Part 1 is here)
WBUR’s On Point talked with big thinker Douglas Rushkoff about his “ten commands” for living right in the digital age.
The digital world around us - Facebook, Google, and all the rest - has grown so big, so fast, that people come to think of it as a given, like gravity or the speed of light. Of course, it’s not. The digital world is thoroughly engineered, by human hands, and for human ends, like making money.
Big media critic and theorist Douglas Rushkoff wants to be sure we don’t forget that. Otherwise, he warns, as lives migrate to the digital realm, we run the risk of being slaves, not masters, of its power.
And the thing that gets programmed may be us.
Here are Rushkoff’s “10 commands,” as summarized by SXTXState.com:
1. Time. Thou shalt not be always on. We are treating an asynchronous net as if it were always on. He encouraged saying “My time is mine.”
2. Distance. Thou shalt not do from a distance what can be done in person. We are using long-distance tools in short-distance situations. Don’t use distance learning in a localized context.
3. Scale - the Internet is biased to scale up. Exalt the particular. Not everything scales, should scale, or needs to scale.
4. Discrete - everything is a choice. You may always choose none of the above. Sites like Facebook promote forced choice: you have to choose from a set of options.
5. Complexity - the net reduces complexity. Thou shalt never be completely right.
6. Non-corporeal - out of body. Thou shalt not be anonymous. Rushkoff says to “work against the tendency of the net to promote anonymity.” Anonymity encourages becoming part of polarized mobs with no sense of consequence; it sidesteps prejudices. It is liberating to promote yourself online.
7. Contact is king (not content). Remember the humans. “Social marketing is an oxymoron.”
8. Abstraction - as above, so not below. Print abstracts text from the scribe. Hypertext takes it a step further.
9. Openness. Thou shalt not steal. When there is no social contract, openness can continue until there is no one left to give things away. Nothing is free.
10. End users - technology is biased toward consumers. Program or be programmed.