Fans of Game of Thrones were treated to a big piece of news last week. As audiences know, the fan-favorite character Jon Snow was left to die at the hands of his Night's Watch brothers at the end of the previous season. Yesterday, a poster was revealed showing a bloodied image of the character.
I would argue that, as far as imagining the future is concerned, many of us in the West at least have had our vision blurred by what amounts to a 2,000-year philosophical hangover called Christianity. But no one ever seems to care about this point. The most common response I've gotten from a certain sect of singularitarians and transhumanists, upon pointing out that both their goals and predictions seem to have been ripped from a man-on-the-street's version of Christianity, has been: who cares?
Singularity University is expanding through the SingularityU Global program. The launch of SingularityU Milan, the first Italian chapter, is part of this program. It allows orders of magnitude more people to directly participate in its events and leverage the power of exponential technologies.
They used to send a legal ultimatum before it happened. Now you just wake up one day and everything green is dead, because the plants are biotech and counter-hacking is a legal response to intellectual property theft, even if the genes in question are older than the country that granted the patent.
Our lives are shaped by smarter and smarter machines, helping us in decisions big and small. Corporations are busy developing products and services based on artificial intelligence. The moral implications of this symbiotic relationship must be deeply explored, and a wide and inclusive public conversation is needed to address the issues these technologies raise.
Why is it that the bacon you are about to bite into is an acceptable source of food for you, but possibly not so for the person sitting next to you? Perhaps he or she eats according to a religious code, or has a health-related reason for skipping the meat products. Maybe he or she is a proponent of animal welfare and has decided to only eat meat products that are slaughtered “transparently and humanely”; or, it could be that he or she has decided not to eat an animal that is conscious on any level.
Transhumanists, as good humanists, believe that the human being is perfectible, and this holds for our physical as well as our moral characteristics. The difference lies chiefly in this: to the effects of philosophy, education, culture, and law (that is, of political consensus), they believe we are now in a position to add technology as a contribution to this continuous improvement (and not to substitute it for them, as many hurried commentators like to write). And yet, despite centuries of legislation, culture, education, and philosophy, progress in what the Enlightenment philosophers called Virtue seems to run up against what remains, to this day, the biological condition of the human being.
Andreas Antonopoulos’s articulation of network-enforced trust primitives (Oct 2015, Feb 2014) could be extended more broadly into the concept of Machine Trust Language (MTL). While blockchains are being popularly conceived as trust machines, and as a new mode of creating societal shared trust, Andreas addresses how at the compositional level, this trust is being generated. The key idea is thinking in terms of a language of trust, of its primitives, its quanta, its elemental pieces, its phonemes, words, and grammar that can be assembled into a computational trust system.
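The grammar metaphor can be made concrete with a toy sketch (my illustration, not Antonopoulos's formulation): trust primitives as predicates over a context, combined by a small grammar of AND, OR, and k-of-n, in the spirit of how Bitcoin-style spending conditions compose. The primitive and policy names below are hypothetical.

```python
# Toy sketch of a "trust language": primitives (predicates) assembled
# by a small grammar into a composite trust policy.
from typing import Callable

# A trust primitive is a predicate over some context (e.g. a transaction).
Predicate = Callable[[dict], bool]

def has_signature(key: str) -> Predicate:
    """Primitive: the context must carry a signature from `key`."""
    return lambda ctx: key in ctx.get("signatures", set())

def after_time(t: int) -> Predicate:
    """Primitive: the context's timestamp must be at or past `t` (a timelock)."""
    return lambda ctx: ctx.get("time", 0) >= t

def all_of(*ps: Predicate) -> Predicate:
    """Grammar: conjunction — every condition must hold."""
    return lambda ctx: all(p(ctx) for p in ps)

def any_of(*ps: Predicate) -> Predicate:
    """Grammar: disjunction — at least one condition must hold."""
    return lambda ctx: any(p(ctx) for p in ps)

def threshold(k: int, *ps: Predicate) -> Predicate:
    """Grammar: at least k of the listed conditions must hold (k-of-n)."""
    return lambda ctx: sum(p(ctx) for p in ps) >= k

# A "sentence" in the language: 2-of-3 signers, OR the sole owner after a deadline.
policy = any_of(
    threshold(2, has_signature("alice"), has_signature("bob"), has_signature("carol")),
    all_of(has_signature("alice"), after_time(1_700_000_000)),
)

print(policy({"signatures": {"alice", "bob"}, "time": 0}))  # True: 2-of-3 met
print(policy({"signatures": {"alice"}, "time": 0}))         # False: neither branch holds
```

The point of the sketch is compositionality: once the primitives and combinators are fixed, arbitrarily elaborate trust policies can be assembled mechanically, which is what lets a network rather than an institution enforce them.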
The West African Ebola outbreak is finally starting to approach manageable levels, after nearly 18 excruciating months and over 11,000 lost lives. Here’s what the current situation on the ground looks like and how the battle against Ebola finally might be won.
This is the largest and longest Ebola outbreak in human history. At its peak, there were 950 confirmed cases each week, prompting fears of a global pandemic. Officials have reported 28,421 confirmed, probable, and suspected cases in Guinea, Liberia, and Sierra Leone. Of these, some 11,300 people have died — a fatality rate of 40%. A total of 881 healthcare workers have been infected; of those, 513 died.
Okay, when do you ever see some (rational) person take one of Donald Trump’s wild, paranoid rants and declare “he didn’t go anywhere near far enough”?
Well, I am about to do that. He has lately taken flak for being the first prominent figure to (at long last) connect the dots and publicly lay at least partial blame for the 9/11 attacks at the feet of President George W. Bush, the man who was not only captain at the helm but proximately responsible under any adult standard.
I suppose nearly everyone reading this blog post is already aware of the flurry of fear and excitement Oxford philosopher Nick Bostrom has recently stirred up with his book Superintelligence, and its theme that superintelligent AGI will quite possibly doom all humans and all human values. Bostrom and his colleagues at FHI and MIRI/SIAI have been promoting this view for a while, and my general perspective on their attitudes and arguments is also pretty well known.
Most boundaries have their origin in our fears, imposed in a vain attempt to isolate what frightens us on the other side. The last two centuries have been an era of eroding boundaries, the gradual disappearance of what were once thought to be unassailable walls between ourselves and the "other". It is a story of liberation, the flip side of which has been a steady accumulation of anxiety and dread.
(Transcript of the speech presented at Lincoln Center, New York, at the conference Global Future 2045: Towards a New Strategy for Human Evolution.)
I am going to discuss whole brain emulation: what it takes to reverse engineer a mind. This is a topic you've heard mentioned a few times, at least during the conference, and several of the speakers you saw today - and more that are coming up - have talked, or are going to talk, about technologies that address a specific part of it. But I want to show how all of this comes together: how could you reverse engineer a mind? And how do you actually determine the goals for something like that?
If you’ve watched any James Bond movie with an underwater scene, you’ve likely seen 007 menaced by some form of the villains’ sinister undersea robots. In 2015, thanks to the efforts of author, biomimetics researcher, and neurophysiology professor Joseph Ayers, undersea robots are a reality, and the future applications of his RoboLobster are far from evil.
The recent news that womb transplants will be trialled in the UK has sparked much debate regarding the desirability of this and other future infertility interventions. Perhaps unsurprisingly, the idea of artificial wombs has been brought into this discussion, complete with the usual concerns about women’s reproductive liberty.
What will the future look like? The further upwards one moves from the basement domain of physics, the harder it often gets to predict long-term trends. Nonetheless, we have some fairly good clues about what to expect moving forward.
The more means by which people can act, the easier attack becomes and the harder defense becomes.

It’s a simple matter of complexity. The attacker needs to choose only one line of attack; the defender needs to secure against all of them. This isn’t just true of small thermal exhaust ports, it’s true of our software ecosystems today and of any other system with many dimensions of movement.

Complexity, meaning more degrees of freedom within a system, allows for a greater attack surface: attacks can come not just from all points of the compass but from above and below as well.
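The asymmetry can be put in rough numbers with a toy model (my illustration, not from the original): if each degree of freedom is an independent attack vector that the defender secures with probability q, the attacker's chance of finding at least one open vector is 1 − q^N, which climbs quickly as the number of vectors N grows.

```python
# Toy model of the attacker/defender asymmetry: the defender must hold
# every vector; the attacker needs just one to be open.

def breach_probability(n_vectors: int, q_defended: float) -> float:
    """P(at least one open vector) = 1 - q^N, assuming independent vectors."""
    return 1.0 - q_defended ** n_vectors

# Even a 99%-effective defense erodes as complexity (N) grows:
for n in (1, 10, 100, 1000):
    print(n, round(breach_probability(n, 0.99), 3))
```

Under these (admittedly crude) independence assumptions, a defense that closes 99% of vectors still yields better-than-even odds to the attacker once the system exposes a few hundred of them.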
At the IEET and Brighter Brains conference this past weekend in Oakland, CA, I had the pleasure of meeting an older man who had thought a lot about the future—and he was very afraid. Science, he said, was going to destroy us. And worse, when robots are better than us, what is the purpose of the human being?
“Suddenly he heard a groan—his teeth chattered,” writes Washington Irving about the frightened teacher Ichabod Crane in The Legend of Sleepy Hollow. Crane finds himself in headless-horseman territory, tentatively riding his stubborn horse along a dark road. In what many regard as the first American horror story, Irving (2006) describes Crane’s charged emotional state:
A much better metaphor for gene therapy is space-alien hackers attacking a huge factory covering San Francisco. These hackers shoot canisters of paper-tape instructions for the old computer-controlled machinery into this billion-year-old factory.
The IEET is a 501(c)3 non-profit, tax-exempt organization registered in the State of Connecticut in the United States.
East Coast Contact: Executive Director, Dr. James J. Hughes,
56 Daleville School Rd., Willington CT 06279 USA
Email: director @ ieet.org phone:
West Coast Contact: Managing Director, Hank Pellissier
425 Moraga Avenue, Piedmont, CA 94611
Email: hank @ ieet.org