On Monday, January 4, 2016, the History Channel aired a 2-hour documentary called “The Seven New Signs of the Apocalypse” at 9pm ET.
This is the introductory editorial to the Futures special issue, co-written with Bruce E. Tonn. Humanity faces a range of threats to its viability as a civilization and its very survival. These catastrophic threats include natural disasters such as supervolcano eruptions and large asteroid collisions, as well as disasters caused by human activity such as nuclear war and global warming. The threats are diverse, but their potential result is the same: the collapse of global human civilization or even human extinction.
IEET co-founder Nick Bostrom, IEET Fellow Wendell Wallach, and Affiliate Scholar Seth Baum are Principal Investigators on projects funded by Elon Musk and the Open Philanthropy Project and administered by the Future of Life Institute.
Reducing the risk of major, permanent global catastrophe is arguably the most important priority for humanity today. The reason is simple: such a catastrophe threatens countless members of future generations. Indeed, it is the difference between success and failure for human civilization. If humanity succeeds in avoiding catastrophe, it can go on to achieve amazing things across the universe. If humanity fails, everyone could die. Clearly, reducing the risk of such global catastrophe is a worthy goal. But, in practical terms, what are the be...
Back in 2012, I was invited to spend a few weeks visiting the Research Institute for Humanity and Nature (RIHN), a federally funded Japanese research institute based in the beautiful city of Kyoto. The invitation came from my colleague Itsuki Handoh of RIHN. During my visit, Handoh and I came up with an idea for how to fuse two important lines of research on major global threats.
It could be difficult for human civilization to survive a global catastrophe like rapid climate change, nuclear war, or a pandemic disease outbreak. But imagine if two catastrophes strike at the same time. The damage could be even worse. Unfortunately, most research only looks at one catastrophe at a time, so we have little understanding of how they interact.
A discussion of ethical and legal issues arising from bioengineered technologies that could benefit humanity or pose the risk of global catastrophe.
The Syrian civil war has already caused over 100,000 deaths. As tragic as this is, it is minuscule compared to the massive and potentially permanent global destruction that could come from the gigaton gorilla lurking in the background: nuclear war between the United States and Russia. While the U.S. and Russia find themselves on opposite sides in Syria, their diplomacy over Syria's chemical weapons could help build the trust and confidence needed to reduce the risk of nuclear war.
GCRI has a new academic paper out. Deepwater Horizon and the Law of the Sea: Was the cure worse than the disease?, by Grant Wilson, has been accepted for publication in a law journal to be determined.
Emerging technologies like bioengineering, nanotechnology, artificial intelligence, and geoengineering have great promise for humanity, but they also come with great peril. They could revolutionize everything from pollution control to human health—imagine a bioengineered microbe that converts CO2 into automobile-worthy liquid fuels, or nanotechnologies that target cancer cells. But they also have the potential to cause a global catastrophe in which millions or even billions of people die.