Human civilization could struggle to survive a single global catastrophe like rapid climate change, nuclear war, or a pandemic disease outbreak. But imagine two catastrophes striking at the same time: the damage could be far worse. Unfortunately, most research examines only one catastrophe at a time, so we have little understanding of how they interact.
A discussion of ethical and legal issues arising from bioengineered technologies that could benefit humanity or pose a risk of global catastrophe.
The Syrian civil war has already caused over 100,000 deaths. As tragic as this is, it is minuscule compared to the massive and potentially permanent global destruction that could come from the gigaton gorilla lurking in the background: nuclear war between the United States and Russia. While the U.S. and Russia find themselves on opposite sides in Syria, their diplomacy over Syria's chemical weapons could help build the trust and confidence needed to reduce the risk of nuclear war.
GCRI has a new academic paper out. Deepwater Horizon and the Law of the Sea: Was the cure worse than the disease?, by Grant Wilson, has been accepted for publication in a law journal to be determined.
Emerging technologies like bioengineering, nanotechnology, artificial intelligence, and geoengineering have great promise for humanity, but they also come with great peril. They could revolutionize everything from pollution control to human health—imagine a bioengineered microbe that converts CO2 into automobile-worthy liquid fuels, or nanotechnologies that target cancer cells.
They could also cause a global catastrophe in which millions or even billions of people die.
This past December I was at the 2012 Annual Meeting of the Society for Risk Analysis. Several sessions focused on emerging technologies governance. Each presentation nominally focused on a single technology, mostly synthetic biology or nanotechnology. But most of the ideas discussed applied equally well to any emerging technology.
This paper develops a mathematical modeling framework using fault trees and Poisson processes for analyzing the risks of inadvertent nuclear war from U.S. or Russian misinterpretation of false alarms in early warning systems, and for assessing the potential value of inadvertence risk reduction options. The model also uses publicly available information on early-warning systems, near-miss incidents, and other factors to estimate probabilities of a U.S.-Russia crisis, the rates of false alarms, and the probabilities that leaders will launch missiles in response to a false alarm. The paper discusses results, uncertainties, limitations, and policy implications.
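The core probabilistic logic of such a model can be illustrated with a thinned Poisson process: false alarms arrive at some rate, and only those that occur during a crisis and are acted upon lead to launch. The sketch below is a simplified illustration of that structure; the function name and all parameter values are placeholders, not the paper's actual estimates.

```python
import math

def p_inadvertent_war(false_alarm_rate, p_crisis, p_launch_given_alarm, years):
    """Probability of at least one inadvertent launch over a time horizon.

    Serious false alarms are modeled as a Poisson process with rate
    `false_alarm_rate` (alarms per year). An alarm leads to launch only
    if it occurs during a crisis (p_crisis) and leaders then respond by
    launching (p_launch_given_alarm), which thins the process to an
    effective rate. The probability of at least one such event in
    `years` is then 1 - exp(-effective_rate * years).
    """
    effective_rate = false_alarm_rate * p_crisis * p_launch_given_alarm
    return 1.0 - math.exp(-effective_rate * years)

# Illustrative numbers only, chosen for readability:
# 0.5 serious false alarms/year, 5% chance an alarm falls in a crisis,
# 10% chance of launch given a crisis-time alarm, over 40 years.
print(p_inadvertent_war(0.5, 0.05, 0.1, 40))
```

A fuller model of the kind the paper describes would replace the single launch probability with a fault tree of conditions (detection, misinterpretation, decision to retaliate) and propagate uncertainty over each parameter rather than using point values.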
Perceived failure to reduce greenhouse gas emissions has prompted interest in avoiding the harms of climate change via geoengineering, that is, the intentional manipulation of Earth system processes. Perhaps the most promising geoengineering technique is stratospheric aerosol injection (SAI), which reflects incoming solar radiation, thereby lowering surface temperatures.
Mankind has really popped the planet in the jaw the last few centuries: six million hectares are lost to deforestation every year; the ocean is increasingly acidic and devoid of fish; the planet’s sixth mass extinction seems to be underway; and human-caused climate change is already raising sea levels, aggravating droughts, and increasing the frequency and intensity of extreme weather events like Hurricane Sandy.