21st Century Threats
Mike Treder
2009-11-10 00:00:00

For example, an asteroid strike could range anywhere from a serious problem to the total extinction of humanity, depending on the size of the impactor. But the probability of such an event occurring within, say, the next 90 years is quite low (and the larger the asteroid, the smaller the chance it will happen any time soon).

That doesn't mean, of course, that a major asteroid strike can't occur within this century -- or even within this year, for that matter. We should be aware of the potential danger and do whatever is reasonably possible to prepare for and possibly avert it.

So, an asteroid impact has to be among the events we should consider as possible threats to the survival of civilization in the 21st century. Its severity could be quite high, but its certainty can be classified as comparatively low.

We can place it within a matrix where one axis represents the level of impact, from mild to extreme, and the other axis estimates the probability of occurrence:

[Chart: severity of impact (mild to extreme) vs. probability of occurrence]

To put the danger of an asteroid strike into context, we need to compare it with other threats, such as nuclear war, a global pandemic, or a "gray goo" disaster.

On the next chart, I've added those three items, along with bio-terrorism, climate change, and an "unfriendly A.I." that decides to wipe us all out:

[Chart: the same matrix with gray goo, asteroid impact, unfriendly A.I., bio-terrorism, nuclear war, global pandemic, and climate change plotted]

Taking them roughly in ascending order of concern, from lesser to greater certainty, and from least to most severe, we have...

Gray Goo: This is a fear that was more prevalent a few years ago, until it was recognized that updated proposals for molecular manufacturing do not include the use of self-replicating "nanobots."

Asteroid Impact: As stated above, the potential damage from a large impactor, whether asteroid or comet, is severe. However, the likelihood of this occurring within the next century is, fortunately, quite low.

Unfriendly A.I.: Assuming that human-quality intelligence can be replicated on a machine (which seems probable, though not certain), and assuming that an artificial intelligence develops the ability to recursively improve its own programming (conceivable, but far from certain), and assuming that said A.I. becomes malevolent or at least indifferent to human survival (at this point just a speculative assertion), and assuming that the A.I. somehow can seize control of power networks and manufacturing facilities, etc. (a nice story plot, perhaps, if extremely far-fetched), then this scenario could represent a threat to the survival of civilization. Possible severity very high, certainty very low.


Now we move to the realm of things that represent a higher probability of danger...

Bio-terrorism: Although we can't say it definitely will happen, the odds do seem pretty fair that at some point in this century, terrorist actors will either manufacture or get their hands on enough toxic bio-substances to cause a significant disaster. On a scale of civilizational threat, however, the impacts would likely be fairly low (what IEET Senior Fellow Jamais Cascio calls a "regional catastrophe").

Nuclear war: Thus far, we've dodged the bullet. But massive numbers of nuclear warheads still exist in the arsenals of the United States and Russia, and lesser powers like North Korea, Pakistan, and Israel are in the game too, so we definitely can't rule out the possibility of nuclear war during the 21st century. If and when it does happen, the impact could be severe, though probably (hopefully) not existential.

Global Pandemic: A fairly high level of certainty as mass migrations increase, cities become more crowded, and climate change (see below) allows the spread of disease-carrying hosts into previously inhospitable areas. Chances are that outbreaks will be contained, with great effort, but expect millions to die.

Climate Change: The only item on this list that represents a high certainty of occurring. In fact, it's already happening. Still unknown is the eventual level of severity. That will depend partly on our actions (or lack thereof) during the next decade or two, partly on the effects of carbon cycle feedbacks that we know too little about, and partly on whatever mitigating or ameliorating impacts we can gain from emerging technologies. Odds are that the severity by the end of this century will be quite high, though probably not threatening to the survival of civilization.


The idea here is that we can and should set priorities on which potential dangers receive the most emphasis and the greatest funding. Those priorities should be based not just on the severity of a given threat, but also on its probability of occurring.
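
To make that prioritization concrete, here is a minimal sketch in Python of one way to score and rank the threats discussed above. The severity and probability values are purely illustrative assumptions of my own, not figures from this article, and multiplying the two is just one simple scoring rule among many.

```python
# Illustrative only: the severity/probability values below are assumed for
# demonstration and do not come from the article. Both are on a 0-1 scale,
# where severity 1.0 means an existential outcome and probability 1.0 means
# effectively certain within the century.
threats = {
    "Gray goo":        {"severity": 0.90, "probability": 0.01},
    "Asteroid impact": {"severity": 0.90, "probability": 0.02},
    "Unfriendly A.I.": {"severity": 0.95, "probability": 0.03},
    "Bio-terrorism":   {"severity": 0.20, "probability": 0.60},
    "Nuclear war":     {"severity": 0.70, "probability": 0.30},
    "Global pandemic": {"severity": 0.40, "probability": 0.70},
    "Climate change":  {"severity": 0.60, "probability": 0.95},
}

def priority(threat):
    # One simple scoring rule: expected impact = severity x probability.
    return threat["severity"] * threat["probability"]

# Rank the threats from highest to lowest priority score.
for name, t in sorted(threats.items(), key=lambda kv: priority(kv[1]), reverse=True):
    print(f"{name:16s} severity={t['severity']:.2f} "
          f"probability={t['probability']:.2f} priority={priority(t):.2f}")
```

Under a scoring rule like this, high-certainty threats of moderate severity (climate change, a global pandemic) tend to rank above low-probability extreme ones (gray goo, an asteroid impact), which is roughly the ordering suggested above.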