“The world will someday end with fire or ice, but we await clarification as to the proximate causes. The menu of looming catastrophes is a long one, growing with our advancing knowledge of the universe and powers of self-immolation.”
Craig Lerner, associate dean for academic affairs and professor of law at the George Mason University School of Law, reviews the recent anthology Global Catastrophic Risks, which includes chapters by several IEET fellows and associates.
Global Catastrophic Risks, a collection of two dozen learned and generally balanced essays, canvasses this dismal scene and dishes up warnings and advice. This is a book in which shriveling retirement accounts and the looming bankruptcy of the automobile industry do not register. The editors, Nick Bostrom and Milan Cirkovic, direct our gaze at “existential risks,” dangers so grave that, should they happen once, “there would be no opportunity to learn from the experience.” These are catastrophes that threaten humanity, intelligent life of any kind, and possibly all life on earth.
Or worse. Imagine, if you will, “permanent and extreme forms of slavery or mind control” at the hands of a genetically enhanced Stalin or supercharged computer. Extinction would be a blessing.
As a species, we face threats from the cosmos and from ourselves. Geological records suggest five ruptures over the past half-billion years, when most of the then extant species died out . . .
Then there are the new threats, of man’s own making. Two chapters consider nuclear war and terrorism, but others sketch far more imaginative and comprehensive catastrophes. The nascent fields of genetic engineering, nanotechnology, and artificial intelligence may give rise to weapons more powerful by multiples than anything we can conceive today. Furthermore, building an atomic bomb is complicated; this may be less true for newer technologies . . .
Monitoring every nation-state determined to build an atomic bomb is hard enough; monitoring every rogue scientist and angst-ridden teenager bent on Armageddon may prove impossible. As Chris Phoenix and Mike Treder note in their chapter on nanotechnology, “The likelihood of at least one powerful actor being insane is not small.”
Unfortunately, Lerner lets his skepticism of global warming science (and his libertarian politics) dominate much of the review, crowding out the other threats the book covers. Moreover, he closes by tacking on a religious message implying that God is in charge, so don't worry, be happy. I'm not reassured.
We can hope, though, that at least a few readers of his review will be motivated to find the book, read it for themselves, and form their own opinions.
Mike Treder is a former Managing Director of the IEET.