Our Final Hour
John G. Messerly
2015-02-09

He gives numerous examples of forecasts that didn’t come true and of technologies that were never forecast: x-rays, nuclear energy, antibiotics, jet aircraft, computers, transistors, the internet, and more. Yet,

… we cannot set limits on what science can achieve, so we should leave our minds open, or at least ajar, to concepts that now seem on the wilder shores of speculative thought. Superhuman robots are widely predicted for mid-century. Even more astonishing advances could eventually stem from fundamentally new concepts in basic science that haven’t yet even been envisioned and which we as yet have no vocabulary to describe. (16)

Rees argues that computing power will not level off and “nanotechnology could extend Moore’s law for up to thirty years further; by that time, computers would match the processing power of a human brain.” (17) He quotes both Kurzweil and Moravec and takes their predictions seriously.
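The quoted claim is, in effect, an exponential extrapolation. The back-of-envelope sketch below illustrates that arithmetic; the two-year doubling period, the present-day baseline, and the brain estimate are illustrative assumptions on my part, not figures from Rees.

```python
# Back-of-envelope extrapolation of the Moore's-law claim quoted above.
# Assumptions (not from Rees's text): processing power doubles roughly every
# 2 years, a present-day machine delivers ~10^12 operations per second, and
# commonly cited estimates put the brain at ~10^16 operations per second.
doubling_period_years = 2
current_ops_per_sec = 1e12   # assumed present-day baseline
brain_ops_per_sec = 1e16     # assumed brain estimate (Moravec/Kurzweil-style)

years = 30
doublings = years / doubling_period_years          # 15 doublings
future_ops = current_ops_per_sec * 2 ** doublings  # ~3.3e16 ops/sec

print(f"{doublings:.0f} doublings -> ~{future_ops:.1e} ops/sec "
      f"(brain estimate ~{brain_ops_per_sec:.0e})")
```

Under these assumptions, thirty years of doubling multiplies today’s throughput by roughly 30,000, which is the kind of growth the quoted prediction relies on.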

Rees accepts as reasonable the speculative claims concerning the malleability of our physical and psychic selves. He also acknowledges that immortality may be possible. He discusses reverse-engineering a brain in order to download its contents into a machine, saying: “If present trends continue unimpeded, then … some people now living could attain immortality—in the sense of having a lifespan that is not constrained by their present bodies.” (18-19)

Rees also believes that superintelligent machines might destroy us: “Once machines have surpassed human intelligence, they could themselves design and assemble a new generation of even more intelligent ones. This could then repeat itself, with technology running towards a cusp, or ‘singularity’.” Still, Rees admits this is all speculative.

I see Rees as forging a middle path. He recognizes the potential of scientific knowledge to transform reality, but he cautions that some predictions are fanciful. Many forecasts will turn out to be mistaken, and many things we don’t forecast will happen. Moreover, social, religious, political, ethical, economic, and other considerations impede the swift development of new technologies.

Rees also carefully considers extinction scenarios: “Throughout most of human history, the worst disasters have been inflicted by environmental forces—floods, earthquakes, volcanoes, and hurricanes—and by pestilence. But the greatest catastrophes of the 20th century were directly induced by human agency…” (25) He estimates that nearly two hundred million persons were killed by war, massacre, persecution, famine, etc. in the 20th century alone.

The primary extinction scenarios include: global nuclear war; nuclear mega-terror; bio-threats (the use of chemical and biological weapons); laboratory errors (accidentally creating a new, virulent smallpox virus, for example); “grey goo” (nanobots out of control that consume all organic matter); particle physics experiments gone awry; and human-induced environmental or climate change. In addition, there are asteroid impacts, super-eruptions that block the sun, and more.

Martin Rees is one of the world’s most important living scientists. His worries about the extinction of the species should be carefully considered.