Civilization's Demise
Mike Treder
2008-04-12

From the cover story of New Scientist magazine's April 5 issue:

Doomsday scenarios typically feature a knockout blow: a massive asteroid, all-out nuclear war or a catastrophic pandemic. Yet there is another chilling possibility: what if the very nature of civilisation means that ours, like all the others, is destined to collapse sooner or later?

A few researchers have been making such claims for years. Disturbingly, recent insights from fields such as complexity theory suggest that they are right. It appears that once a society develops beyond a certain level of complexity it becomes increasingly fragile. Eventually, it reaches a point at which even a relatively minor disturbance can bring everything crashing down.

Some say we have already reached this point, and that it is time to start thinking about how we might manage collapse. Others insist it is not yet too late, and that we can -- we must -- act now to keep disaster at bay.




The thrust of the article (which you can read in full here) is that instead of the usual suspects of destruction, the hidden culprit that could bring about the end of our modern Western civilization is complexity itself.

On his fine blog (which he describes as "seeing that the best of humanity survives long enough to reach the next level"), Al Fin offers a useful review of and response to the New Scientist article. Fin says:

Complexity of systems is a fairly young field of study. Most of the people quoted in the article do not actually know what they are talking about, not really. They are speaking in abstract terms about theoretical problems that may or may not occur. Academics and think tank scholars -- like journalists and politicians -- are of limited use in dealing with real world problems. If they were competent and effective in the real world, most of them would probably be doing something else. Still, a wise person looks for ideas in a wide array of places.

It is fairly obvious that our complex electronic global financial systems are vulnerable to disruption. In fact, in the next large scale war, one of the first casualties of large nations will be their financial and communications networks. Satellites will be lost, landlines and seafloor cables will be cut, and computer centers will be sabotaged. Yes, there are backups and redundant systems and databases. But access will be a problem for large numbers. For monkey-brained humans, large scale panic sets in fairly easily.

An abrupt cutoff of fuel for ground, rail, and air transportation could likewise lead to large scale hardship and civil disturbances. A loss of electrical power would leave tens or hundreds of millions in complete turmoil. Most cities have only a few days worth of food inside their borders. Rather than confronting these important vulnerabilities, most leaders tend to obscure them and steer public discussion away to other topics.


Fin's description of the next big war incorporates many of the components of 4th generation -- or even 5th generation -- warfare. That entails using sophisticated methods of sabotage to invade and disable key infrastructure, probably taking advantage of new capabilities offered by emerging technologies, including, most ominously, molecular manufacturing. Fin writes:

Modern laboratory tools for biology and chemistry provide many ways to turn an open, trusting society into a shut-down, quivering, fearful society. Soon nanotechnological tools will make even more devastatingly deadly -- and invisible -- weapons readily available. It is easy to send modern western media-centered societies into paralytic states of fear.


Yes, he's talking about the very same risks that CRN has warned about here and in many other places.

But could the same technology that carries the seeds of our destruction also provide remedies that might save our future?

Joseph Tainter, an archaeologist and author of the 1988 book The Collapse of Complex Societies, doesn't think so. From the New Scientist article:

Tainter is not convinced that even new technology will save civilisation in the long run. "I sometimes think of this as a 'faith-based' approach to the future," he says. Even a society reinvigorated by cheap new energy sources will eventually face the problem of diminishing returns once more. Innovation itself might be subject to diminishing returns, or perhaps absolute limits.

Studies of the way cities grow by Luis Bettencourt of the Los Alamos National Laboratory, New Mexico, support this idea. His team's work suggests that an ever-faster rate of innovation is required to keep cities growing and prevent stagnation or collapse, and in the long run this cannot be sustainable.


And the results of that collapse would not be pretty:

The stakes are high. Historically, collapse always led to a fall in population. "Today's population levels depend on fossil fuels and industrial agriculture," says Tainter. "Take those away and there would be a reduction in the Earth's population that is too gruesome to think about."


But Al Fin begs to differ. He proposes upgrading or enhancing human intelligence with the aid of rapid technological innovation:

The greatest threat to our civilisation is also the greatest promise -- the possibility that something better will be devised. That better civilisation will have its vulnerabilities, yes. But the participants in the next level should have some important upgrades to their monkey-brains which will allow them to consider more contingencies, and devise better ways to deal with them.

Even our best thinkers can be quite slipshod at times. We have computers to help us with that, but some problems of pompous overreach cannot be compensated or corrected by computers. We can use human-level AI, certainly. But we need to become smarter, more than our machines. If we put our fate trustingly into the hands of machines that we can never understand, have we not just traded up to an even more dangerous vulnerability?


That sounds like an exceedingly narrow ridge to traverse: collapse of civilization on one side, through overuse or misuse of technology, and replacement of our current civilization on the other, through surrender to the machines.


Staying on the knife-edge between those two slippery slopes and coming out at the end with enhanced humans somehow maintaining and extending our historic societal structures seems improbable to me.

More likely, in my view, is that our civilization will end, as has every one before it, and that its replacement will be a new civilization built upon the old. But the entities creating and inhabiting that new order will not be humans as we know them. Instead, the majority will be posthumans of some form or other, perhaps human-machine hybrids, bio-engineered chimeras, redundant virtual superintelligences, or something else we can't even imagine.

Or, we may simply be doomed to a repeated cycle of birth, expansion, and then death, with one civilization coming after another, but none surviving long enough to escape our puny solar system's envelope of mortality.