Watching the World through a Broken Lens
Jamais Cascio   Mar 20, 2014  

It’s often frustrating, as a foresight professional, to listen to or read what passes for political discourse, especially during a big international crisis (such as the Russia-Ukraine-Crimea situation). Much of the ongoing discussion offers detailed predictions of what one state or another will do and clear assertions of inevitable outcomes, all delivered with an overwhelming certainty of anticipatory analysis.

Of course, these various prognostications will almost always be wrong; worse, they'll typically be wrong in a useless way, having obscured or confused our understanding of the world more than they've illuminated it.

It's not just a peculiarity of Central European crises. We can see a similar process play out in nearly every global-scale system with consequences beyond the immediate, whether economic, military, or political. Detailed claims about imminent inflation, or about the arrival of an Iranian nuclear weapon by the end of the year, get treated as gospel up to the moment the assertion is shown to be wrong; the failed statement then drops down the memory hole, replaced by a new threat of imminent inflation or an Iranian nuclear weapon by the end of the new year. Those who inflict this Potemkin futurism on us -- predictions without substance portrayed as careful analysis of future outcomes -- never suffer the consequences of being wrong. Anyone offering more subtle or complex analysis is treated at best as having just another opinion, or is actively ignored if what they say runs counter to the conventional wisdom.

This prediction-error-prediction cycle isn't just a feature of television or Internet punditry. As I've mentioned before, I did my graduate work in political science, and ultimately erroneous predictions dripping with certainty are commonly found in that realm as well. Unlike most other social sciences, political science has to balance both analysis of past and present conditions and grounded forecasts of the implications of those conditions. When there's a revolution in Country X, you'll rarely see an anthropologist or social psychologist quoted in mainstream discussions of What This Means; conversely, you're almost guaranteed to get a juicy quote or two from an academic in the Department of Government and Conventional Wisdom at Ivy-Covered University.

This is not a dilemma without a solution, however. Professional foresight (aka futurism) also went through a period where specialists would offer up a single prediction of a certain future. In more recent decades -- arguably since Herman Kahn's On Thermonuclear War in 1960, but more generally since the advent of Shell-derived scenario planning in the 1990s -- futurism has been more comfortable with uncertainty, and more willing to offer multiple rival forecasts of possible outcomes instead of singular, certain predictions. Multi-scenario foresight has evolved through various iterations since then, but they all come down to a core idea: you can't predict the future, but you can see the shapes of different possible futures.

So what would this model look like if employed by political pundits and political science academics? To be honest, it would probably be confusing, and it would make for bad television. We as a civilization have a bias towards spectacle and a preference for detail over generality; a talking head saying "this could happen, or that, or this other thing, they're all plausible outcomes" will be squished by someone with a loud voice and absolute certainty.

Certain but wrong usually beats complex and observant. Enjoy your future.


Jamais Cascio is a Senior Fellow of the IEET, and a professional futurist. He writes the popular blog Open the Future.


By far the best work on this topic is Philip Tetlock’s book:
“Expert Political Judgment: How Good Is It? How Can We Know?”
Tetlock shows just how bad pundits’ predictions are: you might as well ask your cab driver, and your cab driver is more likely to get things right.

A big distinction Tetlock draws in terms of predictive accuracy is between “foxes and hedgehogs”: “the fox knows many things, but the hedgehog knows one big thing”. A hedgehog is someone with one big idea who therefore tries to compress complex reality to fit that idea, while a fox is not ideological in this sense. Foxes get predictions right much more often than hedgehogs.
