Why Obstinate Humans Find It Hard To Believe Science
David Brin
2011-04-24

Not even those of us who are scientifically trained do objective science consistently well. Like all other humans, we come equipped with biased, emotionally prejudiced minds, predisposed to see first what we want or expect to see -- a dilemma Plato illustrated long ago in his Allegory of the Cave.

In one of the few things Plato got right, he showed how each of us allows our subjective will to overlay and mask anything inconvenient about the objective world...

Now Chris Mooney, author of The Republican War on Science, explains how this age-old human flaw is being analyzed in scientific detail, by researchers who reveal it to be dismayingly intractable. It seems that obstinacy is as deeply rooted as love or sex!

From Mooney's new article, "The Science of Why We Don't Believe Science":

Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it.

That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.


Of course, there's hope, or we would never have climbed so far. In the last few centuries, we discovered a general way around this dilemma: the Enlightenment process that underlies almost everything successful about our civilization -- not only science but also free markets, justice and democracy. It is the one tool that has ever allowed humans to penetrate the veil of their own talented delusions.
It is called Reciprocal Accountability: criticism, the only known antidote to error.

We may not be able to spot our own mistakes and delusions, but others will gladly point them out for us! Moreover, your FOES will happily do you that favor! (How nice of them.) And, in return, you will eagerly do the same for them.

In our Enlightenment -- and especially in science -- this process is tuned to maximize truth-output and minimize blood-on-the-floor. But it requires some maturity. Some willingness to let the process play out. Willingness to negotiate. Calmness and even humor.

It doesn't work amid rage or "culture war." Which is precisely why culture war is being pushed on us. By those who want the Enlightenment to fail.

And that brings us back to Mooney's cogent and detailed article, which explains the problem of "narrowcasting" to specifically biased audience groups, who get to wallow in endless reinforcement of their pre-existing views, avoiding the discomfort of cognitive dissonance from things like evidence ...

... a problem -- exacerbated by the Internet age -- that I predicted in my 1989 novel Earth, describing a near future in which people shift their attention only to those sources that confirm and reinforce their pre-existing beliefs. (A forecast I would rather not have seen come true.)