A Test to Measure How Rational You Really Are
George Dvorsky
2013-07-05 00:00:00



Keith E. Stanovich is Professor of Human Development and Applied Psychology at the University of Toronto. The author of over 200 scientific articles and seven books, he, along with Richard West, was recently given a grant by the John Templeton Foundation to create the first comprehensive assessment of rational thinking — a test that will ultimately determine a person's 'rationality quotient'.



Indeed, the value of rationality and “good thinking” tends to be overshadowed by the importance we place on intelligence. But as we learned from Stanovich, the two often have very little to do with each other.



Don't standard IQ tests already measure rationality? And doesn't intelligence correlate with rationality?



IQ [tests] do not at all directly assess processes of rational thinking, as they are defined by cognitive scientists.



This is why a separate test is needed to assess RQ. My colleague Richard West and I were led to this work many years ago through our longstanding interest in the heuristics and biases research program inaugurated by Daniel Kahneman and Amos Tversky several decades ago.



This all got started back in 2002 when Kahneman won the Nobel Prize in Economics (Tversky died in 1996). The press release for the award from the Royal Swedish Academy of Sciences drew attention to the roots of the award-winning work in “the analysis of human judgment and decision-making by cognitive psychologists.”



Kahneman was lauded for discovering “how human judgment may take heuristic shortcuts that systematically depart from basic principles of probability. His work has inspired a new generation of researchers in economics and finance to enrich economic theory using insights from cognitive psychology into intrinsic human motivation.”



One reason that the Kahneman and Tversky work was so influential was that it addressed deep issues concerning human rationality. Their work, along with that of many others, has shown how the basic architecture of human cognition makes all of us prone to such errors of judgment and decision making.



But being prone to these errors does not mean that we always make them. Every person, on some occasions, overrides the tendency to make these reasoning errors and instead makes the rational response. Even more importantly, our research group has shown that there are systematic differences among individuals in the tendency to make errors of judgment and decision making.



And the fact that there are systematic individual differences in the judgment and decision making situations studied by Kahneman and Tversky means that there are variations in important attributes of human cognition related to rationality — how efficient we are in achieving our goals.



This fact is curious because most laypeople are prone to think that IQ tests are tests of, to put it colloquially, good thinking. Scientists and laypeople alike would tend to agree that “good thinking” encompasses good judgment and decision making — the type of thinking that helps us achieve our goals. In fact, the type of “good thinking” that Kahneman and Tversky studied was deemed so important that research on it was awarded the Nobel Prize. Yet assessments of such good thinking are nowhere to be found on IQ tests.



Are you interested in how cognitive biases affect rationality? Which ones should we be most aware of?



Absolutely. Cognitive biases are an essential part of the modern definition of rationality in cognitive science.



To think rationally means taking the appropriate action given one’s goals and beliefs — what we call instrumental rationality — and holding beliefs that are in synch with available evidence, or epistemic rationality. Collectively, the many tasks of the heuristics and biases program — and the even wider literature in decision science — comprise the operational definition of rationality in modern cognitive science (see my book Rationality and the Reflective Mind, 2011).



Let me give you some examples of instrumental rationality and irrationality:





Likewise, they have studied aspects of epistemic rationality and irrationality, such as:





You just started a 3-year project to create the first comprehensive assessment of rational thinking. Why do we need such a thing?



All of the biases and processes listed above will be on our prototype measure of rational thinking, which will be the outcome of our grant from the John Templeton Foundation. It is necessary to assess them directly on such a test because none of them are directly assessed on IQ tests.



However, there is an important caveat here. Although the tests fail to assess rational thinking directly, it could be argued that the processes that are tapped by IQ tests largely overlap with variation in rational thinking ability.



Perhaps intelligence is highly associated with rationality even though tasks tapping the latter are not assessed directly on the tests. Here is where empirical research comes in — some of which has been generated by our own research group. We have found that many rational thinking tasks show surprising degrees of dissociation from cognitive ability in university samples. Many classic effects from the heuristics and biases literature — base-rate neglect, framing effects, conjunction effects, anchoring biases, and outcome bias — are only modestly related to intelligence if run in between-subjects designs.
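To make the first effect on that list concrete, here is a minimal sketch of base-rate neglect worked through with Bayes' rule. The scenario and numbers are purely illustrative assumptions, not items from Stanovich and West's materials.

```python
# Illustrative base-rate example with hypothetical numbers:
# a condition affects 1% of people; a test detects 90% of true cases
# and gives false positives 9% of the time. What is P(condition | positive)?
base_rate = 0.01       # P(condition)
sensitivity = 0.90     # P(positive | condition)
false_positive = 0.09  # P(positive | no condition)

p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.2f}")
# Prints roughly 0.09 -- far below the 0.90 that people who ignore the
# 1% base rate tend to report.
```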



Most rational thinking tasks correlate to some degree with intelligence, but the correlation is almost always moderate enough (.60 or so at the very highest) to still create many cases where intelligence and rationality are out of synch. Hence my coining of the term dysrationalia in the early 1990s.
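As a rough illustration of how much mismatch a correlation of about .60 still allows, here is a small simulation sketch. It is illustrative only and not part of Stanovich and West's test development; it simply draws two standardized traits correlated at r = .60 and checks how often a high scorer on one falls below the median on the other.

```python
# Illustrative only: simulate two standardized traits with a true
# correlation of 0.60 and count how often they come apart.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
r = 0.60

# Draw correlated standard-normal scores for "intelligence" and "rationality".
cov = [[1.0, r], [r, 1.0]]
iq, rq = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

top_quartile_iq = iq > np.quantile(iq, 0.75)
below_median_rq = rq < np.median(rq)

# Fraction of top-quartile "IQ" scorers who nonetheless fall below the
# median on "rationality".
discordant = below_median_rq[top_quartile_iq].mean()
print(f"Top-quartile IQ scorers below the RQ median: {discordant:.0%}")
```

In runs like this a substantial minority of top-quartile scorers on one trait still land below the median on the other, which is exactly the kind of mismatch the term dysrationalia was coined to describe.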



And what do you mean by "dysrationalia"?



I coined the term dysrationalia, an analogue of the word dyslexia, in the early 1990s in order to draw attention to what is missing in IQ tests. I define dysrationalia as the inability to think and behave rationally despite having adequate intelligence. Many people display a systematic inability to think or behave rationally despite the fact that they have more than adequate IQs.



One of the reasons that many of us are dysrationalic to some extent is that, for a variety of reasons, we have come to overvalue the kinds of thinking skills that IQ tests measure and undervalue other critically important cognitive skills, such as the ability to think rationally.



What are some examples of a person exhibiting low rationality? What are some unexpected or lesser known "risks" of not thinking completely rationally?



In my book What Intelligence Tests Miss (2009), I begin with a discussion of David Denby (New Yorker writer) and John Paulos (math professor) making disastrous personal investment decisions. I also discuss how many feel that George W. Bush was dysrationalic.



But in terms of specifics, here are some irrational thinking tendencies to consider:





Is rationality something that's innate? Or is there hope for people with low RQ?



Rationality is not entirely innate. It is as malleable as intelligence and possibly much more so. Roughly one half of rationality as we define it is not process but knowledge — knowledge that could be acquired by perhaps 70% or more of the population.



Here’s what I mean: The model of individual differences in rational thought that West and I have put together partitions rationality into fluid and crystallized components by analogy to the Gf and Gc of the Cattell/Horn/Carroll fluid-crystallized theory of intelligence.



Fluid rationality encompasses the process part of rational thought — the thinking dispositions of the reflective mind that lead to rational thought and action. Crystallized rationality encompasses all of the knowledge structures that relate to rational thought.



These knowledge structures are the tools of rationality (probabilistic thinking, logic, scientific reasoning) and they represent declarative knowledge that is often incompletely learned or not acquired at all. But they can be learned by most people, and in this sense rationality is teachable.



Rational thinking errors due to such knowledge gaps can occur in a potentially large set of coherent knowledge bases in the domains of probabilistic reasoning, causal reasoning, knowledge of risks, logic, practical numeracy, financial literacy, and scientific thinking (the importance of alternative hypotheses, etc.).



Photo of Kahneman: Andreas Rentz/Getty Images for Burda Media.

This article was originally published on io9.com