Even Mathematically Literate People Become Innumerate When They Focus on Political Issues

At Mother Jones, Kevin Drum and Chris Mooney have interesting posts discussing a new paper by Yale law professor Dan Kahan and his coauthors, which finds that even people who are generally good at interpreting statistics act as if they are innumerate when faced with data that goes against their political views.

Mooney summarizes the results as follows:

The study…. has an ingenious design. At the outset, 1,111 study participants were asked about their political views and also asked a series of questions designed to gauge their “numeracy,” that is, their mathematical reasoning ability. Participants were then asked to solve a fairly difficult problem that involved interpreting the results of a (fake) scientific study. But here was the trick: While the fake study data that they were supposed to assess remained the same, sometimes the study was described as measuring the effectiveness of a “new cream for treating skin rashes.” But in other cases, the study was described as involving the effectiveness of “a law banning private citizens from carrying concealed handguns in public.”

The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream….

[H]ow did people fare on the handgun version of the problem? They performed quite differently than on the skin cream version, and strong political patterns emerged in the results—especially among people who are good at mathematical reasoning. Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment)—an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment).

The opposite was true for highly numerate conservative Republicans: They did just great when the right answer was that the ban didn’t work…, but poorly when the right answer was that it did….

Many previous studies show that people are highly biased in their evaluation of political information, overvaluing anything that supports their preexisting views, while discounting or ignoring data that cuts the other way. Kahan’s study adds to this literature by focusing specifically on statistical reasoning. In addition, Kahan shows very effectively that the same people do a better job of reasoning when focusing on nonpolitical issues (such as the effectiveness of a skin cream) than on political ones (such as gun control).
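To make the nature of the task concrete: the problem in the study boils down to reading a 2x2 table correctly, comparing rates rather than raw counts. The short sketch below walks through that comparison; the figures in it are illustrative stand-ins, not necessarily the paper's exact numbers.

    # Sketch of the 2x2 "did the treatment work?" task described above.
    # The counts below are illustrative stand-ins, not quoted from the paper.

    used_cream_improved = 223    # rash got better, used the cream
    used_cream_worse = 75        # rash got worse, used the cream
    no_cream_improved = 107      # rash got better, did not use the cream
    no_cream_worse = 21          # rash got worse, did not use the cream

    # The correct move is to compare improvement *rates*, not raw counts.
    cream_rate = used_cream_improved / (used_cream_improved + used_cream_worse)
    no_cream_rate = no_cream_improved / (no_cream_improved + no_cream_worse)

    print(f"Improved with cream:    {cream_rate:.0%}")    # ~75%
    print(f"Improved without cream: {no_cream_rate:.0%}") # ~84%

    # More people improved with the cream in absolute terms, but the
    # improvement rate is higher without it, so the intuitive answer
    # based on raw counts is wrong.

The gun-control version of the problem simply relabels the same cells (cities that enacted the ban versus those that did not, crime up versus crime down), which is what makes the divergence in performance between the two versions so striking.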

Why the difference? The answer is that unbiased interpretation of data is costly – even on the part of people who are good at it. In addition to the time it takes to evaluate the data properly, there is also the psychological pain we suffer if the data conflicts with cherished preconceptions. In nonpolitical contexts, we are usually looking at data for purposes where our decisions make a real difference. For example, we might be deciding whether to buy a given car, television – or skin cream. The consumer who does a poor job of reasoning could easily end up with a lemon, bad picture quality, or a skin rash. That gives consumers at least some serious incentive to gather information and keep an open mind in evaluating it, even if they are still far from perfect. When it comes to politics, by contrast, the incentives are extremely weak. The voter who does a great job of evaluating data on the effectiveness of gun control has only an infinitesimal chance of actually influencing the outcome of an election in favor of the candidate with the better policies on the subject. Thus, few people devote more than minimal time and effort to either acquiring political information or analyzing it in an unbiased way.
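A rough back-of-the-envelope calculation, using entirely hypothetical numbers, shows just how lopsided these incentives are:

    # Back-of-the-envelope comparison of the voter's and the consumer's stakes.
    # Every number here is hypothetical, chosen only to illustrate the orders
    # of magnitude involved.

    # Voter: value of the better policy, discounted by the tiny chance that
    # one vote decides the election.
    p_decisive = 1 / 60_000_000   # hypothetical chance of casting the decisive vote
    policy_value = 10_000         # hypothetical personal value of the better policy winning
    voter_stake = p_decisive * policy_value

    # Consumer: the cost of a bad purchase is borne with near-certainty.
    p_bears_cost = 1.0
    lemon_cost = 3_000            # hypothetical cost of ending up with a lemon
    consumer_stake = p_bears_cost * lemon_cost

    print(f"Expected stake in getting the political data right: ${voter_stake:.4f}")
    print(f"Expected stake in getting the car data right:       ${consumer_stake:,.2f}")
    # Roughly $0.0002 versus $3,000 -- which is why careful, open-minded
    # evaluation of evidence pays off far more for consumers than for voters.

On that kind of arithmetic, spending hours scrutinizing gun-control studies is rarely worth the trouble from a narrowly instrumental standpoint, whereas scrutinizing a used car usually is.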

This kind of bias complicates efforts to alleviate political ignorance by improving education. Even if voters acquire more information, they are still likely to do a poor job of using it.

Voters are not completely impervious to new data. If the evidence is striking and unambiguous, some people will change their minds over time. Witness the decline of racism from the 1920s to the 1960s, as moderate whites gradually realized that blacks are not inferior to whites, and that giving them equal rights does not cause any great harm. The dramatic 9/11 attacks led voters who had previously ignored the threat of radical Islamist terrorism to take it more seriously (though many then began to err in the opposite direction of overestimating the danger). In general, however, people do a poorer job of evaluating information rationally when they act as ballot box voters than when they “vote with their feet” in the private sector, or in choosing which jurisdiction to live in.

UPDATE: Bryan Caplan has some additional thoughts on the study.
