Economist Tyler Cowen argues that people too readily dismiss others’ arguments:
One of the most common fallacies in the economics blogosphere — and elsewhere — is what I call “devalue and dismiss.” That is, a writer will come up with some critique of another argument, let us call that argument X, and then dismiss that argument altogether. Afterwards, the thought processes of the dismisser run unencumbered by any consideration of X, which after all is what dismissal means. Sometimes “X” will be a person or a source rather than an argument, of course.
The “devalue” part of this chain may well be justified. But it should lead to “devalue and downgrade,” rather than “devalue and dismiss.”
“Devalue and dismiss” is much easier of course, because there then will be fewer constraints on what one can believe and with what level of certainty. “Devalue and downgrade” keeps a lot of balls in the air and that can be tiresome and also unsatisfying, especially for those of us trained to look for neat, intuitive explanations.
His George Mason Economics Department colleague Bryan Caplan responds that, in reality, most people aren’t dismissive enough:
I’m tempted to object, “Thank goodness for dismissal, because most ideas and thinkers are a waste of time.” But on reflection, Tyler’s overly optimistic. Dismissing ideas often requires rare intellectual discipline. Psychologists have documented our assent bias: Human beings tend to believe whatever we hear unless we make an affirmative effort to question it. As a result, our heads naturally accumulate intellectual junk. The obvious remedy is to try harder to “take out the trash” – or refuse to accept marginal ideas in the first place.
I think both Bryan and Tyler capture some of the truth. When new information or arguments cut against our strongly held preexisting views, we do indeed tend to dismiss them too easily. For example, political partisans tend to reject or devalue anything that reflects badly on their preferred party or ideology. They even misinterpret simple statistical data of a kind they find easy to grasp in other contexts.
On the other hand, when faced with arguments or data on issues that they know little about and don’t have strong opinions on, people are indeed overly credulous, just as Bryan suggests. It’s psychologically easier to simply accept what you hear than to try to question it. Assent also requires less time and effort.
Both effects are likely to be exacerbated in contexts where we have little incentive to try to restrain our cognitive biases because there is little payoff for getting at the truth. A paradigmatic example is the way voters process political information, where incentives for truth-seeking are weak because the chance that an individual vote will influence electoral outcomes is infinitesimally small. This point is consistent with the theory of “rational irrationality” that Bryan developed in his important book, The Myth of the Rational Voter (a big influence on my own work on political ignorance). As Bryan explains, a person who acquires information for reasons other than truth-seeking is unlikely to be disciplined and unbiased in her evaluation of it. For example, “political fans” often seek out information about politics because they find it interesting or because they enjoy reinforcing their preexisting beliefs. In many cases, however, this kind of bias will take the form of excessive dismissal rather than excessive assent.
If you want to be a better truth-seeker, you should indeed work harder at taking out cognitive trash when it comes to arguments about issues on which you don’t have strong views. But when it comes to issues where you do have strong beliefs, the more important danger to guard against is unjustified dismissal of opposing viewpoints.