Monday, October 27, 2014

Political Polarization and Confirmation Bias

Election Day is coming up a week from tomorrow, on Tuesday. Are you voting for your beliefs about the beliefs, character, and skills of the candidates? Or are you voting the party line, one more time? Here's an article I wrote for the Star Tribune newspaper in my home state of Minnesota. I have added weblinks below to several of the studies mentioned. 


"It's my belief and I'm sticking to it:
In such a polarized atmosphere, you may want to examine pre-existing biases."

By Timothy Taylor

Part of the reason American voters have become more polarized in recent decades is that both sides feel better-informed.

The share of Democrats who had “unfavorable” attitudes about the Republican Party rose from 57 percent in 1994 to 79 percent in 2014, according to a Pew Research Center survey in June called “Political Polarization in the American Public.”

Similarly, the percentage of Republicans who had unfavorable feelings about the Democratic Party climbed from 68 percent to 82 percent.

Most of this increase is due to those who have “very unfavorable” views of the other party. Among Democrats, 16 percent had “very unfavorable” opinions of the Republican Party in 1994, rising to 38 percent by 2014. Among Republicans, the share with a “very unfavorable” view of the Democratic Party rose from 17 percent to 43 percent over the same period.

A follow-up poll by Pew in October found that those with more polarized beliefs are more likely to vote. This helps explain the tone of many political advertisements: they are crafted to stir the passions of the ideologically polarized base so that those voters turn out.

A common response to this increasing polarization is to call for providing more unbiased facts. But in a phenomenon that psychologists and economists call “confirmation bias,” people tend to interpret additional information as additional support for their pre-existing ideas.

One classic study of confirmation bias was published in the Journal of Personality and Social Psychology in 1979 by three Stanford psychologists, Charles G. Lord, Lee Ross and Mark R. Lepper. In that experiment, 151 college undergraduates were surveyed about their beliefs on capital punishment. Everyone was then exposed to two studies, one favoring and one opposing the death penalty. They were also provided details of how these studies were done, along with critiques and rebuttals for each study.

The result of receiving balanced pro-and-con information was not greater intellectual humility — that is, a deeper perception that your own preferred position might have some weaknesses and the other side might have some strengths. Instead, the result was a greater polarization of beliefs. Student subjects on both sides — who had received the same packet of balanced information! — all tended to believe that the information confirmed their previous position.

A number of studies have documented the reality of confirmation bias since then. In an especially clever 2013 study, Dan M. Kahan (Yale University), Ellen Peters (Ohio State), Erica Cantrell Dawson (Cornell) and Paul Slovic (Oregon) showed that people’s ability to interpret numbers declines when a political context is added.

Their study included 1,100 adults of varying political beliefs, split into four groups. The first two groups received a small table of data about a hypothetical skin cream and whether it worked to reduce rashes. Some got data suggesting that the cream worked; others got data suggesting it didn’t. But people of all political persuasions had little trouble interpreting the data correctly.

The other two groups got tables of data with exactly the same numbers. But instead of indicating whether a skin cream worked, the labels on the table now indicated whether cities had enacted a ban on handguns, or had not, and whether crime rates had subsequently fallen, or not.

Some got data suggesting that the handgun ban had reduced crime; others got data suggesting it didn’t. The data tables were identical to the skin cream example. But people in these groups became unable to describe what the tables found. Instead, political liberals and conservatives both tended to find that the data supported their pre-existing beliefs about guns and crime — even when it clearly didn’t.
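The trap in this kind of 2x2 table is that reading it correctly requires comparing proportions, not raw counts. A quick sketch makes the point; the numbers below are illustrative stand-ins, not figures quoted from the study:

```python
# Illustrative 2x2 table in the spirit of the Kahan et al. design
# (the numbers here are hypothetical, chosen so that raw counts and
# rates point in opposite directions).
# Rows: cities that banned handguns vs. cities that did not.
# Columns: crime decreased vs. crime increased.

table = {
    "ban":    {"decrease": 223, "increase": 75},
    "no_ban": {"decrease": 107, "increase": 21},
}

def decrease_rate(row):
    """Share of cities in this row where crime decreased."""
    return row["decrease"] / (row["decrease"] + row["increase"])

ban_rate = decrease_rate(table["ban"])        # 223/298, about 0.75
no_ban_rate = decrease_rate(table["no_ban"])  # 107/128, about 0.84

# The raw counts (223 vs. 107 cities with falling crime) suggest the
# ban helped, but the rates point the other way: a larger *share* of
# no-ban cities saw crime fall.
print(f"ban: {ban_rate:.2f}, no ban: {no_ban_rate:.2f}")
```

The intuitive but wrong reading grabs the biggest number in the table; the correct reading divides within each row first. That extra step is exactly where motivated reasoning gets room to operate.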

In short, many Americans wear information about public policy like medieval armor, using it to ward off challenges.

Of course, it’s always easy to define others as hyperpartisans who won’t even acknowledge basic facts. But what about you? One obvious test is how much your beliefs change depending on the party of a president.

For example, have your opinions on the economic dangers of large budget deficits varied, coincidentally, with whether the deficits in question occurred under President Bush (or Reagan) or under President Obama?

Is your level of outrage about presidents who push the edge of their constitutional powers aimed mostly at presidents of “the other” party? What about your level of discontent over government surveillance of phones and e-mails? Do your feelings about military actions in the Middle East vary by the party of the commander in chief?

Do you blame the current gridlock in Congress almost entirely on the Republican-controlled House of Representatives or almost entirely on the Democratic-controlled Senate? Did you oppose ending the Senate filibuster back in 2006, when Democrats could use it to slow down the Republicans, but then favor ending the filibuster in 2014, when Republicans could use it to slow down Democrats? Or vice versa?

Do big-money political contributions and rich candidates seem unfair when they are on the other side of the political spectrum, but part of a robust political process and a key element of free speech when they support your preferred side?

Do you complain about gridlock and lack of bipartisanship, but then — in the secrecy of the ballot box — do you almost always vote a straight party ticket?

Of course, for all of these issues and many others, there are important distinctions that can be drawn between similar policies at different times and places. But if your personal political compass somehow always rotates to point to how your pre-existing beliefs are already correct, then you might want to remember how confirmation bias tends to shade everyone’s thinking.

When it comes to political beliefs, most people live in a cocoon of semi-manufactured outrage and self-congratulatory confirmation bias. The Pew surveys offer evidence of political segregation in neighborhoods, places of worship, sources of news — and even in whom we marry.

Being opposed to political polarization doesn’t mean backing off from your beliefs. But it does mean holding those beliefs with a dose of humility. If you can’t acknowledge that there is a sensible challenge to a large number (or most?) of your political views, even though you ultimately do not agree with that challenge, you are ill-informed.

A wise economist I know named Victor Fuchs once wrote: “Politically I am a Radical Moderate. ‘Moderate’ because I believe in the need for balance, both in the goals that we set and in the institutions that we nourish in order to pursue those goals. ‘Radical’ because I believe that this position should be expressed as vigorously and as forcefully as extremists on the Right and Left push theirs.”

But most moderates are not radical. Instead, they are often turned off and tuned out from an increasingly polarized political arena.



Timothy Taylor is managing editor of the Journal of Economic Perspectives, based at Macalester College in St. Paul.