We live in an increasingly polarized society. How do we reverse this trend? My reflection on this topic keeps taking me back to the basic question raised in the sociology of knowledge: How do we know what we think we know? During college in the late 1970s, I read Peter Berger and Thomas Luckmann’s classic The Social Construction of Reality. It is one of the most important books I have ever read: their account of how we construct and reinforce reality through social interaction gave me a lifelong interest in the field.
If studying this field has taught me anything, it is that coherence of ideas is not enough. I value coherence, but we must test ideas in the real world. And yet the way I go about testing ideas will be influenced by the socially constructed reality in which I live. There is no complete escape from our psychosocial context, but we can stretch our understanding.
For these reasons, I relish the opportunity to discuss topics with people of differing perspectives. Unfortunately, many of the topics that most interest me are bristling with political implications. Civil conversation is difficult. Observations that challenge conventional understanding typically provoke derisive banter instead of substantive dialog. Dispassionate presentation of factual information with measured commentary does the same thing. No matter what I try, it is hard to keep dialog dispassionately focused on the substance. Why?
I think economist Timothy Taylor has some great insight. In his post Political Polarization and Confirmation Bias he writes:
Part of the reason American voters have become more polarized in recent decades is that both sides feel better-informed.
The share of Democrats who had “unfavorable” attitudes about the Republican Party rose from 57 percent in 1994 to 79 percent in 2014, according to a Pew Research Center survey in June called “Political Polarization in the American Public.”
Similarly, the percentage of Republicans who had unfavorable feelings about the Democratic Party climbed from 68 percent to 82 percent.
When you “feel” better informed, you tend to be more confident about your views and more dismissive of your opponent’s views. But are we truly better informed?
A common response to this increasing polarization is to call for providing more unbiased facts. But in a phenomenon that psychologists and economists call “confirmation bias,” people tend to interpret additional information as additional support for their pre-existing ideas.
One classic study of confirmation bias was published in the Journal of Personality and Social Psychology in 1979 by three Stanford psychologists, Charles G. Lord, Lee Ross and Mark R. Lepper. In that experiment, 151 college undergraduates were surveyed about their beliefs on capital punishment. Everyone was then exposed to two studies, one favoring and one opposing the death penalty. They were also provided details of how these studies were done, along with critiques and rebuttals for each study.
The result of receiving balanced pro-and-con information was not greater intellectual humility — that is, a deeper perception that your own preferred position might have some weaknesses and the other side might have some strengths. Instead, the result was a greater polarization of beliefs. Student subjects on both sides — who had received the same packet of balanced information! — all tended to believe that the information confirmed their previous position.
A number of studies have documented the reality of confirmation bias since then. In an especially clever 2013 study, Dan M. Kahan (Yale University), Ellen Peters (Ohio State), Erica Cantrell Dawson (Cornell) and Paul Slovic (Oregon) showed that people’s ability to interpret numbers declines when a political context is added.
In this second study, the exact same numbers were used to make the case for the efficacy of a skin cream and for the efficacy of gun control. In the former case, respondents accurately interpreted the numbers; in the latter case they could not, claiming the numbers supported their pre-existing understanding when clearly they did not.
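The trap in that task is a ratio comparison. A minimal sketch with hypothetical counts (not the study’s actual data) shows why the intuitive shortcut misleads:

```python
# Hypothetical 2x2 outcome table (illustrative only, not the study's data):
# one group used the skin cream, the other did not.
with_cream = {"improved": 200, "worse": 100}
without_cream = {"improved": 80, "worse": 20}

def improvement_rate(group):
    """Fraction of the group whose rash improved."""
    return group["improved"] / (group["improved"] + group["worse"])

rate_with = improvement_rate(with_cream)        # 200/300, about 0.67
rate_without = improvement_rate(without_cream)  # 80/100, exactly 0.80

# The intuitive shortcut compares raw counts (200 improved vs. 80) and
# concludes the cream works. Comparing rates shows the opposite:
print(rate_with < rate_without)  # True: patients did better WITHOUT the cream
```

The correct inference requires comparing rates across groups, not raw counts. The study’s finding was that respondents who could manage this comparison for a skin cream often failed at the identical arithmetic once the numbers were framed as gun-control data.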
Now stop!!! What are you thinking about this very second? If you are like most of us, you are likely thinking about personal experiences where you witnessed this in others. If you are a liberal, you are likely thinking of those Fox News-watching Neanderthals denying climate change. Or if you are conservative, those bleeding-heart mush-heads who think the government can provide quality healthcare. If so, then you are missing the point! The issue is how you and I engage in confirmation bias. We all do it. Yet by definition, it is hard to detect because it happens at a subconscious level.
… But what about you? One obvious test is how much your beliefs change depending on the party of a president.
For example, have your opinions on the economic dangers of large budget deficits varied, coincidentally, with whether the deficits in question occurred under President Bush (or Reagan) or under President Obama?
Is your level of outrage about presidents who push the edge of their constitutional powers aimed mostly at presidents of “the other” party? What about your level of discontent over government surveillance of phones and e-mails? Do your feelings about military actions in the Middle East vary by the party of the commander in chief?
He lists other examples. Then this:
Of course, for all of these issues and many others, there are important distinctions that can be drawn between similar policies at different times and places. But if your personal political compass somehow always rotates to point to how your pre-existing beliefs are already correct, then you might want to remember how confirmation bias tends to shade everyone’s thinking.
When it comes to political beliefs, most people live in a cocoon of semi-manufactured outrage and self-congratulatory confirmation bias. The Pew surveys offer evidence on the political segregation in neighborhoods, places of worship, sources of news — and even in who we marry.
I would add two more observations. First, we do not hold our political views in a vacuum. We tend to associate with people similar to us and to build community on shared values. Our views become part of an integrated web of factors that gives us identity and a sense of community, and lends coherence to the world around us. The more deeply embedded we are in a community, the more strongly the validity of our positions is reinforced. Because of this, changing our position on an issue is rarely just an intellectual exercise.
A change in position can pose a significant existential threat, with substantial consequences for our relationships and sense of well-being. Keep in mind that Americans today say they are more reluctant to marry someone of a differing political party than someone of a different religion. What would it mean to change your political views in such a marriage? Furthermore, it is one thing to learn that I have been using the wrong skin cream. It is another to find out that, as the compassionate, justice-embracing person I believe myself to be, the fair-trade coffee I have been enthusiastically promoting is little more than a marketing ploy, or that an abstinence program I have championed has no impact on teen pregnancy. What does that do to my personal identity? Changing views has deeply personal and emotional consequences.
Second, as I was preparing this post I came across the article Nonpolitical Images Evoke Neural Predictors of Political Ideology. The authors write:
Accumulating evidence suggests that cognition and emotion are deeply intertwined, and a view of segregating cognition and emotion is becoming obsolete. People tend to think that their political views are purely cognitive (i.e., rational). However, our results further support the notion that emotional processes are tightly coupled to complex and high-dimensional human belief systems, and such emotional processes might play a much larger role than we currently believe, possibly outside our awareness of its influence.
This is critical. When I initiate discussions about economics or demography, I very often get an emotional response. Why? And why do I respond the same way? Sometimes it is because I do not have the time or the expertise to grasp what was said. I turn to heuristics as a shortcut, making intuitive assessments about what someone said based on experience in other contexts.
At an almost unconscious level, I reason from experience that someone who talks about topic X and uses certain phrases or reasoning patterns also holds a collection of other viewpoints. I then surmise what a person is really getting at. I put that assessment through an emotional filter based on how I feel about this type of person. If my feelings are positive, then I congratulate her on a well-reasoned argument. If my feelings are negative, then I congratulate myself for being sensible and I go to work postulating how she became so silly or malicious. In either case, actual reasoning about the subject matter is minimal. The truth is that emotion figures into all our assessments, and it is probably best to be a little more humble about our own reasoning abilities and less hard on emotional responses from others.
Taylor closes his piece with this:
Being opposed to political polarization doesn’t mean backing off from your beliefs. But it does mean holding those beliefs with a dose of humility. If you can’t acknowledge that there is a sensible challenge to a large number (or most?) of your political views, even though you ultimately do not agree with that challenge, you are ill-informed.
So Taylor has offered some thoughts about polarization and confirmation bias, and I have added a couple of additional wrinkles. I appreciate Taylor’s call to focus first on the log in my own eye: I need to be more self-aware of my own proclivities, and I could often use more humility. What else? How can we reduce polarization and confirmation bias? What do you think?