Sex, Power, and Partisanship: How Evolutionary Science Makes Sense of Our Political Divide - Hector A. Garcia 2019

Blindness
On Blind Tribes and Becoming Sighted

Psychological researchers have demonstrated that we are far from natural Bayesian reasoners—that is, rational thinkers who start off with a hypothesis, then update our level of acceptance or rejection of that hypothesis based on incoming evidence. Rather, our thinking is clouded by an expansive array of biases that distort external reality. Of particular interest for political psychology is a kind of bias called motivated reasoning—the tendency to use reasoning strategies, such as rebutting factual information (or simply ignoring it), in order to arrive at a preexisting or emotionally preferred conclusion. This self-deluding evasion of information typically occurs outside of conscious awareness and serves to fend off negative emotions.
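To make the contrast concrete, here is a minimal sketch, in Python and not drawn from the book, of the Bayesian updating described above: confidence in a hypothesis rises or falls according to how well incoming evidence fits it, rather than according to what the reasoner would prefer to believe. The function name and the probabilities are illustrative assumptions.

```python
# Illustrative sketch of Bayesian updating (not from the book).
# Belief in a hypothesis is revised by how well new evidence fits the
# hypothesis versus its alternative; preference plays no role.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of the hypothesis after seeing the evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Hypothetical numbers: start 80% confident that a preferred candidate is
# trustworthy, then observe evidence that is far more likely if he is not.
posterior = bayes_update(prior=0.80,
                         p_evidence_if_true=0.10,
                         p_evidence_if_false=0.60)
print(round(posterior, 2))  # 0.4: an ideal reasoner's confidence drops
```

Motivated reasoning, by contrast, amounts to leaving one's confidence untouched, or even raising it, whatever the evidence shows.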

A growing body of research is revealing that voters’ decisions to support a given candidate or policy descend easily into the fog of motivated reasoning.1 Instead of updating their positions upon receiving new information, voters deflect that information in order to protect emotionally preferred political stances. Indeed, research suggests that Donald Trump may have been onto something, figuratively speaking, when at a campaign rally in Iowa during the 2016 run for president he bragged, “I could stand in the middle of Fifth Avenue and shoot somebody and I wouldn't lose any voters.”2

In one study conducted before the 2004 US presidential election, researchers presented highly partisan Republican and Democratic subjects with (partly fictional or edited) scenarios depicting either George W. Bush or John Kerry clearly contradicting their prior positions.3 For instance, a quote attributed to George W. Bush about Enron—an energy company run by CEO Kenneth Lay that became synonymous with corporate corruption when in 2001 its fraudulent financial practices were revealed—read,

First of all, Ken Lay is a supporter of mine. I love the man. I got to know Ken Lay years ago, and he has given generously to my campaign. When I'm President, I plan to run the government like a CEO runs a country. Ken Lay and Enron are a model of how I'll do that.

These quotes were followed by contradictory stances, such as, “Mr. Bush now avoids any mention of Ken Lay and is critical of Enron when asked.” Following this information, the researchers had subjects rate how much they felt the candidates had contradicted themselves and how much they agreed with an exculpatory statement. For example, “People who know the President report that he feels betrayed by Ken Lay, and was genuinely shocked to find that Enron's leadership had been corrupt.” Perhaps not surprisingly, the researchers found that both liberal and conservative subjects were less likely to agree that their own candidate had contradicted himself, and more likely to agree that the rival candidate had. They were also more likely to agree with statements that gave their preferred candidate a pass on his inconsistency. What is even more interesting about this study, however, is that it was conducted inside a functional MRI (fMRI) scanner. The researchers found that while subjects engaged in motivated reasoning, their brains showed activation not in the regions associated with “cold” reasoning, or reasoning relatively free from emotional content, but in those associated with processing the experience of punishment, pain, and fear, and with the appraisal of threatening information.

One real-world example of motivated reasoning occurred when in 2018 it was revealed that Donald Trump had had an affair with porn star Stormy Daniels shortly after his third wife, Melania, gave birth to their son, and that Trump (or, reportedly, his attorney) paid Daniels $130,000 to keep her quiet. Instead of rebuking Trump for so flagrantly violating so many cherished Christian values, Evangelicals widely gave him a pass.4 Even Tony Perkins—head of the Family Research Council, an Evangelical nonprofit that has spoken out against the human papillomavirus (HPV) vaccine on the basis that it gives women license to engage in premarital sex,5 and has suggested to the Justice Department that the availability of pornography in hotels violates obscenity laws6—said only, “All right, you get a mulligan [a golf reference, where a player is allowed to replay a stroke]. You get a do-over here.”7

It is easy to see the moral hypocrisy in such scenarios without fully appreciating its underlying psychology. Here is where evolutionary theory can help us make better sense of our political biases. Motivated reasoning is a way to avert threatening emotions. A basic evolved function of emotions is to mediate our interactions with the environment. For example, if we see a venomous snake, we may experience fear, which helps us avoid being bitten. The problem with relying on emotional reasoning, however, is that the emotion centers of our brain are ancient, often outdated, and prone to false-positive appraisals. As noted in chapter 2, snake phobias persist at relatively high base rates even in environments where the probability of encountering a venomous snake is practically nil.

But here the fear is not about snakes. The fear-driven motivated reasoning we see in politics is all too often tied to the instinctive need for tribal belonging. In the days of our distant ancestors, being rejected by the group would have been a death sentence. To put the need for the group into perspective, imagine how long it would take for the ravages of nature to kill you if you were dropped off naked and alone in the middle of the Serengeti. Moreover, the unity of the tribe was also necessary to survive other tribes. But today we carry our tribalistic psychology over into politics, despite living in increasingly interconnected societies in which insular thinking has arguably become more of a liability than an asset. And we have known for some time that group thinking can distort basic realities.

In a classic 1951 study, social psychologist Solomon Asch showed us just how much our brains can be influenced to conform to group consensus.8 Working in the aftermath of World War II, when much of the world was struggling to make sense of how ordinary people could have perpetrated the horrors of the Holocaust, Asch became interested in understanding the impact of social pressure to conform. He ran an experiment based on a visual task, presenting subjects with two cards. On the first card there was one black line. On the second card there were three black lines, one obviously the same length as the line on the first card, the other two obviously different. The subjects’ task was to say which of the three lines on the second card matched the length of the line on the first. Easy enough.

But at this point, Asch sat eight men in a circle. Seven of those men were confederates instructed to match up the wrong lines. Asch arranged for the real subject in this experiment to always respond last, which meant that he was regularly put in the position of having to directly contradict the seven men before him. What Asch found is that while a majority of subjects (68 percent) responded correctly in the face of the confederates’ mismatching, an astonishing percentage (32 percent) did not. When interviewed afterward, some “independent” subjects described the social pressure to conform: “I do not deny that at times I had the feeling: ‘to heck with it, I'll go along with the rest.’” One conforming subject replied, “I suspected about the middle [line]—but tried to push it out of my mind.” Asch concluded that those who understood they were wrong yielded out of an overwhelming need not to appear different or defective in the eyes of the group. Remarkably, a minority of the conforming subjects were completely unaware that the confederates’ answers were incorrect.

Importantly, Asch's findings suggest that conformity can be an unconscious impulse and that it can blind us to reality. His work also shows how, at other times, conformity may arise as a result of emotional pressure. Together, these findings suggest that conforming to the group may have been important to survival in our evolutionary past. Further research shows how our tribal blinders extend seamlessly into our political identities, and how easily we dispense with our deeply held convictions in order to belong.

For example, in one study, researchers presented highly partisan liberals and conservatives with two fabricated newspaper reports on welfare programs.9 One “program” was exorbitantly generous for the time, offering families with one child eight hundred dollars per month, two hundred dollars for every additional child, housing and daycare subsidies, job training, two years’ paid tuition at a community college, and two thousand dollars’ worth of food stamps. The other program was far more stringent: two hundred fifty dollars per month, fifty dollars for each additional child, partial medical insurance, and an eighteen-month limit with no possibility of reinstating aid. The researchers then asked which of the programs subjects supported. Given what we have learned, you may already have some ideas about who supported which policy. However, before subjects rated their support, they were told that House Republicans (or Democrats) strongly endorsed one of the two welfare policies and that the rival party rejected it. What the researchers found was that if subjects believed their political tribe supported a policy, they too supported it, even when it ran against the well-established ideological stances of their respective parties and presumably their own. In other words, liberals tended to support the stringent welfare policy if they believed House Democrats backed it, and conservatives supported the lavish welfare policy when told House Republicans backed it. In another experiment in the same study, which presented subjects with similar scenarios and produced similar results, subjects reported believing that their own perspectives on government had influenced them the most and that the perspectives of the lawmakers had influenced them the least, despite their having gone with the group in a way that so obviously countered their values. Put more simply, the subjects were blind to their own tribalistic blindness.

Once again, ancient dangers shape contemporary fears, and the primeval risks of alienating the tribe underlie our present-day political stance-taking. Conversely, conforming to group norms and behaviors demonstrates loyalty to the group, facilitates cooperation, and serves to inoculate against rejection by one's own clan. Essentially, our fraught history of living in violently competing tribes makes it feel good to go with the group and terrible to go against it. Moreover, in such a lifestyle there appears to have been an advantage to blocking out information that would jeopardize our standing in the group, however factual, and to simply going along with the momentum of the clan, however corrupt. Yet it is fair to say that any time an edifice of civilization has collapsed from the inside, our insular tribalistic psychology played a central role in eroding its pillars. But are there solutions? Does education hold the key? Let us consider this possibility.