Writing to Persuade: How to Bring People Over to Your Side - Trish Hall 2019

We believe what we believe
The psychology of persuasion

We all cling to our ideas and beliefs, which makes it tough to change anyone’s opinion. We all tend to think we’re smart and well informed, and we’re confident about our points of view. We resist other perspectives not because we are stupid and narrow-minded, but because there are fundamental psychological reasons for protecting our positions.

All of us, whether we’re conservatives or liberals, dislike having our positions challenged. It makes us uncomfortable, and we perceive it as a threat. We are better at finding the holes in other people’s arguments than at seeing the weaknesses in our own. Psychologists have found that we pay more attention to evidence that supports our beliefs, and we ignore evidence that goes against them—a tendency called confirmation bias. Because we seek out information that confirms what we already believe, our opinions tend to get stronger over time. We enjoy being around those who share our opinions.

We look for that community in real life and online. Paul Krugman is a wildly popular New York Times columnist and Nobel Prize-winning economist whose passionate readers comment frequently. Most of those readers agree with him—and most of the time, they are commenting because they want to be part of a community that thinks the same way.

Not only do we seek out information that supports what we believe, we put much more effort than might be considered rational into trying to rebut views that emotionally disturb us. When our confidence in those beliefs is challenged, we are likely to advocate even more strongly for them.

This tribalism, this pushing away of anything that seems intellectually threatening, might seem like an irrational way to approach the world, but it is not. We are more likely to survive if we share the opinions of a group, so confirmation bias would have provided evolutionary benefits. We cling to our views because they identify us to the outside world as being part of a group. By defining who we are and keeping us safe from outsiders, our opinions end up protecting and helping us.

Many of us take on the political coloration of our families and never change. Others take on the values and ideas of their peers. We all need to be part of a tribe, and when necessary, we will mask some of our beliefs if they put us at odds with our group. Even if my friend who belongs to a country club in a small town dominated by Republicans is open-minded enough to read the data on climate change, he will nevertheless remain silent at the postgame drink when his friends make fun of those who rant about rising seas. His peers might not believe in the urgency of climate change. If he remains silent, he loses nothing and possibly gains. More important, he isn’t doing anything to hurt efforts to deal with a changing climate; if he appeared to disagree with his friends, though, he would be hurting himself.

Many have criticized how we all live in our own bubbles today, but there are unassailable reasons for people to hang around with others who feel as they do about the big issues of their day. There is safety in numbers, and it is psychological and sometimes even physical. (And if you don’t feel that you fit in with the group that you find at work, or even the family you were born into, the odds are good that you will move on and find people who are more compatible.)

When you establish commonality, you end up adopting the ideas of those who make up that community. Identifying with a group and trying to make it even larger also increases its odds of becoming the ruling group in society. When we choose our friends, we are to a degree choosing our ideas, because we become some combination of all of them. If you think your friends support a certain policy—say that cutting taxes, by helping corporations, will end up helping all workers—you’re more likely to support that notion too. If there is an idea that you associate with people unlike you, then you will be more dubious of it.

Most people don’t base their opinions on detailed readings of experts who have studied an issue, although they might believe that they do. Our beliefs aren’t formed through a rigorous reading of information both for and against a particular issue. It’s much more emotional than that. The New York University psychologist Jonathan Haidt argues that we start with feeling and then find the reasons to support what we already believe.

Haidt, author of The Righteous Mind: Why Good People Are Divided by Politics and Religion (2012), has shown that familiarity lessens divisions. A student is more likely to become friends with the student whose dorm room is one door away than with the student whose room is four doors away. That matters in a larger social way, because Haidt’s research also suggests that people who have at least one friend from the other political party are less likely to hate the supporters of that party. As people become more familiar to us, we tend to like them more. That even extends to furniture—you will like your new couch more the longer you have it—and to animals. The dog you at first found unappealing? Proximity leads to affection, and now you take pictures of her, admiring the odd way she crosses her back legs.

We’re biologically hardwired to follow others. We like to think we’re individuals, but that’s not how we make decisions. We like to conform to what those in our tribe are doing and thinking, to avoid standing out as freaks. We assume what other people are doing is the correct behavior in a situation. It’s one of the ways we learn how to behave.

People are more likely to say yes when they see other people doing something. They might not even know them. It’s why we want to eat in the restaurant with the long line, and not the one with just a few people, and why the celebrity endorsement is powerful. A study by Noah J. Goldstein, Robert Cialdini, and Vladas Griskevicius showed that people were more likely to reuse their towels in a hotel and not request that they be washed every day when they were told that most of their fellow hotel guests had done that. Group influence is powerful. Even if you don’t know your fellow travelers in the hotel, just the fact that you’re all staying there puts you in the same group.

Most of us dislike uncertainty, and so we come up with explanations for whatever happens, and we hold on to them no matter what. People with a high need for closure make decisions quickly and resist more information. People with a low need for closure can accept much more ambiguity—and might have trouble making decisions. But whether we need closure may depend on circumstances. The social psychologist Arie Kruglanski has found that we all need more closure in times of stress and flux. For instance, he and his researchers saw that as the terrorism threat increased after September 11, support for President George W. Bush grew. People didn’t like feeling uncertain, and so they were more likely to agree with the president’s decision that something had to be done, even if that something was an attack against Iraq.

Ideology, once it takes hold, is powerful. In his book The Political Brain: The Role of Emotion in Deciding the Fate of the Nation (2007), the psychologist Drew Westen wrote about an experiment in which researchers took pictures of people’s brains as they watched videos of their preferred candidates in an election. In the videos, their candidate contradicted prior positions. As soon as the test participants realized the contradiction, the part of their brain that handles logic stopped working. They needed to keep out information that presented a problem. Other researchers have noticed our refusal to accept information that could be challenging. Holding fast to our positions makes people—well—a bit stupid. Dan Kahan, a professor of law and psychology at Yale, has found that both liberals and conservatives analyzed and answered problems regarding climate change in ways that guaranteed they would get the answers that emotionally resonated with them. In effect, their greater intelligence and critical reasoning skill, rather than driving members of those groups to converge on the best scientific evidence, magnified polarization on whether human beings were contributing to global warming.

The depth of our desire to avoid other points of view was demonstrated in a series of studies led by Jeremy Frimer at the University of Winnipeg. In one study the researchers recruited 202 Americans online and divided them into two groups, those for and those against same-sex marriage. Participants were informed that if they chose to read information in support of their position and then answer questions, they would win $7.00. If they chose to read information against their position, they would win $10.00. Most people in this study were so eager to avoid the opposite point of view that they passed up the chance to win more money. Liberals and conservatives were equally strong in their desire to avoid information that might challenge their views. The scientists repeated the experiment with another 245 Americans and got similar results. And they found the same results with other issues. People just generally wanted to hear from people who agreed with them on questions ranging from important political ones to minor day-to-day decisions, like whether to have a Pepsi or a Coke.

Why do we stick so tenaciously to our views? It’s partly because we don’t want to try to hold two opposing beliefs in our minds. And it’s partly because we just don’t find it pleasant to acknowledge that we all hold such different ideas. People don’t like to regret their choices. It’s not in us. We don’t want to see evidence that a decision we made might have been mistaken, so we just choose not to see. No one wants to put aside their own ideas. It takes effort. At the same time, we don’t understand why other people don’t walk away from beliefs we find so obviously wrongheaded.

It’s much more psychologically comfortable to stick with our beliefs.

You don’t simply believe what you believe forever. People do change. For a period of time that seems incomprehensible, we might ignore overwhelming evidence that a certain notion is wrong, but eventually we will no longer be able to bear the contrast between what we want to believe and what is clearly true. So we flip.

Even if you succeed in changing someone’s mind, don’t expect them to admit that they changed their beliefs. They will even lie to themselves to protect the sanctity of those beliefs, as voters did in the 1960s. John F. Kennedy barely won the popular vote for president. And yet, after he was assassinated, millions of people decided that they had voted for him—64 percent of the people told pollsters that Kennedy got their vote. Because their sense of the correct belief had changed, they rewrote their own history.