Why facts matter, even when they don’t - Tips on writing

Writing to Persuade: How to Bring People Over to Your Side - Trish Hall 2019


If you force me to listen to facts that I don’t want to hear, I will probably reject them. If you try to make me look at information about my favorite candidate that seems to contradict previous stances, I’ll just let the logical part of my brain go dead and emotions will take over.

Our brains don’t like information that contradicts what we believe to be true. We also tend to remember information that matches the biases we already have. You want me to know that Hillary Clinton took millions in campaign donations from the finance industry even while promising to regulate Wall Street? I’ll just continue to support her and ignore your facts. You want me to know that Donald Trump hired many immigrants in his own company even as he proposed sharp reductions in immigration? I’ll just figure he had a good reason to do what he did.

We have a funny relationship with facts. We are more likely to share made-up stories on social media than stories that are indisputably true. One study by researchers at the Massachusetts Institute of Technology found that false news moves faster on Twitter than true news, because people prefer it and find it more interesting. They found that it took true stories about six times as long as false ones to reach 1,500 people. And this result applied to every subject area, from celebrity gossip to scientific findings.

Our internet free-for-all creates especially troubled times, because we all have megaphones for our ideas, whether real or fabricated. One astonishing incident was the supposed child abuse ring that Hillary Clinton was running out of a pizza parlor in Washington. Articles about the ring began appearing before the presidential election and, although quickly debunked, they continued to spread. False tweets attributed to a made-up congressman claimed that the debunking was itself a lie. Fake articles linked the child abuse ring to a global pedophilia ring. The story rang very true to Edgar M. Welch, a young father from North Carolina. He drove six hours to the pizza joint, where he fired his assault rifle in an attempt to free the supposedly captive children. The police arrested him. And that—sadly for him—was true, as was the sentence of a four-year jail term.

That incident shows how tough it is to stamp out fake news when the facts align with what some people believe must be true. Some continued to cling to the story about Hillary Clinton’s child abuse ring by insisting that the man who had been arrested was actually an actor hired by the “mainstream media”—an insult meant to include newspapers and television stations seen as liberal.

Facts don’t change our minds when something seems right. But even though the web has enabled speedy sharing of false stories, lies are not new. When newspapers were thriving, many cities had at least two dailies—New York had many more. With ferocious competition for readers, press lords indulged in what was called yellow journalism, putting out made-up stories to attract readers and make money. Journalism became more respectable in the mid-twentieth century as advertising rather than money from readers became a steady source of revenue. Advertisers didn’t want to be associated with risky stories.

I feel like a sucker for only recently understanding that facts are not all they’re cracked up to be. Having spent my life gathering and evaluating facts, naturally I believed in the power of, well, truth. The year 2016 did me in, as it did others. After the British voted to leave the European Union and Americans elected Donald Trump president, the Oxford Dictionaries chose post-truth as the word of the year for 2016. Both elections had been filled with lies, but the voters didn’t seem to care. Sometimes facts even backfired, which wasn’t surprising to political scientists and psychologists who have studied this phenomenon. They know that we don’t respond as expected when confronted with facts; often, when someone challenges our views by pointing out errors, we not only don’t listen, we cling more strongly to the views we already have.

As a journalist who came of age in the 1970s, with the inspiration of Woodward and Bernstein finding the truth and bringing down a president who had lied, I had a passion for facts and never wanted to make a mistake. Mistakes seemed so permanent. Once I wrote a story and it went into print, that was that. It would be there forever, in the library in a bound volume for people to read for years. There was no going back and fixing it. I also worked for places that stressed the importance of facts. Most of my friends who were reporters at the Wall Street Journal were terrified of ever having corrections, because if you had too many you would be quietly let go. The Times was a bit looser; it didn’t routinely fire people for having too many corrections, but corrections were a public humiliation.

In 2008 it turned out that the writer of a book who had been profiled in one of the sections I supervised was a fabricator. I was devastated and asked my boss if he wanted me to resign. I think he thought I was crazy. Many others, including all the reviewers of the book as well as the woman’s publisher, had been taken in by her stories. But that’s how deeply I have always felt the need to publish bulletproof facts. I never lost my terror of making mistakes, which I think was a good thing.

But even I have had to give a little. As digital journalism spread and pushed out print, and publications had to post continuously and quickly, it became clear that a rigid approach to every detail would hinder journalism’s survival. There were more mistakes; and although on the best sites they were corrected, people were forced to accept that journalism was not only the first draft of history, but a rough draft. Accuracy still matters to serious journalists, but no editor anywhere can be both speedy and perfectly accurate, so something has to give. Just as digital technology changed journalism, it transformed the culture. All of us have become journalists, in a sense—we share our truths on social media and style our food, our gardens, and our children for photographs on Instagram. Even when I’m just posting a picture of my white dahlia, I move the vase just so, to get the best angle and block out any distracting background. We all have audiences, and we are all publishing for them.

But we are presenting the truth of our lives in only the most limited way. We are showing a partial truth. It’s not that facts never matter; they do, when a jury, for instance, finds that someone violated the law and will go to jail for some number of years. But that truth coexists with a larger, more complex social “truth.” Our social truth is woven of lies—like the guy who can hardly stand his wife but keeps posting tributes to her on Facebook, presenting the image of a contented couple. Social media propels a personal definition of truth, so that now some people think that if they believe something could be true or ought to be true, it is.

Some people say we’re living in an atypical era when feelings are more potent than facts, but that’s a misunderstanding of how our brains function. Facts are always less important than feelings. What’s new is not our relationship with facts, but our ability to spew falsehoods to a bigger audience than just our neighbors. For decades, academics have shown our ability to disregard facts in coming to our conclusions. If we believe, for instance, that immigrants are more likely to commit crimes, every instance of such a crime solidifies our opinion. In case after case, media and politicians give outsized attention to crimes committed by immigrants: a recent event involved the murder of a college student in Iowa, allegedly by an immigrant from Mexico. The murder fits into the narrative of people who believe that immigrants, particularly those from Mexico and Central America, are a problem, and it will not matter how many times the public is told that immigrants do not commit more crimes than the native-born. When Mexicans are labeled as rapists, that notion sticks—even though their jobs in the United States, and their arrest records, more accurately indicate they should be called farmers.

It’s not realistic to think that if only people knew the truth, they would do the right thing. Although democracy rests on the idea that well-informed people will do a better job with government, and that knowledge and information lead to smarter decisions, educated people are no more likely than uneducated people to let facts influence them. Whether we’re liberals or conservatives or something else altogether, we go to great lengths to defend what we believe in by fending off the facts that challenge it. In numerous studies, researchers have presented study participants with accurate information on emotionally charged issues like stem cell research, tax reform, and the war in Iraq. They find that when corrections go against what people already believe, those people tend to have even more faith in the erroneous information. Essentially, corrections backfire and do not “cure” misinformation, because people want to protect their beliefs from facts that might do them harm.

People don’t like to admit they’re wrong. It’s a defense mechanism. It’s threatening to acknowledge that we believed something that is actually false. So we resolve that problem by clinging to our beliefs and ignoring the facts. We go out of our way to avoid facts that might put us in that uncomfortable situation. We read and watch information that supports our beliefs. It’s easy to be wrong about lots of things—like the idea that Barack Obama was not born in the United States, even though his birth certificate shows that he was. But if you don’t like the idea that a man with a name like Obama is president, then you find a way to ignore the facts of that case, and you will find plenty of other people online who will support your position by arguing that the birth certificate is phony. Since in theory anything can be phony, or made to seem phony through digital manipulation, each of us can cling to our own personal truth.

How could we as a species have made so much progress, evolving from hunters and gatherers to beings that can create computers and robots, when we are capable of ignoring obvious truths? Scientists believe that this tendency has a benefit and an evolutionary purpose. In The Knowledge Illusion: Why We Never Think Alone (2017), Steven Sloman and Philip Fernbach point out that individually, we all—well, most of us—know little. We depend on the knowledge of others to get through life. I don’t have to know anything about climate science to conclude that the planet is in danger, because I’m relying on information gathered by—or even just repeated by—those I trust.

This information-sharing process is efficient. I can save time by relying on information obtained by others, whether it’s true or false. That way each of us can specialize in what we do best, and thus the whole society progresses. Besides, we’re all naturally a little lazy and prefer to make decisions using as little information as possible. Most people don’t want to dig and dig and dig, ask around, read more. We want things to be simpler than that. We like an easy answer, and it doesn’t matter if that answer is based on a distortion.

Sometimes this refusal to acknowledge the validity of research and the unwillingness to personally evaluate evidence can have dangerous consequences, as when people decide not to vaccinate their children against diseases like measles. The concept of vaccines is scary, and there’s no reason to belittle the fear; you’re going to give me a bit of a deadly disease, and I’m supposed to think that’s just fine? In rare cases, vaccines do have side effects. People have feared them going back to the arrival of the smallpox vaccination in the 1800s in England and the United States. But without vaccinations, smallpox wouldn’t have been eradicated in those countries.

Our biological wiring also allows fake statistics to become embedded in our brains. The more we hear something, the more likely we are to believe it is true. And that belief has consequences in the world. Thanks to our fear of terrorism, we have as a nation spent more than $2 trillion on wars in the Middle East. But we have done nothing against Saudi Arabia, the country of origin of many of the terrorists who attacked the United States on September 11, 2001. We wanted action; we wanted it emotionally; but because we weren’t going to attack an ally, we had to attack someone, and the facts didn’t matter.

Americans are much more afraid of terrorism than we are of car accidents, even though car accidents are far, far more likely to kill us. And I say that as someone who was anxious after the 9/11 attacks. I was nervous on the subway. I was afraid the city would be attacked again. That fear wasn’t irrational; we had been attacked, and it made sense to be afraid of more. But the society-wide fear made the facts about who had carried out the attack somehow irrelevant and spawned a paramilitary society with a permanent army at perpetual war.

Our misplaced fears are intensified because politicians and media focus on the aberrant—it is, after all, more exciting—and so people fear the unusual to an extent that’s way out of proportion to the actual danger. Once you scare people, that scary thought will more easily come to mind. When we hear about a frightening event, our animal, emotional brains exaggerate its likelihood. The fear strengthens the memories, and so we overestimate the odds of unusual things and underestimate the odds of day-to-day risks.

Just hearing something again and again makes it lodge in your brain. When that happens, the idea is tough to shake. The Berkeley linguist George Lakoff explains that repetition activates the same neural structures repeatedly, and the more a neural structure is activated, the stronger it gets. By joining fear and repetition, it’s possible to string words together in such a way that they become linked in our brains, and it is tough to unlink them—as in “Crooked Hillary” or “radical Islamic terrorists.” Those who find a way to embed false ideas are smart because those opinions are represented in the brain by strong neural circuitry and are not likely to change easily, even when people are faced with evidence that their initial stance was based on faulty data.

Does this discussion leave you morose, suspecting that facts are never influential? If so, hold on. There are countless examples of the effectiveness of truth, whether in the hands of amateurs or professionals. Sometimes, “true” facts do become embedded in our brains, and sometimes “true” facts do displace false notions. One of my favorite stories of fact winning out over fiction involves the experience of some high school students in Pittsburg, Kansas, who dug into the academic and professional history of their new high school principal. They discovered that she had gotten her master’s and doctorate at a private college that was just a place to buy degrees. They published their story, and the principal resigned. The facts were irrefutable, and they won the day.

In your efforts to write something memorable and persuasive, look for surprising facts. Even one simple fact can be used to startle audiences and get many readers, as Fast Company did with a brief article explaining that the richest 1 percent of people on earth control more than half the wealth.

Facts can also alter behavior. Millions of people in the last forty years have started exercising and stopped smoking. Millions have started eating fresh green vegetables and stopped drinking sugary sodas. Soft drinks, a concentrated and regular source of sugar for many people, have been falling in popularity for more than a decade, so the information is sinking in.

But while facts can change behavior, they don’t usually change it on their own. They have to be paired with peer pressure, social norms, and emotional appeals. People see what they drink as a definer of their tribe and expression of their values. So now, rather than carrying cola in a plastic bottle, they might choose to carry a BPA-free water bottle. I grew up drinking diet sodas, and now I wouldn’t dream of having one. I think artificial sweeteners are bad for me; others might disagree, but that’s the fact I’m clinging to, and the one that changed my behavior.

Surprising facts can change minds, if they are surprising to the person who is hearing them. In a Times op-ed in 2018, as the battle over immigrant rights became more intense near the midterm congressional elections, Debbie Weingarten wrote that the U.S. government had denied the passport applications of her young children. She said they had been born at home in Arizona, delivered by a midwife. The government, apparently worrying that midwife births were just a way to forge documents for people born in Mexico, had refused to send the passports unless the parents sent more evidence of their citizenship.

That essay shocked me because I hadn’t heard about the discrimination against citizens delivered by midwives.

The writer said that in states bordering Mexico, the government was denying thousands of passport applications for children whose birth certificates stated they had been delivered by midwives in their homes. Would the children’s passports have been denied if their last name hadn’t been Hispanic?

There might have been news stories about this practice; there might even have been dozens; but it was new to me. I don’t know how this information will affect me going forward. But sometimes a fact can lodge in the brain and influence behavior for decades. An editor friend of mine remembers a long-ago article in The New Yorker by Michael Kinsley that changed the way he thought about polling. In the piece, Kinsley wrote about a poll that asked people whether they thought the United States was spending too much, too little, or just the right amount on foreign aid. The poll also asked them what they thought the government was spending. Most thought the government was spending far more than it actually was. That article, my friend said, showed him that there’s little value in asking people questions about matters they know nothing about, unless you’re trying to determine their level of knowledge. Because of that article, whenever someone proposes a poll as part of an article, he considers whether the people being surveyed will be responding out of ignorance. If they will be, he more often than not nixes the poll. Why ask people questions about something they know nothing about?

Your writing can influence people in a similar way if you do research that uncovers surprising facts—facts that will make your reader think, “Wow, there might be something here.” Don’t just say that it would be great to make politicians in the United States responsible for their actions. Dig around. Maybe you need some interesting details and facts from other countries that would help you suggest a different route here. For instance, politicians in Singapore get bonuses—or not—based on how well the economy performs.

Investigate the facts that underlie various points of view. If you research the opposition, you will be familiar with the evidence used to make that case. Find flaws in that evidence. This is where facts matter. You can find chinks in the opposition’s evidence. Think about how students on debate teams prepare. They don’t know until the last minute which side they will take, so they have to understand the claims supporting each side. Let’s say you are preparing to debate whether we should increase U.S. military force against Syria. There is a pro and a con to that question, and you can study each side in preparing your argument. Don’t assume that you understand the other point of view; you probably don’t.

Once you understand both sides, dive in and present your side, reaching the person with both emotional and factual arguments. When you have good evidence, it’s a lot easier to counter other people’s claims while supporting your own. Don’t repeat their falsehoods, because you don’t want to give them even more air time and allow them to strengthen their hold on your audience. Reframe the argument.

Even though facts have their limits, your facts must be right. Many people insist that the lies put out by Donald Trump prove that you can “get away” with lies indefinitely. But he is only persuading those who already agree with him. And people who grow to see that he is lying tend to become more critical of him and less likely to give him the benefit of the doubt. If your facts are wrong, you will turn off anyone who knows the subject you are writing or talking about. If you assert that more than half of the schoolchildren in the United States are reading below grade level, and one of your readers in the education field knows that the true number is closer to a third, you have lost a potentially influential reader right there. Because of that error, that reader will not believe anything else you have to say.

One of my most important jobs as the leader of the Times Op-Ed department was to ensure that facts were checked. It surprised me to learn that the same set of facts could be used to reach opposite conclusions. A conservative and a liberal could use the same facts to make contrasting arguments. And that’s fine. But the facts did not change—only the perspective on them was different.

Some of my favorite people over the years I worked in journalism have been fact-checkers. Many of them move on to write novels or become editors; but when they are fact-checkers, they develop a rigorous approach to each word that is worth emulating if you want your facts to stand up under scrutiny. Fact-checkers look at words in a particular way. They aren’t, at least generally, worried about how a sentence sounds. They are obsessed with meaning and accuracy. At the Times, the magazine and Op-Ed were the heaviest users of full-time fact-checkers, because those departments used so many outside writers.

Use the techniques of fact-checkers. When you check your work, underline or put a check mark by each word or phrase as you confirm that it is correct. That way you won’t miss anything. You’d be surprised how many times you can look at a name, be convinced that it is spelled correctly, and turn out to be wrong.

Minimize your mistakes, and you will bolster your own credibility. Which is weird, I admit, given that academics have consistently found that people don’t hear facts the way you might think they do. But they still get annoyed when they know something is not true.

Part of using facts well is to respect them. Don’t use possibly false anecdotes to make your point. Don’t cherry-pick evidence. Understand that some of your readers will catch you out if you do. With all the iffy sources out there and the ability of the internet to spread lies and dubious claims, it can be challenging to be sure of your facts. In the next section you will find a list of guidelines for fact-checking, partly based on a list with examples prepared by Kevin McCarthy and Gita Daneshjoo, two of the fact-checking editors in Op-Ed when I was there.

Tips for Researching and Fact-Checking

Always look for reliable sources. We’ve all fallen victim to simple untruths that are relatively easy to check. At the Times, we did not even consider Wikipedia a source that could be trusted, even though it is worth starting there and using the footnotes as a point of departure. Here is an example of how a small erroneous detail in a story led to an entertaining correction in the Times:

CORRECTION: An Op-Ed essay on Monday described bald eagles and ospreys incorrectly. They eat fish, and their poop is white; they do not eat berries and excrete purple feces. (Other birds, like American robins, Eurasian starlings and cedar waxwings, do.)

Look for several sources for each critical fact, not just one. Finding something in many places doesn’t guarantee it’s true, but if those places tend to be credible sources, the information is more likely to be true.

CORRECTION: An Op-Ed article on Sunday about Arizona and immigration mistakenly suggested that javelinas are pigs. They are peccaries.

Look out for typos, transposed numbers, and mathematical errors. This is especially important in an argument that relies on numbers. A math mistake can bring you down in the eyes of the editor or professor who is reviewing your piece. Do not combine details from various scenarios to make one point, because you then risk inserting errors like this one that appeared in the Times:

While the writer did fly economy class from New York to Miami recently (on Delta), he was offered a cranberry-almond bar, not a Luna bar—nor blue potato chips and popcorn (which he was offered on a JetBlue flight to Mexico). In addition, the writer observed flight attendants distributing amenity kits to first- and business-class passengers on an American Airlines trans-Atlantic flight, not the flight to Miami.

Avoid random blogs, and rely as much as possible on academic research and government reports by nonpartisan research arms. Look at specialized sources, not just the first page that comes up in a search. Journalists pay to use LexisNexis, but anyone can use Google Scholar. You will get surprisingly different results. Do a search on Google of “persuasion and how best to change people’s minds,” and your first two results are a story from the Washington Post and one from Psychology Today. They are helpful—but Google Scholar looks in a different universe and presents you with an excerpt from a book by Howard Gardner and an article from the Journal of Consumer Research. Also look in Google Books, which will give you access to parts of books that might be relevant to your work.

If there’s an interesting fact sourced to an academic article, read the article, not just the summary. Or interview the author. That way you will protect yourself from misinterpreting something critical in the study that might be the basis for your argument.

CORRECTION: An opinion essay on May 13 about ethics and capitalism misstated the findings of a 2010 study on psychopathy in corporations. The study found that 4 percent of a sample of 203 corporate professionals met a clinical threshold for being described as psychopaths, not that 10 percent of people who work on Wall Street are clinical psychopaths. In addition, the study, in the journal Behavioral Sciences and the Law, was not based on a representative sample; the authors of the study say that the 4 percent figure cannot be generalized to the larger population of corporate managers and executives.

If someone has a financial interest in something, be dubious.

If something sounds like it can’t possibly be true, be suspicious.

Aggressively question facts you agree with, not the ones that go against your biases. You are more likely to believe what you already agree with. Distrust your tendency to believe people who are “on your side,” and look for facts from a variety of sources.

Use stories to enliven your writing, but be wary of generalizing from one fact. Remember that the story of one person—an anecdote about, say, a woman who could not get an abortion because of protesters outside of a clinic—is just one fact. Ideally, you will take data from peer-reviewed studies that looked at large numbers of people. Look for consensus among the experts in a field. If most of them agree on something, the odds are you can use that fact.

______

Remember that we human beings are contradictory. We like to share stories of dubious accuracy, but we also love catching people out when they get something wrong. Be careful, too, in your headlines and the other text you use to sell your argument. Some people clearly don’t care much about accuracy, but studies show that you can hurt your reputation by distorting the truth and trying to manipulate people. So in your search for facts, avoid misleading headlines and the understandable impulse to go for clicks and only clicks. People don’t like to be fooled. Ultimately it doesn’t pay to use little white lies.

Eventually facts can have an effect, but it often happens after the fact. Americans continued to believe that Iraq had weapons of mass destruction even after that was proven wrong. They wanted to believe their president. Over time they began to doubt the story of the weapons, but they found other reasons to justify their country’s presence in Iraq, or they simply decided to oppose the war.

When the emotional power of something fades a little, people are more likely to be open to a different reality. But that takes time. Meanwhile, be sure you are the one who is writing with passion, power, and facts that no one can dispute.