
 By Ray Williams

February, 2022

 

There is a commonly held belief that the way to change people’s opinions or views is to present them with the facts and scientific evidence. Yet, recent research shows this strategy does not work.

Rebuttals can sometimes backfire, leading people to double down on their original position. A new paper published in Discourse Processes suggests why: when people read information that undermines their identity, it triggers feelings of anger and dismay that make it difficult for them to take the new facts on board.

This partially explains why there is such a proliferation of fake news and false information on serious problems facing us.

The Influence of Fake News and Misinformation

Did Pope Francis endorse Donald Trump? No. And yet, millions shared this story on social media, and many believed it. Why? The proliferation of fake news. What’s fake news? Stories that are presented in such a way that they appear to be legitimate “headlines” but are totally made up by groups trying to sell a point of view without citations or supporting evidence. In 2016, Edgar Welch walked into a Washington DC pizzeria and opened fire because he thought Democratic Presidential nominee Hillary Clinton was running a child sex trafficking ring there. A fake story “informed” him about the conspiracy. ABC News reports that in a text message to his girlfriend that day, Welch wrote he had been researching the “Pizzagate” conspiracy theory and it was making him “sick.” In a text to a friend, Welch allegedly wrote that the “cause” was “raiding a pedophile ring, possibly sacrificing the lives of a few for the lives of many.”

Starting early in his presidency, Trump seized upon the words “fake news” and shaped them into a cudgel he incessantly wields. He has routinely tweeted against the “fake news” media when it has the temerity to fact-check a multitude of erroneous claims he has made; doled out “fake news” awards to outlets whose coverage he thinks is hopelessly biased against him; and looked on as a series of autocrats and strongmen abroad aped his rhetoric, invoking “fake news” to argue away documented reports of ethnic cleansing, torture and war crimes.

The rapid proliferation of a totally fake news story or groundless claim can make it go viral, turn it into a public pronouncement, and sway public opinion. Further, some Trump appointees, and their kin, directly promoted these fake news stories and sites on Twitter. General Michael Flynn’s son, for instance, continued to tweet the Pizzagate story even after Welch went into the pizzeria, fired off several shots, and was arrested. This constitutes irresponsible behavior at the very least. Flynn himself has promoted this and other claims without evidence.

When government officials like Flynn share these baseless claims and fake news stories as fact, their proliferation can be even more rapid, as people are even more likely to deem such claims “reliable” or “trustworthy” given the source. Unable to evaluate the veracity of these tweets and posts, many people can become “sick,” angry and outraged over what amounts to disinformation, or something totally made up. Right before the election, a Facebook user posted that Hillary Clinton “bathed in the blood of murdered children in Satanic rituals.”

Climate Change

Exposure to fake news about climate change may impact people’s belief in human-caused climate change and weaken their perceptions of the scientific consensus on climate change. New research from Arizona State University Assistant Professor Caitlin Drummond evaluates how a short exposure to fake news headlines affects people’s scientific beliefs and attitudes.

At the end of 2019 and beginning of 2020, the world was shocked by the raging Australian bushfires. But beyond the goodwill gestures and stories of bravery was a political battle about the causes of the fires. The non-believers in climate change attributed the fires to arson. Misleading stats even made their way into a UK House of Commons speech, prompting a group of scientists to write in the Guardian: “The claim that arson is a primary cause of this season’s bushfires has been comprehensively debunked: fire officers report that the majority of blazes were started by dry lightning storms. Nevertheless, social media is awash with false claims about the role of arson, obscuring the link between climate change and bushfires.”

In 2008, Maxwell Boykoff, who is now a University of Colorado professor, published a study in the journal Climatic Change that looked at news programs on ABC, CNN, NBC, and CBS from 1995 through 2004. He found that 70 percent of the networks’ global warming stories “perpetuated an informational bias” by including the unscientific views of climate skeptics. In another study published in 2004, Boykoff looked at coverage in major newspapers from 1988 through 2002 and found that half of the 636 randomly selected articles gave roughly the same attention to skeptics’ arguments about the supposedly natural causes of climate change as they did to the scientific consensus that humans are warming the planet.

Despite clearly verified data supported by 97% of climate scientists, climate change deniers, many of whom are in positions of power in government, politics and business, still refuse to acknowledge the truth.

COVID Deniers

An April 2020 article in the conservative National Post stated that “in the midst of a global pandemic, conspiracy theorists have found yet another way to spread dangerous disinformation and misinformation about COVID-19, sowing seeds of doubt about its severity and denying the very existence of the pandemic.”

Since March 28, conspiracy theorists — coronavirus deniers — have been using the hashtag #FilmYourHospital to encourage people to visit local hospitals to take pictures and videos to prove that the COVID-19 pandemic is an elaborate hoax. The premise for this conspiracy theory rests on the baseless assumption that if hospital parking lots and waiting rooms are empty, then the pandemic must not be real or is not as severe as reported by health authorities and the media. This empty-hospital conspiracy theory joins a parade of false, unproven and misleading claims about the virus that have been making the rounds on social media, including allegations that 5G wireless technology somehow plays a role in the spread of the COVID-19 virus, or that consuming silver particles or drinking water with lemon prevents or cures the disease. None of these claims are true.

And then there are the COVID vaccine deniers and conspiracy theorists.

David Gorski, writing in the publication Science-Based Medicine, explains: “One aspect of the COVID-19 pandemic that I haven’t really written much about yet is the developing unholy alliance between COVID-19 deniers (who peddle in conspiracy theories and falsely claim that the disease isn’t that bad and/or that the lockdowns and social distancing are not—or no longer—necessary and should be lifted to alleviate the catastrophic damage to our economy that mitigation efforts have unavoidably caused) and the anti-vaccine movement (which predictably peddles misinformation and conspiracy theories about how COVID-19 is being weaponized as a plot to impose forced universal vaccination—or even that the disease was created by Bill Gates for that very purpose, along with H1N1 influenza and Ebola!—or how the influenza vaccine supposedly makes one more susceptible to coronavirus; spoiler alert: It doesn’t).

Although it might seem odd, those of us who’ve studied conspiracy theories for a long time almost immediately realized that an alliance between the anti-vaccine movement and COVID-19 deniers would be entirely natural, and expected it. Both groups of conspiracy theorists share an intense distrust of government, particularly the CDC and FDA. Both share an equally intense distrust of big pharma, while glorifying individual freedom above all else, with anti-vaxxers invoking ‘health freedom’ and ‘parental rights’ and COVID-19 deniers invoking absolute bodily autonomy and the ‘right’ to do whatever they want, including violating social distancing. Both groups’ beliefs are rooted in conspiracy theories, with the central conspiracy theory of the anti-vaccine movement being that the government, big pharma, and the medical profession have evidence that vaccines cause autism and harm but are covering it up, and COVID-19 deniers’ beliefs based in one or more of several conspiracy theories (e.g., the virus is an escaped bioweapon; was created by Bill Gates to impose universal forced vaccination; isn’t nearly as bad as seasonal flu; and several more). Both have a tendency towards germ theory denial, laboring blissfully under the delusion that, because they are so ‘healthy,’ because they live such exemplary lifestyles, exercise, and eat the ‘right’ foods, they are not at risk: COVID-19 deniers don’t think that COVID-19 is a threat to them or their loved ones, just as anti-vaxxers don’t think vaccine-preventable diseases are a threat to them and their loved ones either.”

The Impact of Misinformation Online

Cailin O’Connor and James Owen Weatherall, writing in Scientific American, argue that misinformation spreads effectively because it is quickly shared by people with their friends and peers on social media platforms. People tend to trust information posted online by those they are affiliated with, without going through a process of fact-checking. The authors state: “Putting the facts out there does not help if no one bothers to look them up. It might seem like the problem here is laziness or gullibility — and thus that the solution is merely more education or better critical thinking skills. But that is not entirely right. Sometimes false beliefs persist and spread even in communities where everyone works very hard to learn the truth by gathering and sharing evidence. In these cases, the problem is not unthinking trust. It goes far deeper than that.”

Gordon Pennycook and David G. Rand argue in their study published in Cognition that people with higher analytic thinking, regardless of their degree of partisanship, are less susceptible to fake news. They contend that we need to make better efforts to improve the critical thinking skills of students and the public, as well as to create and support credible information sources for the general public.

Stephan Lewandowsky and colleagues argue in their article in Psychological Science in the Public Interest: “The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation.”

The researchers asked the question “What causes the persistence of erroneous beliefs in sizable segments of the population? Assuming corrective information has been received, why does misinformation continue to influence people’s thinking despite clear retractions?”

The Societal Cost of Misinformation

It is a truism that a functioning democracy relies on an educated and well-informed populace. The processes by which people form their opinions and beliefs are therefore of obvious public interest, particularly if major streams of beliefs persist that are in opposition to established facts. If a majority believes in something that is factually incorrect, the misinformation may form the basis for political and societal decisions that run counter to a society’s best interest; if individuals are misinformed, they may likewise make decisions for themselves and their families that are not in their best interest and can have serious consequences. For example, following the unsubstantiated claims of a vaccination-autism link, many parents decided not to immunize their children, which has had dire consequences for both individuals and societies, including a marked increase in vaccine-preventable disease and hence preventable hospitalizations, deaths, and the unnecessary expenditure of large amounts of money for follow-up research and public-information campaigns aimed at rectifying the situation.

Reliance on misinformation differs from ignorance, which is the absence of relevant knowledge. Ignorance, too, can have obvious detrimental effects on decision making, but, perhaps surprisingly, those effects may be less severe than those arising from reliance on misinformation. Ignorance may be a lesser evil because in the self-acknowledged absence of knowledge, people often turn to simple heuristics when making decisions. In other words, ignorance rarely leads to strong support for a cause, in contrast to false beliefs based on misinformation, which are often held strongly and with (perhaps infectious) conviction. For example, those who most vigorously reject the scientific evidence for climate change are also those who believe they are best informed about the subject.

A related but perhaps more surprising source of misinformation is literary fiction. People extract knowledge even from sources that are explicitly identified as fictional. This process is often adaptive, because fiction frequently contains valid information about the world. By definition, however, fiction writers are not obliged to stick to the facts, which creates an avenue for the spread of misinformation, even by stories that are explicitly identified as fictional. A study showed that people relied on misinformation acquired from clearly fictitious stories to respond to later quiz questions, even when these pieces of misinformation contradicted common knowledge. In other words, encountering misinformation in a fictional context led people to assume they had known it all along and to integrate this misinformation with their prior knowledge.

Governments and Politicians

In the lead-up to the U.S.-led invasion of Iraq in 2003, U.S. government officials proclaimed there was no doubt that Saddam Hussein had weapons of mass destruction (WMDs) and was ready to use them against his enemies. The Bush administration also juxtaposed Iraq and the 9/11 terrorist attacks, identifying Iraq as the front line in the “War on Terror” and implying that it had intelligence linking Iraq to al-Qaida. Although no WMDs were ever found in Iraq and its link to al-Qaida turned out to be unsubstantiated, large segments of the U.S. public continued to believe the administration’s earlier claims, with some 20% to 30% of Americans believing that WMDs had actually been discovered in Iraq years after the invasion and around half of the public endorsing links between Iraq and al-Qaida. These mistaken beliefs persisted even though all tentative media reports about possible WMD sightings during the invasion were followed by published corrections, and even though the nonexistence of WMDs in Iraq and the absence of links between Iraq and al-Qaida was eventually widely reported and became the official bipartisan U.S. position.

Politicians were also a primary source of misinformation during the U.S. health care debate in 2009. Misinformation about the Obama health plan peaked when Sarah Palin posted a comment about “death panels” on her Facebook page. Within 5 weeks, 86% of Americans had heard the death-panel claim. Of those who heard the myth, fully half either believed it or were not sure of its veracity. Time magazine reported that the single phrase “death panels” nearly derailed Obama’s health care plan.

Mainstream Media

First, the media can oversimplify, misrepresent, or overdramatize scientific results. Science is complex, and for the layperson, the details of many scientific studies are difficult to understand or of marginal interest. Science communication therefore requires simplification in order to be effective. Any oversimplification, however, can lead to misunderstanding.

Second, in all areas of reporting, journalists often aim to present a “balanced” story. In many instances, it is indeed appropriate to listen to both sides of a story; however, if media stick to journalistic principles of “balance” even when it is not warranted, the outcome can be highly misleading. For example, if the national meteorological service issued a severe weather warning for tomorrow, no one would—or should—be interested in their neighbor Jimmy’s opinion that it will be a fine day. For good reasons, a newspaper’s weather forecast relies on expert assessment and excludes lay opinions.

A major Australian TV channel recently featured a self-styled climate “expert” whose diverse qualifications included authorship of a book on cat palmistry. This asymmetric choice of “experts” leads to the perception of a debate about issues that were in fact resolved in the relevant scientific literature long ago.

After the invasion of Iraq in 2003, the American media attracted much censure for their often uncritical endorsement of prewar claims by the Bush administration about Iraqi WMDs, although there was considerable variation among outlets in the accuracy of their coverage, as revealed by survey research into the persistence of misinformation. Stephen Kull and his colleagues  have repeatedly shown that the level of belief in misinformation among segments of the public varies dramatically according to preferred news outlets, running along a continuum from Fox News (whose viewers are the most misinformed on most issues) to National Public Radio (whose listeners are the least misinformed overall).

The growth of cable TV, talk radio, and the Internet has made it easier for people to find news sources that support their existing views, a phenomenon known as selective exposure. When people have more media options to choose from, they are more biased toward like-minded media sources. The emergence of the Internet in particular has led to a fractionation of the information landscape into “echo chambers”—that is, (political) blogs that primarily link to other blogs of similar persuasion and not to those with opposing viewpoints. More than half of blog readers seek out blogs that support their views, whereas only 22% seek out blogs espousing opposing views, a phenomenon that has led to the creation of “cyber-ghettos.” These cyber-ghettos have been identified as one reason for the increasing polarization of political discourse.

So why doesn’t factual and scientific information convince people to change or abandon their conspiracy theories or false beliefs?

Past research has suggested that one reason changing minds is so challenging is that exposing someone to a new perspective on an issue inevitably arouses in their minds the network of information justifying their current perspective. An arms race ensues: when the new complex of information overwhelms the old, often by integrating some of the existing information (yes, yoghurt contains bacteria, but bacteria can be helpful), persuasion is possible. If not, the attempt fails, or even backfires, as the old perspective is now burning even more fiercely in the person’s consciousness.

However, the new research led by Gregory Trevors and his colleagues, published in Discourse Processes, was motivated by the idea that the backfire effect may not be about which side is winning that mental arms race at all. Instead, these researchers believe the problem occurs when new information threatens the recipient’s sense of identity. This triggers negative emotions, which are known to impair the understanding and digestion of written information.

Trevors’ team tested their theory with a study on genetically modified foods – a subject rife with misconceptions, such as that hormones are involved in making them. The researchers assessed 120 student participants for their prior knowledge of and attitudes to genetically modified organisms (GMOs) and their need for dietary purity, measured by items like “I often think about the lasting effects of the foods I eat.” This was the key variable of interest because it was intended to tap into how important food purity was to the participants’ sense of identity. The researchers specifically wanted to find out whether this identity factor would influence how people felt when their beliefs were challenged, and whether they would comply with, or resist, the challenge.

After the researchers gave participants scientific information worded to directly challenge anti-GMO beliefs, those with higher scores in dietary purity rated themselves as experiencing more negative emotions while reading the text, and in a later follow-up task, they more often criticised GMOs. Crucially, at the end of the study these participants were actually more likely to be anti-GMO than a control group who were given scientific information that didn’t challenge their beliefs: in other words, the attempt to change minds with factual information had backfired.

In further analysis, the researchers directly tested the claim that the identity factor had disrupted the learning of new pro-GMO information, but there was no evidence for this. Although negative emotions were weakly associated with lower post-test learning on a short quiz, participants at all levels of dietary purity performed at a similar (poor) level.

So we can reasonably conclude from this study that threats to a person’s identity do cause resistance to taking new factual arguments on board, and we know negative emotions seem to play a part, but we need more research to fully understand why this leads to a backfire effect.

If persuasion is most at risk of backfire when identity is threatened, we may wish to frame arguments so they don’t strongly activate that identity concept, but rather others. And if, as this research suggests, the identity threat causes problems through agitating emotion, we may want to put off this disruption until later: Rather than telling someone (to paraphrase the example in the study) “you are wrong to think that GMOs are only made in labs because…”, arguments could firstly describe cross-pollination and other natural processes, giving time for this raw information to be assimilated, before drawing attention to how this is incompatible with the person’s raw belief – a stealth bomber rather than a whizz-bang, so to speak.

According to a new study, 43 per cent of the US population believes wrongly that the flu vaccine can give you flu. In fact any adverse reaction from the vaccine, besides a temperature and aching muscles for a short time, is rare. It stands to reason that correcting this misconception would be a good move for public health, but the study by Brendan Nyhan and Jason Reifler published in Vaccine found that debunking this false belief had a seriously counterproductive effect.

The researchers looked at 822 US adults who were selected to reflect the general population in terms of their mix of age, gender, race and education. About a quarter of this sample were unduly concerned about the side effects of the flu vaccine. It is amongst these individuals that attempting to correct the myth that the flu vaccine gives you flu backfired. The researchers showed participants information from the Centers for Disease Control and Prevention (CDC), which was designed to debunk the myth that the flu vaccine can give you flu. This resulted in a fall in people’s false beliefs but, among those concerned with vaccine side-effects, it also resulted in a paradoxical decline in their intentions to actually get vaccinated, from 46 per cent to 28 per cent. The intervention had no effect on intentions to get vaccinated amongst people who didn’t have high levels of concern about vaccine side effects in the first place.

Why is it that as false beliefs went down, so did intentions to vaccinate? The explanation suggested by the researchers is that the participants who had “high concerns about vaccine side effects brought other concerns to mind in an attempt to maintain their prior attitude when presented with corrective information”. A psychological principle that might explain this behaviour is motivated reasoning: we are often open to persuasion when it comes to information that fits with our beliefs, while we are more critical or even outright reject information that contradicts our world view.

This is not the first time that vaccine safety information has been found to backfire. Last year the same team of researchers conducted a randomised controlled trial comparing messages from the CDC aiming to promote the measles, mumps and rubella (MMR) vaccine. The researchers found that debunking myths about MMR and autism had a similarly counterproductive result – reducing some false beliefs but also ironically reducing intentions to vaccinate.

Taken together, the results suggest that in terms of directly improving vaccination rates, we may be better off doing nothing than using the current boilerplate CDC information on misconceptions about vaccines to debunk false beliefs. If this is the case then the ramifications for public health are huge, but before we can decide whether this conclusion is accurate we’ll have to wait to see if the finding can be replicated elsewhere. History has taught us that when it comes to vaccines, acting on scant evidence can have catastrophic consequences.

In their book The Enigma of Reason, cognitive scientists Hugo Mercier and Dan Sperber argue that we use reasoning to justify beliefs we already hold and to make arguments to convince others. They say that this facilitates social cooperation, but may not be effective in establishing the “truth.” And that’s because, as psychological scientists have shown, we are susceptible to a myriad of cognitive distortions, such as confirmation bias, in which we seek out the information that confirms what we already believe and screen out conflicting information.

In politics, this is particularly relevant, as politicians often distort the truth, ignore facts and create fake news or misinformation to support their beliefs.

A number of studies document the many ways in which our political affiliations distort our reasoning. One study by Dan M. Kahan and colleagues published in Behavioural Public Policy found that people who had strong math skills were only good at solving a math problem if the solution to the problem conformed to their political beliefs. The researchers found that participants in the experiment who were defined as “Liberals” were only good at solving a math problem, for instance, if the answer to that problem showed that gun control reduced crime. In the same experiment, the researchers found that participants who were defined as “Conservatives” were only good at solving this problem if the solution showed that gun control increased crime.

Another study by D. N. Perkins and colleagues, published in Informal Reasoning and Education, found that the higher an individual’s IQ, the better they are at coming up with reasons to support a position—but only a position that they agree with.

NYU psychology professor Jay Van Bavel explains the results of studies like these with his “identity-based” model of political belief: “Oftentimes, the actual consequences of particular party positions matter less to our daily lives than the social consequences of believing in these party positions. Our desire to hold identity-consistent beliefs often far outweighs our goals to hold accurate beliefs. This may be because being a part of a political party or social group fulfills fundamental needs, like the need for belonging, which supersede our need to search for the truth.”

Ignoring information we disagree with, particularly through social media, watching only one biased news broadcast (read Fox News here), or surrounding ourselves only with friends who agree with us can lead us to hold narrow and even false beliefs. A study by Serge Moscovici and Marisa Zavalloni published in the Journal of Personality and Social Psychology showed that group discussions can lead people to hold more extreme beliefs than they would on their own—a phenomenon known as group polarization.

People May Change Their Views But Not Their Behavior

We would assume that, in the presence of information that corrects mistaken beliefs, individuals would alter their behavior. However, they often don’t. One study by Thomas Wood and Ethan Porter found that correcting people’s false belief that vaccines cause autism didn’t actually encourage some parents to vaccinate their children. A study by Brendan Nyhan and colleagues published in Political Behavior found that correcting false beliefs about Donald Trump caused people to change their beliefs, but they were still prepared to vote for him anyway. In other words, while you can get people to understand the facts, the facts don’t always matter.

Elizabeth Kolbert, writing in The New Yorker, recounted a Stanford University study in which the participants in the experiment were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.

In Denying to the Grave: Why We Ignore the Facts That Will Save Us, Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, examine what science tells us and what we tell ourselves. They point out that many persistent beliefs are not just false, but also potentially deadly to our health, such as the conviction that vaccines are hazardous. The reality, of course, is that not being vaccinated is a danger to our health. “Immunization is one of the triumphs of modern medicine,” the Gormans note. But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved.

The Gormans argue that self-destructive ways of thinking have become adaptive and have  a physiological component. They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them. There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous. (Another widespread but statistically insupportable belief they’d like to discredit is that owning a gun makes you safer.) They found in their research that providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. “The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”

What Can Be Done About It?

Steven Sloman and Elke U. Weber, writing in the Behavioral Scientist, and also in Cognition: International Journal of Cognitive Science, shared some psychological insights on how to improve political discourse and reduce bias and polarization. They make the point that changing people’s views involves changing societal norms. They say “providing accurate information does not promote clear, non-partisan thinking, because people use motivated cognition to deploy the fallacious reasoning that supports their beliefs.” They cite the work of Gordon Pennycook and David Rand, who argue that the ability to distinguish real from fake news is governed by one’s ability to reason. People fall for fake news when they fail to engage in sufficient critical thinking. Hence, to help people recognize and reject misinformation, we should teach them (or nudge them) to slow down and think critically about what they see on social media.

Sloman and Weber also suggest that news broadcasters should act on the basis of the balance of evidence, rather than the balance of opinion, when reporting scientific issues. All too often, news reports present a false equivalence in which a fringe opinion is given the same weight as the balance of scientific evidence.

People deploy motivated reasoning to use their cognitive processes in ways that support their beliefs. There is considerable evidence that people are more likely to arrive at conclusions that they want to arrive at, but their ability to do so is constrained by their ability to construct seemingly reasonable justifications for these conclusions. This means that just providing people with more information and facts about a case is unlikely to change their conclusion if it rests on a long-held or deeply held belief. One possible solution is to structure conversations and discussions in ways that encourage people to come to them with an open mind. For example, in deliberative democratic models, group discussions are often held in private, which gives people more freedom to express their opinions, but also to keep an open mind, as they do not fear immediate backlash from their own group members if they change their stance. It is also helpful to probe each other by asking questions, not by dismissing a view or argument. Perhaps the most effective way is to get people to empathize by having them role-play as an advocate of the opposing side, or as an “unbiased expert.”

Have you noticed that when you present people with facts that are contrary to their deepest held beliefs, they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason is related to the worldview perceived to be under threat by the conflicting data.

Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Antivaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae like the melting point of steel in the World Trade Center buildings that caused their collapse because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the ppm of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president’s long-form birth certificate in search of fraud because they believe that the nation’s first African-American president is a socialist bent on destroying the country.

In these examples, proponents’ deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slayed. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance, or the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.

In their 2007 book Mistakes Were Made (But Not by Me), two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at the bottom opposite corners of the base as they each stake out a position to defend.

Elizabeth Svoboda, writing in the University of California, Berkeley’s Greater Good, observes that our opinions are often based in emotion and group affiliation, not facts, and offers advice on how to engage productively when things get heated.

While it’s easy to conclude that people’s views are barometers of their moral elevation, the more nuanced truth is that a broad range of factors help explain deeply entrenched beliefs. Certainly, some partisans are focused on policy issues above all else. But for others, the brain’s tendency to stay the course may play a larger role. Psychological research suggests that once our minds are made up on important matters, changing them can be as difficult as stopping a train hurtling at full speed, even when there’s danger straight ahead.

Fortunately, research also hints at solutions—though you may need to change your mind about some things if you want to put these insights to work!

Why we resist facts

Most of us have a strong drive to hold on to pre-existing beliefs and convictions, which keep us anchored in the world. When your stance on controversial issues both cements your group identity and plants you in opposition to perceived enemies, changing it can exact a high personal toll.

“We are social animals instinctively reliant on our tribe for safety and protection,” says risk perception expert David Ropeik, author of How Risky Is It, Really? “Any disloyalty literally feels dangerous, like the tribe will kick you out. This effect is magnified in people already worried.”

Defection, in short, feels as terrifying as stepping off a window ledge—and to a certain extent, this fear is justified. When you think and behave in ways that separate you from members of your close community, you’re likely to experience at least some level of exclusion.

There’s a certain amount of plain old inertia at work, too. Researchers who study how people resolve cognitive dissonance—the uneasy feeling of holding inconsistent beliefs—note that most people would rather deny or downplay new, uncomfortable information than reshape their worldview to accommodate it. From that perspective, it’s less surprising that your friend whose behavior toward women is above reproach is more than willing to support politicians who’ve committed sexual assault.

Even lukewarm advocates can be resistant to updating their beliefs, since the very act of deciding between alternatives changes the way we evaluate each option.

One classic study had subjects look at an array of home gadgets and rate their desirability. After they had made a decision about which one to take home as a gift (say, the fluorescent desk light), their opinion of the item they’d chosen tended to rise, while their opinion of left-behind items soured.

In most situations, viewing your own choice through rose-colored glasses is a sensible way of ensuring you stay happy with your decision. But this outlook also skews your perception, meaning that even when you learn eye-opening new information, you may not feel alarmed enough to reconsider your views.

The backlash effect

When doubts do creep in, they can have a paradoxical effect, leading people to dig in their heels even more.

“The attacks against Trump have taught me something about myself,” one Donald Trump supporter told blogger and entrepreneur Sam Altman. “I have defended him and said things I really didn’t believe or support because I was put in a defensive position.”

Research bears out the idea that arguers’ outward insistence may be inversely related to their actual conviction. In one Northwestern University study, the less confident people felt in their opinions about hot-button issues (whether animal testing is OK, for instance), the more they labored to convince others of their chosen view.

If doubt often prompts people to double down rather than reflect, does that mean it’s futile to start a dialogue with those you disagree with? Typical debates, as you’ve probably discovered, aren’t all that effective—and if you start with the explicit goal of changing someone’s mind, you’re likely to get the opposite result. The reverse is also true: The less you try to force a particular set of views on someone, the freer they’ll feel to reflect honestly on what they think—and maybe even revise their thinking down the line.

Productive exchange is also more likely when there’s a mutual foundation of respect and friendship. In a project called Between Americans, Seattle-based author and artist Boting Zhang is chronicling the evolution of relationships between Trump supporters and Hillary Clinton supporters over the course of a year. Participants share with each other about more personal topics, but hot-button issues do pop up from time to time—and the strength of the participants’ bond often determines the conversation’s direction.

“You definitely need to know the other person as a person to want to stay engaged when things get controversial,” Zhang says. In exchanges about fraught topics, she advises starting from the assumption that you won’t change the other person’s views, but she admits that may be easier said than done. “That balance between caring deeply, yet seeking to listen rather than change someone’s mind, is a knife’s-edge balance!”

How to build bridges

Like the participants in Zhang’s project, you can strive to understand your conversation partners in ways that go beyond their views on controversial issues.

Talk to them about their early years, or about the biggest personal challenge they’ve faced. Their answers may give you unexpected insight into why they behave as they do—and perhaps make it easier for you to empathize with them, despite your misgivings about particular views they may hold.

When touchy topics do arise, try a non-confrontational approach, asking open-ended questions (“How did you feel when you heard about the U.S. pulling out of the Paris climate accords?”) or sharing your own experiences (“Someone groped me at work and no one believed me when I told them.”)

Whether you’re discussing the personal or the political, steer clear of language and behavior that signals contempt. Debaters tend to show contempt for their sparring partners in any number of ways—rolling their eyes, flinging personal insults, and deploying cutting sarcasm (witness these gambits on social media in spades). Psychologist John Gottman has identified this argumentative style as poisonous to close relationships, in part because it conveys a devastating message: “You, your thoughts, and your views are utterly beneath me.”

Cutting out contempt doesn’t mean tiptoeing around the issues: It’s healthy to lay out exactly how you differ with someone else and express your disappointment, or even devastation, with particular views they hold. The key is to remain in the debate zone rather than crossing the line into not-so-veiled disgust.

No matter how high-minded your intentions, it can be tempting to turn any dialogue on the issues into a game of one-upmanship.

But asking questions—and showing a genuine desire to hear and acknowledge the answers—sets a different tone that boosts the odds of a productive resolution, or at least a friendlier stalemate that inspires further thought and discussion. Persuasion that endures isn’t a one-sided sales job, but a fertile exchange—one in which your own thinking may evolve in ways you hadn’t expected.

Jeff Valdivia at the University of Queensland, in his article “Why Facts, Alone, Don’t Change Peoples’ Minds in Our Big Public Debates,” describes how, twenty-five hundred years ago, Plato wrote an allegory about the relationship between rationality and emotion. In it, he described a chariot being pulled by two horses. The charioteer — the driver — represented intellect and reason, while the two horses represented our emotions. It’s challenging for us to steer the horses in the proper direction, Plato concluded, but it’s ultimately the driver — our rationality — that must do so.

At least in the West, this allegory is deeply ingrained in our cultures. We believe that it is up to our “rational minds” to overcome our “emotional minds” so that we can be good and upstanding people.

But, Valdivia asks, what if Plato’s allegory isn’t an accurate description of how our minds work? His allegory is compelling, but he didn’t have science as we know it today. And what science is teaching us is that the interaction between our rationality and emotions may not be so simple.

In his excellent book, The Righteous Mind: Why Good People Are Divided by Politics and Religion, Jonathan Haidt presents some of these scientific findings. Importantly, he shows how emotions play a critical role in decision-making.

Science has shown our brains constantly and instantly evaluate whether we like or dislike whatever we’re experiencing, which happens without any conscious thought at all. For example, social psychologist Robert Zajonc found that study participants tended to prefer whatever they’d seen before, even if they weren’t consciously aware they’d seen it. He called this the mere-exposure effect.

In another set of experiments, researchers used a fart spray to elicit feelings of disgust in study participants. They then asked those participants a series of moral questions. They found that the people exposed to the bad smell more strongly condemned various actions than people who were not exposed to the smell. The researchers concluded that our emotional states act as information for our judgements — that is, our emotions tell us the degree to which we like or dislike something.

Does this make you think we’d be better off without emotions? Not so fast, says Haidt. Psychopaths reason but don’t feel emotions like the rest of us, and they’re severely deficient morally as a result. On the other hand, babies feel but don’t reason and have the beginnings of morality.

What should we take away from these findings? That emotions aren’t getting in our way — they’re the avenue through which we make decisions, both moral and otherwise. Neurologist Antonio Damasio found that patients with brain damage in areas that process emotions often struggle to make even the most routine decisions.

We need emotions to function properly. The question is, how can we change minds when emotions seem so irrational?

The elephant and the rider

Jonathan Haidt builds on Plato’s allegory with his own and compares the relationship between rationality and emotions to a rider sitting atop an elephant. The rider represents rationality and the elephant, emotions. Although it seems like the rider is in charge and directing the elephant, the elephant has much more control over the situation than you might think.

Emotions, Haidt argues, are strong motivators of behavior. So, when the elephant moves, the rider often has no say in the matter. And, because we’re highly social creatures that need to be able to explain to others why we behaved as we did — like, why we didn’t do the dishes — the rider has evolved to expertly justify the actions of the elephant. In this sense, Haidt says, the rider often acts like the elephant’s press secretary!

So, is that it, then? Are we destined to be at the mercy of our emotions to guide our behavior? Is the role of rationality simply to explain our behavior to other people?

Thankfully, Haidt believes, the answer is “no”. Although our emotions do significantly sway our behavior, it’s not true that reason is a slave to the passions, as the philosopher David Hume once put it. Haidt believes that the influence can go both ways, even though the elephant wins out most of the time.