Politics, mainstream media, and social media are full of people who believe their own lies and propagate them readily.
Elizabeth Holmes, a biotech entrepreneur, was named the richest and youngest self-made female billionaire in 2015. She was subsequently convicted of fraud and sentenced to 11 years in prison, and her company Theranos collapsed. Then there is Anna Sorokin, also known as Anna Delvey, who stole hundreds of thousands of dollars from New York’s high society while posing as a German heiress. And Shimon Hayut, also known as Simon Leviev, the infamous Tinder Swindler.
Finally, there is former Republican President Donald Trump and his associates, who have lied repeatedly about the 2020 presidential election being “stolen” through fraud by the Democrats.
It’s not merely the falsehoods they told others, but also the ones they must have told themselves, that characterize each of these individuals. Despite all odds, they each thought their behavior was somehow acceptable and that no one would ever find out. They often displayed a personal denial of reality while enlisting others in their schemes.
You might think that this kind of behavior is confined to a few extreme circumstances and is a rather uncommon phenomenon. In fact, self-deception is extremely widespread, and it may have evolved because it offers certain advantages to the individual. To preserve our self-image, we deceive ourselves, which enables us to commit immoral acts with a clear conscience. The most recent research suggests that self-deception may even have evolved to help us convince others: if we come to believe our own lies, it becomes much easier to persuade other people of them as well.
Beyond the recent frauds that made headlines, this research may help to explain dubious behavior in many different spheres of life. Understanding the factors that contribute to self-deception can help us recognize when it may be influencing our judgment, and help us avoid making mistakes as a result of these delusions.
Preserving One’s Ego
Any psychologist will tell you it’s difficult to do a scientific study on self-deception. Since it happens subconsciously, you can’t just ask someone if they’re kidding themselves. The experiments are hence frequently rather complex.
Research by Zoë Chance, an associate professor of marketing at Yale University, has examined this issue. In a clever 2011 experiment published in Psychological and Cognitive Sciences, she demonstrated that many people unintentionally use self-deception to enhance their egos.
An IQ test was administered to one group of participants with the answers printed at the bottom of the page. As you might imagine, these individuals outperformed a control group who did not have access to the answer key. However, they didn’t seem to realize how much they had relied on the “cheat sheet”: they predicted they would perform just as well on a second test with 100 additional questions but no answer key. They had managed to convince themselves that they already knew the answers and didn’t require assistance.
Chance conducted the entire experiment again with a new group of subjects to confirm this finding. This time, the participants were offered a monetary prize for accurately predicting their scores on the second test. If the participants were aware of what they were doing, you might expect this incentive to make them less overconfident.
In actuality, it did little to deflate the participants’ inflated sense of their own intelligence; even knowing that they stood to lose money, they continued to delude themselves into believing they were smarter than they were. This suggests the beliefs were genuine, firmly held, and unexpectedly resilient.
It’s easy to imagine how this may play out in the real world. A student who cheated on an exam may feel they earned their place at a prestigious university, while a scientist who used false data may feel their results were accurate.
A variety of other situations have been noted where self-deception has been used to improve self-image.
For instance, Uri Gneezy, a professor of economics at the University of California, San Diego, recently demonstrated that it can help us justify potential conflicts of interest at work. He published his findings in the journal Games and Economic Behavior.
In a 2020 study, Gneezy asked participants to assume the roles of investment advisers or clients. The advisers were presented with two distinct investment options to weigh, each with different risks and rewards. They were also told that they would be paid a commission if the client chose one of the two investments.
In one set of trials, the advisers were informed of this potential commission at the outset, before they even began to weigh the options. While they claimed to be making the best decision for the client, they were far more likely to select the option that would benefit them personally.
In the remaining trials, however, the advisers weren’t told about the incentive until they had already had time to assess the benefits and drawbacks of each option. This time, far fewer let the reward affect their choice; instead, they stayed true to their mission of giving the client the best advice.
According to Gneezy, the fact that the participants’ decisions were affected only when they knew about the personal benefit from the start suggests that their self-deception was unconscious: it altered the way they weighed the risks and benefits without their being aware of the bias, allowing them to believe that they were still acting in the client’s best interests. In the second scenario, changing their recommendation would have required a deliberate change of heart, which would have been far harder to justify to themselves. They simply couldn’t persuade themselves that doing so would be moral, he says.
Persuading Ourselves, Persuading Others
According to Gneezy, self-deception is a strategy for defending our moral sensibilities. Even when our actions would imply otherwise, he argues, “It means that we may keep seeing ourselves as good people.”
Financial advisers may be the most obvious case for this type of self-deception, but Gneezy believes it may also be significant in the private healthcare industry. Despite having the best of intentions, a doctor may unwittingly convince themselves that the more expensive course of therapy is best for the patient – without even realizing it, he says.
The most unexpected role of self-deception may lie in our interactions with others. One prominent theory holds that self-deception makes us more convincing by enabling us to speak with greater assurance. If you truly believe a product is a high-quality bargain, even when the evidence says otherwise, you will make a better case for it when you try to sell it.
Compelling support for this notion, which was first put forward decades ago, comes from a recent study published in Nature Human Behaviour by Peter Schwardmann, an assistant professor of behavioral economics at Carnegie Mellon University in the US.
Like Chance’s study, Schwardmann’s initial investigation began with an IQ test. Afterwards, the participants were asked to privately rate their own performance; they were not shown their results. They then took part in a persuasion task in which they had to convince a panel of judges, acting as potential employers, of their intellectual brilliance. If the judges rated them among the smartest in the group, they could win 15 euros ($16).
Some participants were told about the persuasion task before rating their confidence in their test performance, while others were told only afterwards. Schwardmann found that this altered their self-assessments: those who knew in advance that they would need to persuade others showed markedly higher overconfidence in their abilities than those who did not. The need to persuade others had primed them to believe they were smarter than they were.
He calls this a kind of “reflex.” Importantly, Schwardmann’s research showed that the self-deception paid off: lying to oneself increased the participants’ capacity to persuade the mock employers.
Schwardmann has since observed a comparable process in debating competitions. At these events, participants are given the topics and then have 15 minutes to prepare their arguments before being randomly assigned a point of view to defend. They are then judged on how persuasively they argue that position during the debate.
Schwardmann tested participants’ own opinions on the topics at three points: before they were assigned a stance, after they began drafting their arguments, and after the debate itself. He found that once people learned which side they would need to advocate, their opinions shifted significantly, which is consistent with the theory that self-deception evolved to help us persuade others. According to Schwardmann, “their internal opinions shifted toward the side they’d been given just 15 minutes earlier — to match with their persuasion goals.”
Following the debate, participants could also donate small sums of money to charities of their choice from a long list of prospective causes. Schwardmann found that, even though their positions had been assigned at random, they were considerably more likely to select charities that supported the viewpoint they had argued.
Many of our opinions may have formed this way. In politics, a campaigner may convince themselves that their position is the only one that makes sense – not because they have carefully considered the evidence, but simply because they were asked to make the case. Schwardmann believes this process may be responsible for a large portion of today’s political polarization.
Delusions of Grandeur
In all of these ways, our brains can deceive us into believing false information. Self-deception lets us inflate our perception of our talents, so that we think we are smarter than everyone else. It lets us ignore the effects of our actions on other people, because we believe we are acting morally overall. And when we deceive ourselves about the truth of our views, we speak with greater conviction, which can help us persuade others.
We will never know what Holmes, Sorokin, Hayut, and other fraudsters were thinking, but it is easy to imagine how some of these mechanisms might have been at work. These con artists, at the very least, appeared to hold inflated beliefs about their abilities and their entitlement to obtain what they wanted; they also cheerfully dismissed any possible moral repercussions of what they were doing.
Holmes in particular appears to have had faith in her product and made an effort to defend her use of false information. She nonetheless asserted during her trial that “the huge medical device firms like Siemens could simply recreate what we had done,” despite all evidence to the contrary. Meanwhile, Hayut continues to insist that he is “the biggest gentleman” and that he did nothing wrong.
Schwardmann agrees that some con artists may come to inhabit their elaborate lies. He notes that some even display a form of righteous rage when questioned, which would be difficult to fake. Perhaps that means they believe their own falsehoods, he speculates.
It’s interesting to note that the desire for social status seems to make people more likely to lie to themselves. For instance, people are more likely to exaggerate their abilities when they feel threatened by others. It’s possible that we can tell ourselves bigger lies when the stakes are higher.
Most of the time, our self-deception is harmless and only serves to give us a false sense of self-assurance. But it’s always important to be conscious of these inclinations, especially when we’re considering choices that could have a major impact on our lives. You don’t want to mislead yourself about the dangers of shirking your responsibilities at work or the chances of a risky career move, for instance.
An effective strategy for exposing all kinds of bias is to “think the opposite” of your conclusions. The method is as simple as it sounds: you look for every possible reason why your view could be incorrect, as if you were cross-examining yourself. Numerous studies have shown that this leads us to think about a problem more analytically, and in lab tests this methodical reasoning is significantly more effective than merely telling people to “think rationally.”
Of course, you can only do this if you can acknowledge your shortcomings. Admitting there is an issue is the first step. Perhaps you believe that you don’t require this guidance since you are completely honest with yourself and that only other people suffer from self-deception.