
3. Once we have formulated a story, it becomes very difficult to change our minds. We over-value and over-emphasise anything that supports our story while discrediting any opposing evidence.

This is the third of a 4-part series on "Stories".

In part 1, we examine how we automatically create stories to help us understand and explain what is happening in our lives.

In part 2, we explore how some of the stories we create are inaccurate.

In part 3, we reflect on how difficult it is to change the stories that we have created.

In part 4, we look at how the stories we create go on to impact our lives.
 


Ever had an argument with a family member, friend, or colleague where you provided facts and evidence to support your case, and they didn't or couldn't? It was just angry disagreement. However patient you were, however carefully you explained, they just wouldn't listen.


The reaction we have - "how can this person possibly not accept something backed by established facts?" - is not unique to us. As it turns out, refusing to change one's mind even in the face of evidence is commonplace, even for the brightest among us. This is particularly true if someone has previously made their views public.
 

  • Over 97% of climate scientists agree that climate change is man-made, yet about 4 in 10 Americans reject this.
     

  • An alarming proportion of parents reject vaccinations for their kids, fearing side effects and harm to the child's immune system. This story is based on pseudoscience, i.e. it contradicts fundamental biology. These parents put their own children at risk of diseases that vaccines are proven to protect against, because of harms that will never happen.
    (How can we change the minds of people who are harming themselves and their loved ones because of an incorrect story? Read more about how the vaccination problem has been tackled)

     

  • The great Albert Einstein, one of mankind's most open, creative, and brilliant minds, rejected many traditional stories to come up with his own groundbreaking theories. Theories we still admire today. Yet later in life, he succumbed to a fixation on his own story, even in the face of contradictory evidence. Einstein firmly believed that the Universe is governed by deterministic laws of physics from which outcomes can be precisely calculated. Because of this deep belief, he rejected quantum mechanics, which holds that outcomes occur only with some probability, up until his death. Quantum mechanics has never been proven wrong in any experiment.
     

  • The Church in the 16th century believed firmly in the story that the Earth is the centre of the universe (the very opposite of reality). Scientists who were themselves Catholic, like Nicolaus Copernicus and Galileo, found strong evidence against this story; Galileo, most famously, was persecuted for it.
     

  • Neville Chamberlain, the former British Prime Minister, formulated a story that Hitler could be trusted not to start a war once certain demands were met. We know how that turned out.
     

  • And as we will see a little later, a fierce and vicious fight broke out within the scientific community, the very people who should be most open to new research, over whether humans can develop new neurons in our brains even as we age. Evidence for this was written off for decades, even mocked and insulted. Today, adult neurogenesis is a proven fact, and a very important finding for how we can develop ourselves (you can read more here).
     

What causes us to reject changing our minds when evidence pops up? Let's take a look.

Does evidence and expert opinion change your view? 
Cass R. Sunstein, S. Bobadilla-Suarez, S. Lazzaro, and Tali Sharot, "How People Update Beliefs About Climate Change: Good News and Bad News"

[Chart: participants split into Group A, told that climate change is much more serious than previously thought, and Group B, told that it is much less serious; each group contained both Climate Change Believers and Deniers]

The experiment:

  • Climate Change Believers and Deniers were divided into 2 groups:

    • Group A were told that, after recent evidence, scientists and experts have assessed that climate change is much more serious than previously thought

    • Group B were told the opposite, that recent evidence shows climate change is much less serious than previously thought

  • Did people change their beliefs in light of the new expert assessments? Yes and No.
     


The results:

  • People accepted the evidence only if it fit their original worldview.

    • Deniers in Group A questioned the validity of the new evidence, while Believers became more disheartened and fearful.

    • Deniers in Group B felt triumphant that they were right, while Believers questioned the validity of the new evidence.

  • When presented with new information, we tend to quickly accept evidence that confirms our existing notions (our prior beliefs) and cast a critical eye over counter-evidence, trying to find fault with it. (For contrast, the sketch after this list shows what unbiased updating would look like.)

  • In fact, presenting people with information that contradicts their opinion can cause them to come up with altogether new counterarguments that further strengthen their original view - this is known as the "boomerang effect." Curiously enough, the more intelligent people are (admittedly by traditional measures like IQ tests), the more capable they are of rationalizing and interpreting information at will, and of creatively twisting data to fit their opinions. Ironically, intelligence is used not to draw more accurate conclusions, but to find fault in data they are unhappy with.

  • Have you ever argued with someone, or seen online debates, where the more evidence is produced, the more resistant people become? Sometimes it gets pretty ridiculous, as Mr Bean will attest.

 

That's not Mr Bean! (Video: 1 min 18 s)
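For contrast, it may help to see what unbiased updating would look like. Below is a minimal sketch, our own illustration with made-up numbers rather than anything from the study, of Bayes' rule in odds form: an ideal reasoner shifts by the same factor for equally strong evidence, whichever way it points.

\[
\underbrace{\frac{P(H \mid E)}{P(\lnot H \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(H)}{P(\lnot H)}}_{\text{prior odds}}
\times
\underbrace{\frac{P(E \mid H)}{P(E \mid \lnot H)}}_{\text{likelihood ratio}}
\]

For example, a believer starting at P(H) = 0.8 (odds of 4:1) who sees evidence twice as likely if H is true (likelihood ratio 2) should move to odds of 8:1, i.e. P(H|E) = 8/9 ≈ 0.89. Equally strong evidence against H (likelihood ratio 1/2) should move them to odds of 2:1, i.e. P(H|E) = 2/3 ≈ 0.67. The study found instead that people moved much further when the evidence pointed the way they already leaned.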


Where was he born?


Where was Barack Obama born? Somehow, this became a multi-year debate fuelled by a large number of conspiracy theories. Even after his full birth certificate was released, polls showed that up to 20% of Americans still did not believe that their President was born in America. It even sparked its own Wikipedia page, and provided us with hilarious interview segments (excerpt from The Daily Show; full video here).

 

(Video: 1 min 11 s)

Some of us might already be able to name the effect at play in the examples above: confirmation bias. When we hold an existing belief, confirmation bias causes us to notice far more readily any evidence that supports it. The most difficult stories to change are incorrect ones that contain some amount of truth.

But why do we reject evidence that contradicts our original story? Let's take a look into the world of cognitive dissonance. 

Cognitive Dissonance (Festinger and Carlsmith, Stanford University)
(This is a very famous study that pioneered cognitive dissonance theory. You can read the full paper on the study here. Festinger is the 5th most cited psychologist of the 20th century.)


The experiment

  • Participants were asked to perform a series of dull tasks for about an hour (repeatedly loading spools onto a tray and turning pegs a quarter-turn), intended to bore them.

  • When the tasks were completed, the experimenter appealed to the participant to do a short briefing for the next waiting subject.

  • Participants were promised either a $1 or a $20 incentive (randomly assigned) if they told the next waiting subject that the tasks were really interesting.

  • After the participants had done the briefing, they were asked about their own honest assessment of whether they found the tasks enjoyable:

    • Those paid $20 openly shared that they found the tasks terribly boring

    • Those paid $1 shared that they enjoyed the tasks and found them interesting

  • What?? What happened?? Why would people who were paid $1 insist they enjoyed tasks they clearly didn't? And why were those paid more honest about it?


Understanding Cognitive Dissonance

 

  • None of the participants enjoyed the tasks.

  • So when they described the tasks to the next batch of subjects, everyone lied. 

  • They were then asked to share their honest review. If they had been asked their opinion before having to brief the subjects, their answers would all likely have been the same: the tasks were boring.

  • But having taken action, we naturally feel the need to justify what we have done.

  • For those paid $20, the incentive was enough to justify the act of lying, i.e. "I was willing to tell the new subject it was enjoyable even when I didn't actually think so, because I received a good incentive." The dissonance they felt was resolved by the money they received.

  • But for those paid just $1, it was much harder to justify to themselves (and to the experimenter) why they had willingly told such a big lie for such a small incentive, i.e. "I thought the tasks were really boring, yet I told the subjects the tasks were interesting for just a small token of $1." This dissonance caused them great discomfort.

  • And so these participants sought to resolve their dissonance. They convinced themselves that they really did find the tasks enjoyable. That way, no existing belief was violated: if they "really enjoyed" the tasks, then they did not lie, and they did not act against their own views for a meagre sum of $1.

  • The dissonance is resolved. 
     

Or as Festinger puts it even more starkly:


"The general principle seems to be that people come to believe in and to love the things they have to suffer for."


Cognitive dissonance comes into play regularly in our lives:

  • We want to lose weight
  • But we've eaten far too much this month
  • So we tell ourselves we will diet harder next month

It could be a change that we want to make in life:

  • We're unhappy with our job

  • But we're scared to make a change

  • So we convince ourselves that the opportunity that came along wasn't right for us.

We see this in our interactions with friends and family:

  • We want to spend more time with the people who matter

  • But hey, it's a busy period, colleagues are working hard, and we don't want to look bad in comparison

  • So we console ourselves that we had good intentions, but it's just a bad period.


We also see cognitive dissonance in larger arenas. For example, we often think less of people who make a major U-turn after openly declaring their position, and we view politicians who change their position as less trustworthy. But think about it: this is quite an incomplete way to judge, isn't it? On one hand, we want politicians to keep their promises; on the other, what is the probability that they have always been right about everything? Shouldn't we acknowledge that they can keep up with new evidence? Another example is our bosses: do you know a boss who never changes their mind, no matter what is presented, especially after they have publicly stated their position?


We experience discomfort when there is disharmony between what we believe and how we actually behave, or between our beliefs and new information presented to us. To reduce this discomfort, we sometimes change our story into one that is harmonious with how we act. (In our chapter on how the brain is wired for survival, we also examine how the brain feels threatened when we are wrong, or when we think others think we are wrong.)

To reduce the discomfort we have, we often choose comforting lies to explain things away, rather than face unpleasant truths.

So far we've learnt 2 major reasons why it's very difficult to change our minds once we have formulated a story:
 

1) Confirmation bias - we look for evidence that supports our views and ignore or discredit contradictory evidence
 

2) Cognitive dissonance - we experience great discomfort when there is disharmony between our beliefs and our behaviour, or between our beliefs and new information presented to us. To reduce this discomfort, we choose to explain the unpleasantness away, rather than seek out the truth.


One additional factor: Do we think we know more than we actually do?

Q1. Would you rate yourself better than average at driving?


Q2. How would you rate yourself in your company compared to others?


Q3. How financially literate are you? If you're financially literate, you will be familiar with these terms:

[Image: a list of financial terms]

Knowing how good we are in a particular area is important. It tells us how much we can trust our own views and judgement in that area, and whether we should be seeking advice from others who are better.

Unfortunately, results from hundreds of studies consistently show that we tend to overestimate our abilities. We rank ourselves better than average in a wide variety of areas.

Let's take a look at the questions above:
To Q1: 88% of American drivers described themselves as having above-average driving skills.

 

To Q2: Of 2 companies surveyed, 32% of software engineers at one and 42% at the other rated themselves in the top 5% of software engineers.
 

And to Q3: how financially literate are you? Do you know the listed terms? Well, did you notice that the last 3 terms don't exist? They are completely made up.

David Dunning and Justin Kruger studied the relationship between how much people know and how much they think they know, and described the Dunning-Kruger effect in 1999.
 

[Chart: the Dunning-Kruger effect, plotting confidence against competence]

The Dunning-Kruger effect shows that it is the folks with a little knowledge in an area who tend to vastly overestimate themselves. In fact, those lacking skill and knowledge in an area suffer a double curse:

1) they make mistakes and come to poor decisions

2) they lack the expertise to recognise the errors they are making, and assume they are right.


There are 2 things we can do to reduce the Dunning-Kruger effect:  
1) Ask for feedback, even if it might be painful 

2) Most importantly, keep learning! The more we learn, the more we are able to plug the holes in our knowledge that would have been invisible to us. 

These 2 points will come up again and again in these pages. To start, the video below gives a bit more info on the Dunning-Kruger effect.

[Video: more on the Dunning-Kruger effect]

So the adage that a little knowledge is a dangerous thing rings true. Together with confirmation bias and cognitive dissonance, the Dunning-Kruger effect leads us to form incorrect stories about what happens in our lives, yet be very confident that we are right.

But so what if we are unable to change our stories? 

In our next and final chapter, we examine the effects these stories have on our lives.

