Digging Into Cognitive Biases:

Have you ever had a conversation with someone and known that the information they were stating was incorrect? Did you try to correct them by letting them know they had the ‘facts’ wrong? Maybe you even went online to fact-check from reliable, vetted sources. Instead of saying thanks, the person became upset and argumentative, claiming the facts you presented were wrong or the information was fake. Perhaps they then repeated the misinformation with even more gusto. It is a common frustration.


The adage “seeing is believing” overlooks the fact that our senses of sight and hearing can be tricked. That is why magic tricks, which rely on the power of illusion, can appear so real. Magic is entertaining because no one is harmed; the observer gets to believe something extraordinary happened without any real consequences. When it comes to misinformation, however, people can be harmed in a myriad of ways. That is why having access to the facts matters: it informs the choices we make for our daily lives and the world we live in.


Technology and advances in AI have changed the world and how information is transmitted. Today, information spreads at a rate previously unimaginable in human history, and it is incredibly easy to access the wealth of human knowledge. The problem is that it is also very easy to spread misinformation, and once people believe misinformation, it can be hard to change their minds. There are several reasons why misinformation can retain such a powerful hold on people.


Navigating all the information that comes at us takes a lot of mental and physical energy, so each person’s brain builds shortcuts to speed up how long activities take. This is particularly noticeable when first learning something new. When someone first learns how to walk, it takes a while to figure out which muscles to activate and how to coordinate movement. They are wobbly, their balance is off, and they are slow. With practice, walking becomes something a person does without really thinking about it, because the brain has forged the connections that enable the skills needed to walk with stability and speed.


The brain creates shortcuts for most of what we do, including connecting a series of thoughts formed through associations. This happens quickly through cognitive processes described by psychologists Amos Tversky and Daniel Kahneman. Their work brought awareness to cognitive biases: systematic errors in thinking that affect the way a person processes and interprets information. Biases affect our decisions, judgments, and actions. Typically, we see ourselves as objective, logical, and able to accurately evaluate all the information available to us. What we often don’t recognize are the biased thoughts we all have, shaped by emotions, motivations, upbringing, social pressures, and identity. It is not until someone with a differing cognitive bias shares their experience that we start to pay closer attention, because something seems off.


What are some signs of cognitive bias? Favoring news outlets that share your worldview; viewing others' success as luck while crediting your own accomplishments to skill; learning a little about a subject and thinking you fully understand it; assuming others share your beliefs. There are over 180 cognitive biases, which are captured in this Cognitive Bias Codex infographic. There are a lot of ways for biases to creep into our lives.


When people dig further into their beliefs, even after those beliefs are proven incorrect, several biases influence that choice. Confirmation bias leads us to seek information that supports our preconceptions while rejecting anything that conflicts with them. In sports, if a referee makes a call that favors our team, it is a good call; if the call favors the other team, it was a bad call. The referee’s call could be sound in both cases, but if we think the other team should lose, we become convinced they must have done something wrong, so it was clearly a bad call. Anchoring bias is the tendency to rely heavily on the first information we hear or see. If someone’s first exposure to a topic is misinformation, they are more likely to remember it and maintain the belief even when presented with the correct information, and it can take a lot to overcome this hurdle. Dan Kahan, a professor of law and psychology at Yale Law School, developed the concept of identity-protective cognition, in which a person is likely to dismiss evidence that does not reflect the dominant beliefs of their group. According to Kahan, beliefs and political views become “a badge of membership with identity-defining affinity groups,” with the main goal of the individual being to “protect one’s status within the affinity group.” Here, accuracy competes with identity: accepting evidence can affirm or threaten one’s sense of self as part of a group.


Considering these biases, it is easier to understand why people don’t change their minds when shown correct information: it comes down to how they interpret and evaluate the evidence. People are quick to accept information that confirms what they already believe and critical of evidence that contradicts their beliefs. When a person makes a mistake, they feel psychological pain; the same parts of the brain that respond to physical injury can be activated when psychological pain occurs. When people make a mistake, they often feel shame, and it hurts. Having our beliefs challenged can activate a fight-or-flight response, which is why some people get defensive and aggressive while others quickly end the conversation. To avoid that pain, it is easier to hold tight to beliefs that may be wrong than to go through the pain of accepting being wrong. When people dig into their beliefs, ignore the facts, and hold fast to misinformation, it is to protect themselves from psychological, physical, and social pain.


It takes courage and a willingness to sit in discomfort to let go of misinformation, much like the process of wading through ambivalence. When people enter a state of ambivalence, considering the pros and cons of multiple sides before believing or doing something, they develop a more balanced decision-making process. It turns out that ambivalence can reduce biases by tempering one-sided thinking and making people more open to diverse ways of thinking, which can lead to more accurate, informed choices. Cognitive biases can lead to distorted thinking. To reduce their influence, accept that everyone has them. Think about what is influencing your decisions and perspectives. Actively challenge your biases and admit to mistakes. Give yourself time to sit in ambivalence and learn from the process.