February 3, 2021
The Way We Are: Cognitive Biases
“We do not see things as they are, we see them as we are.” Anaïs Nin said it best, and I say it at somewhat greater length in this talk about cognitive biases and heuristics.
“We do not see things as they are, we see them as we are.”
Anaïs Nin, 1961
Do you know where your opinions and beliefs come from? We all like to think they result from our experiences and from objective analysis of the information we collect throughout our lives. But in reality, every single one of us, regardless of education, field of work, or background, shares a vulnerability that stems from the simple fact that we are human.
Even as our most primitive ancestors walked the earth, the amount of information surrounding us was already overwhelming. Decisions needed to be made, and made quickly; our survival depended on it. Thus, biological selection allowed humans to evolve with shortcuts built into our thought processing. These shortcuts, also known as heuristics, allowed us to jump to conclusions quickly and react in situations where speed of reaction guaranteed survival better than accuracy did.
However, as our brains, our lives, and the world became more complicated, we didn’t move past these shortcuts, and what was once our strength became our vulnerability. They are known as cognitive biases, a term introduced by Amos Tversky and Daniel Kahneman in the early 1970s, and they are studied by cognitive scientists, psychologists, behavioral economists, and even marketers, both to understand our decision-making processes and to understand the catastrophic results these biases can yield for our own wellbeing.
Some of you might be very familiar with these cognitive biases; others might never have heard the term before. But surely, once I describe them, you will recognize the scenario even if you didn’t know the fancy name that goes along with it.
Cognitive biases can be as simple as the bandwagon effect or the authority bias, by which we are more likely to believe that what the majority or an authority figure, respectively, states is the correct answer, regardless of the objective facts.
The attribution error: what you do wrong is a flaw in your character; what I do wrong is because of circumstances. Or the Dunning-Kruger effect, which shows that the less you know about a subject, the more confident you tend to feel about it, and vice versa.
Confirmation bias is perhaps the best-known, textbook case of a cognitive bias: it is the tendency we all have to seek out information that confirms our existing beliefs and to give disproportionately little weight to information that contradicts them. If you’ve ever heard of the frequency illusion, this is very similar. The frequency illusion occurs when you buy a new car and suddenly see the same car everywhere, or when a pregnant woman suddenly notices other pregnant women all over the place.
The Gambler’s fallacy: let’s take a regular euro coin and, for the purposes of this example, say the face with the number is heads and the shield is tails. We toss it, and it lands on heads; we toss it again, heads again. Do you think the next toss is more likely to be heads again? Do you think it’s more likely to be tails? The fact is that both answers are wrong: the chance is still 50-50, but our shortcuts make us believe that previous events somehow influence the current one.
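You don’t have to take my word for it. Here is a minimal Python sketch (my own illustration, not part of the original speech) that simulates many coin tosses and checks how often a toss that follows two heads in a row comes up heads. The function name and parameters are hypothetical:

```python
import random

def heads_rate_after_streak(trials=100_000, streak=2, seed=42):
    """Estimate P(heads) on a toss that follows `streak` heads in a row."""
    rng = random.Random(seed)
    follow_ups = []   # results of tosses that come right after a streak
    run = 0           # current count of consecutive heads
    for _ in range(trials):
        toss_is_heads = rng.random() < 0.5
        if run >= streak:
            # This toss follows `streak` consecutive heads: record it.
            follow_ups.append(toss_is_heads)
        run = run + 1 if toss_is_heads else 0
    return sum(follow_ups) / len(follow_ups)

print(heads_rate_after_streak())  # prints a value close to 0.5
```

Each toss is independent, so the estimate stays near 0.5 no matter how long a streak precedes it; the coin has no memory, even when our intuition insists otherwise.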
There is, unfortunately, no cure for cognitive biases. In fact, there is even a bias, the bias blind spot, by which people believe they are less affected by cognitive biases than they really are. However, knowing about these biases and considering their effects on your own decision-making, and on that of the people around you, will help you account for them and perhaps even make better decisions.
We are not seeing the world as it is; we are seeing it as we are.
*This blog post was originally a speech delivered at Toastmasters speech contests in Northern Germany in 2017. Part of the meaning might be lost in the translation to text.