Guided Learning
Introduction
The human mind is the result of millions of years of evolution. For the most part, evolution has set us up for success in a constantly changing world. But what was advantageous for us in the past is not always advantageous today; unfortunately, our brains don’t seem to have realized this yet!
Discover some of the common cognitive mistakes our brains have evolved to make in the modules below. Do these biases impose costs on you – costs to your finances, health, or happiness?
The culprit is our faith in our own intuitions over the facts. No matter our personality or profession, we can all agree that it’s smart to be aware of the quirks of human cognition so that we can outsmart them.
Module 1 of 3
“The DNA is a Match”: Confirmation Bias
Podcast
We have access to more information than ever before, but are we making better decisions? Learn how a pervasive bias skews how we interpret the data in front of us.
Transcript
NARRATOR: What would you think if you watched a forensic expert get on the witness stand and say 5 little words:
CLIP: “The DNA is a match”?
NARRATOR: The first thought that went through my mind? Got ‘em.
DNA is the gold standard of forensic evidence; indisputable, right? I mean, we’re talking about the stuff we send out to the lab to find out if we’re 0.2% Italian (or in my neighbor’s case: 2% Neanderthal)! But in 2011, neuroscientist Itiel Dror and biologist Greg Hampikian teamed up to show that even DNA analysis can be vulnerable to our blind spots.
Today, we’re talking about one of the strongest biases out there: confirmation bias. Welcome to Outsmarting Implicit Bias.
Imagine this: you’re a forensic analyst on a criminal case.
CLIP: “Fingerprint check on Panacek.”
NARRATOR: The prosecutors tell you they have testimony from one of the suspects…
CLIP: “I will give you immunity if you testify.”
NARRATOR: …and he’s revealed everyone else who was involved in the crime. The only snag? Unless you can place at least one of the other people at the scene, the judge is probably throwing this testimony out.
So: those are the stakes. You have the DNA mixture from the crime (that means it includes more than one person’s DNA). And you have DNA from the prime suspects. So. Is there a match?
These are details from a real criminal case. And the real analysis team… did find a match! Suspect 3 was there. “Got ‘em”, right?
Not exactly. Years later, Dror and Hampikian sent the same DNA to 17 new experts: Did they see Suspect 3’s DNA in the mixture?
16 out of 17 experts said no.
This is the kind of situation where I think it’s fair to ask “What the heck happened?”
16 “nos” seems like the opposite of a “gotcha” moment. And the original analysis team was qualified and competent — how did they see something so different?
The answer: confirmation bias. Our tendency to search for or interpret new information in a way that supports what we already believe.
You see, the one difference between the original analysts and the new ones was what they thought was true. The original team knew that there was incriminating testimony; they knew that without a DNA match, the accused could go free. The new analysts didn’t. So they were looking at the data with untainted eyes.
And while DNA analyses are basically error-proof when it comes to single samples, when it comes to DNA mixtures, there’s a lot more ambiguity. Is that really an allele, or just a blip in the data? Is there really enough information to make a conclusion, or is the sample too degraded or complex to even try?
CLIP: “The blood and DNA tests came back inconclusive.”
NARRATOR: These are the kinds of decision points where bias can come in – even in honest people who are trying to be accurate!
Confirmation biases can affect all of us… and what makes them so hard to outsmart is that we think we’re finding data that proves us right! For instance: let’s say you just know that a colleague didn’t deserve that promotion. You see him make a little mistake and aha! Proof, right? …even if it’s a slip you would have forgiven in anyone else.
What’s more, researchers at Stony Brook University found that when our opinions are challenged – for instance, if you have to read both sides of a debate about gun control or affirmative action – our original beliefs only get stronger after hearing the opposite side. This is bizarre!
And it happens because we pick and choose. We focus on the strengths of the arguments we like, and the flaws of the arguments we don’t. And that leaves us feeling even more firm in our convictions.
There are so many ways the confirmation bias can affect us: Maybe you want to invest in a new business venture, and that’s making you downplay the risks. Maybe you’re completely smitten by a new crush, so you miss seeing all the red flags. All this tells us that if we want to outsmart our minds, we have to learn how to switch from confirming an idea to proving it.
Here are a few tips:
The desire to confirm can give us tunnel vision, but hindsight is 20/20. So get used to conducting premortems. That is: imagine that the unthinkable HAS happened.
CLIP: “Breaking headlines…”
NARRATOR: The company did go under.
CLIP: “…filing for bankruptcy…”
NARRATOR: The relationship did implode.
CLIP: “…the two reportedly broke up…”
NARRATOR: What killed it? What signs did you miss? Working backwards like this pushes you to find the clues that fell into your blind spot.
Another solution for the hypochondriacs among us: rethink what you’re typing into that search bar. “Do I have cancer?” is only going to confirm your worst fears, not give you the most likely answers. Instead, plug in your symptoms (or better yet) consult a professional.
Finally, reframe your goals. Don’t make half-hearted pro-con lists that just tell you what you want to hear. Whether it’s a personal decision or an analysis for a Supreme Court case, make sure everyone knows “success” means making the best decision, and not just the one you think is right.
Outsmarting Implicit Bias is a project founded by Mahzarin Banaji, devoted to improving decision-making using insights from psychological science. The team includes Olivia Kang, Evan Younger, Kirsten Morehouse, and Mahzarin Banaji. Research Assistants include Moshe Poliak and Megan Burns. Music by Miracles of Modern Science. Support comes from Harvard University, PwC, and Johnson & Johnson. For references and related materials, visit outsmartinghumanminds.org.
Highlights
Key takeaways from this module
Confirmation bias emerges from our tendency to pay attention to the data that proves us right rather than data that can prove us wrong.
The bias can lead to costly mistakes like missing the risks of a new venture or a red flag in a new relationship.
Combat confirmation bias by actively seeking data that proves (not just confirms!) your opinion while also explicitly seeking information that disproves it.
Try conducting a mental pre-mortem: work through the worst possible outcome, giving yourself the chance to encounter the potential downsides to a particular choice.
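To make those takeaways concrete, here is a minimal toy sketch in Python (the numbers are made up purely for illustration; this is not part of the module’s materials) of how quietly discounting disconfirming evidence can flip a conclusion:

    # Toy illustration of confirmation bias (made-up numbers, not real data).
    # Each observation is evidence for (+1) or against (-1) a belief we already hold.
    evidence = [+1, -1, +1, -1, +1, -1, -1, +1, -1, -1]   # the full record is mixed, slightly negative

    def unbiased_view(observations):
        # Weigh every observation equally.
        return sum(observations) / len(observations)

    def confirming_view(observations, discount=0.25):
        # Count confirming evidence in full, but quietly discount disconfirming evidence.
        weighted = [x if x > 0 else x * discount for x in observations]
        return sum(weighted) / len(weighted)

    print(f"unbiased reading:   {unbiased_view(evidence):+.2f}")    # -0.20
    print(f"confirming reading: {confirming_view(evidence):+.2f}")  # +0.25

The full record leans slightly negative, but the reader who down-weights inconvenient observations walks away feeling confirmed.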
Test your knowledge
Module 2 of 3
What Sentence Would You Give?: Watch for the Anchoring Bias
Video
Two people can receive very different sentences for committing identical crimes. Learn about one hidden bias that can distort justice.
Transcript
NARRATOR: We’ve all heard of cases where two people receive very different sentences for committing identical crimes. It’s perhaps one of life’s greatest injustices, and it erodes our faith in a system that purports to be blind.
Now, there are many reasons why this might happen. One of them is a cognitive bias called the anchoring effect.
Here’s the study: psychologists Birte Englich and Thomas Mussweiler gave 19 judges (real judges, with robes, and courtrooms) information about a felony case and asked: “If you were the judge here, what sentence would you give?”
Everyone got the same facts: details about the crime, relevant testimony, the penal code.
There was only 1 difference: the sentencing recommendation. Half the judges were told that the prosecutor was recommending a 2-month jail sentence; the other half were told the recommendation was for 34 months. “2” versus “34”. That’s it, the only difference. But this number stuck.
Even when all the other facts were identical, judges who saw “2 months” suggested an average prison sentence of 19 months. The judges who saw “34 months” suggested 29 months – that’s nearly a year longer in prison.
And if you’re thinking: Wait, the prosecutor’s an expert on the case. Doesn’t it make sense for judges to listen to their suggestion?
The researchers wondered that too. After all, judges are literal pros at being unbiased. They’re trained to take in facts objectively, to value expert opinion.
So the researchers ran more tests. What would happen if the recommendation now came from a computer science student – a freshman in college?
Or: What if judges just threw dice (secretly rigged to land on either a high number or a low number) right before making their decision?
In both of these experiments, the pattern stayed the same. Seeing a higher number (regardless of where it came from) led to a higher sentence at decision-time.
Clearly, there’s a mental blindspot at work. Critical decisions are being affected by numbers that shouldn’t matter!
So. What is this blindspot? It was named by Nobel Prize-winning psychologist Daniel Kahneman and his collaborator Amos Tversky: the anchoring bias.
Simply put: when we make decisions, we rely too much on the last piece of information we got. Think about that sentencing recommendation (or the dice) as an anchor: our minds put it down, much like a boat does. And then, just like that boat, our next moves, our decisions sort of float around that anchor… but they can’t get too far.
And if anchors can snag judges, they can snag us all. Research shows anchoring affects real estate agents’ estimates of property values, doctors’ medical diagnoses, and consumers’ purchases of everything from wine and chocolate to cars and stocks.
So, what can we do?
When you have to make a decision, consult broadly so that you’re not anchored on a single number. Check multiple sources, markets, companies, cases, to home in on a range that’s right.
And if you don’t have the luxury of time? How do you stop an absurd number from snagging your mind? Here’s a negotiation tip from Daniel Kahneman: “Make…. A… Scene”. Go ahead! Shake your head; throw your hands down; say “I will not consider that number!” to wipe the slate clean. Even if you’re just alone in chambers!
Our minds will always be vulnerable to anchors. But by recognizing their influence and staying vigilant, we can loosen their hold. That’s what outsmarting our minds is all about.
Highlights
Key takeaways from this module
Prior information can act as an “anchor,” and our judgments tend to stay close to the anchor even though it should be ignored.
The anchoring bias affects many decisions we make, from doctors’ diagnoses to consumer purchasing decisions.
To "shake off" these anchors: (1) Consult broadly with others; use multiple, diverse sources of information. Increasing the number of anchors may dilute the effect of any single one.
(2) Start each new decision by wiping the mental blackboard clean; you can even do this physically by creating a blank screen or using a new sheet of paper.
(3) “Make a scene”: Tell yourself (out loud!) that you won’t consider the anchoring number.
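As a purely illustrative sketch of the anchoring-and-adjustment idea (the toy model and its parameters are assumptions chosen to roughly match the averages reported in the module, not the researchers’ actual analysis), the 2-month versus 34-month pattern can be mimicked in a few lines of Python:

    # Toy anchoring-and-adjustment model (an illustration, not the study's analysis).
    # Idea: a judge starts from an unanchored view of the case and adjusts toward the
    # recommended number, but only partially -- so the final sentence drifts toward
    # whatever figure was mentioned, relevant or not.
    def anchored_sentence(unanchored_months, anchor_months, adjustment=0.31):
        """Sentence (in months) after partial adjustment toward the anchor."""
        return unanchored_months + adjustment * (anchor_months - unanchored_months)

    baseline = 27  # hypothetical unanchored judgment for this case, in months

    print(f"after a 2-month recommendation:  {anchored_sentence(baseline, 2):.0f} months")   # ~19
    print(f"after a 34-month recommendation: {anchored_sentence(baseline, 34):.0f} months")  # ~29

The point of the sketch is only that partial adjustment toward an arbitrary starting number is enough to pull otherwise identical judgments apart by months.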
Module 3 of 3
The Availability Bias
Podcast
What’s more likely: death by shark attack or by lightning strike? Most people choose the wrong answer because of this common cognitive bias.
Transcript
MAHZARIN BANAJI: So, let me start by giving you an example of a very simple experiment.
NARRATOR: Welcome to Outsmarting Implicit Bias. You’re listening now to Professor Mahzarin Banaji.
MAHZARIN BANAJI: It was done at the University of Michigan a long time ago now, and here’s what they did: They said to half the people in their experiment: “Please write down 3 reasons for why it is that you love your partner,” okay? The other group is told something very similar: write down for us 9 reasons for why you love your partner …
Imagine doing this, and then at the end of this very simple exercise, you’re asked a few questions: How happy are you in this relationship? How much do you love this person? … And so on as a measure of the intensity and the depth of your love.
The data show that those of you who wrote down 3 reasons for your love ended up expressing much greater happiness and satisfaction with your relationship than those of you poor people on this side who wrote down 9 reasons for your love. Why should this be?
Well, the answer is actually kind of simple from the point of view of this 3-pound organ between your ears. The answer is: Who has 9 good qualities? This is not an easy thing to do at all!
And so, think about what your brain did. For you, 3 was easy to write down. But your brain wasn’t just writing down those three reasons as asked to do; it’s computing a million other things. Among them: how easy is this?
And at the end of the writing down of the three, if it felt easy to you, that actually influences how much happiness you express. But you guys on this side, think about your job. By the time you get to #6, your hand is slowing down, and your brain notices that. And you know that you just made up 7, 8, and 9 – they’re not even true! And it didn’t let go of that, it said this was hard, maybe I’m not so happy, maybe we should see a marriage counselor.
NARRATOR: This is an example of the availability bias, first discovered by psychologists Amos Tversky and Daniel Kahneman. It’s a blindspot that tricks us into thinking that the more easily something comes to mind, the more true or frequent it must be.
You just heard an example of how this bias can warp our perceptions of truth: how happy we are in our relationships. But it doesn’t end there: the same researchers (led by Norbert Schwarz) showed that the availability bias can even change how we view ourselves. Many of us think we have a good grasp of who we are: whether we’re confident, kind, happy, ambitious. But challenge yourself to come up with 12 examples of times you behaved assertively, for example. The science predicts you’ll believe you’re less assertive than if you only had to come up with 6. Strange but true.
Now apply this idea of “availability” to how often you think things happen in the world. For example: what’s more likely? Death by shark attack or death by lightning strike? The science suggests you’ll choose “shark attacks”. It’s the wrong answer… but it makes sense. Shark attacks get more media coverage; they’re the subject of blockbuster movies. So these examples stick and come to mind more easily. And that makes our brains think “Must happen a lot”.
Now, how does this blurred line between what’s true and what’s available affect us at work?
MAHZARIN BANAJI: Several years ago, I was listening to the senior-most members of an organization discuss a problem with dread in their voices. This is what they said: we hire women in their … 20s, but then (and their voices lowered as they spoke), the biological clock, you know…
NARRATOR: The question the company was facing was: how could they navigate these replacement costs, and was it worth the investment? But then the question became moot.
MAHZARIN BANAJI: I was looking at their data, and to my great surprise, men and women were actually leaving at the same rate. And the first reason both groups gave for leaving? To work for a competitor. The leaders were shocked. How did the truth get so twisted?
MAHZARIN BANAJI: The availability bias.
NARRATOR: This belief is a common one: women leaving to raise children is something we hear about, something that managers might worry about… so when it does happen, it sticks. And then… you know the rest. Clearly this mental shortcut doesn’t always lead us to the right answer. So how do we outsmart it?
MAHZARIN BANAJI: The solution is so simple I’m embarrassed to even propose it to you: look at the data that are right in front of you!
NARRATOR: And this applies to everyone. A recent global survey by the ICEDR of hundreds of millennials found that men and women around 30 leave work at the same rates, for the same reasons. The data are there. They can help us to sift what’s true from what’s simply available. We just have to look.
Outsmarting Implicit Bias is a project founded by Mahzarin Banaji, devoted to improving decision-making using insights from psychological science. The team includes Olivia Kang, Kirsten Morehouse, Evan Younger, and Mahzarin Banaji. Special thanks to Miracles of Modern Science for music. Support comes from Harvard University, PwC, and Johnson & Johnson.
Highlights
Key takeaways from this module
Availability bias is the tendency to mistakenly use the ease with which information comes to our mind as a proxy for the frequency or truth of that information.
Planes may seem more dangerous than cars because plane crashes make national news, but planes are actually much safer!
To outsmart the availability bias, make sure your decisions are based on actual data rather than what comes most easily to mind.
Fact check yourself: actively search for data and double-check your assumptions with the numbers in front of you.
When selecting someone for an important new responsibility or an award, look at (and read!) the entire list of possible candidates.
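A minimal Python sketch of that “fact check yourself” habit (the counts below are placeholders, to be replaced with figures from an official source before drawing any conclusions):

    # Fact-check intuition against data: flag causes whose gut-feel rank differs from
    # their rank in the actual counts. The counts are placeholders -- replace them with
    # figures from an official source.
    perceived_rank = ["shark attack", "plane crash", "car crash", "lightning strike"]  # most to least likely, by gut
    annual_counts  = {"shark attack": 1, "plane crash": 350, "car crash": 38_000, "lightning strike": 20}

    actual_rank = sorted(annual_counts, key=annual_counts.get, reverse=True)
    for cause in perceived_rank:
        gut, data = perceived_rank.index(cause) + 1, actual_rank.index(cause) + 1
        if gut != data:
            print(f"Check your assumption about '{cause}': gut rank {gut}, data rank {data}")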
Test your knowledge
Consider this experiment: people were asked to write down a random two-digit number on a blank sheet of paper. Then, they were asked to estimate the price of an object (such as a cordless keyboard). The experimenter found a positive correlation between the two numbers – the larger the random 2-digit number, the larger the estimated price of the keyboard.
What cognitive bias best explains this result?
What You've Learned
Our brains, which are capable of so many astonishing mental feats, are also prone to mistakes. The shortcuts we use to process information served us well when the main challenge of our species was surviving another day; those same shortcuts can lead us to irrational decisions in today’s information-heavy and globally connected society.
Some examples of the biases that lead us astray and cost us money, health, and happiness are:
- Confirmation bias – we seek confirming evidence and avoid disconfirming evidence
- Anchoring bias – we anchor ourselves to an initial (and sometimes irrelevant) piece of information
- Availability bias – the information that comes to mind fastest seems the most true
At the end of the day, our habits of thought are products of millions of years of evolution and will likely lead to biases. But we can outsmart them to make better decisions. The first step is to be able to detect and name our biases.
Reflections
Question1
Salary ranges are routinely mentioned during the hiring process. Think about how anchoring bias could affect where you end up settling on a salary for yourself based on these “anchors”. What is a good way for you to combat this bias so that you see what a fair salary might be? Are there other cognitive biases that may influence what salary you get?
What you write in this box is just for you. You will have a chance to download your response, but it will not be stored on our servers.
Your response:
Our thoughts:
One way to remove the impact of the anchoring bias in salary negotiations is to collect data on what other people in the company received as a starting salary. You could also check what the average starting salary in your field is at other companies. Go in knowing the full range of possible salaries.
With this knowledge in mind, you will be better able to notice if you are being lowballed. Remember: a starting salary is an important predictor of where you will end up by the time you retire, so it’s in your best interest to do what you can to start at the highest possible salary.
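As a small illustration of that data-gathering step, here is a Python sketch (the offers listed are hypothetical) that turns a handful of comparable starting salaries into a range to negotiate against:

    # Summarize comparable starting salaries so the first number mentioned in a
    # negotiation doesn't become your anchor. The figures are hypothetical examples.
    comparable_offers = sorted([68_000, 72_000, 75_000, 79_000, 83_000, 90_000])

    n = len(comparable_offers)
    median = (comparable_offers[(n - 1) // 2] + comparable_offers[n // 2]) / 2
    print(f"range:  {comparable_offers[0]:,} to {comparable_offers[-1]:,}")
    print(f"median: {median:,.0f}")
    # Walk in knowing this range, and judge any offer against it -- not against the
    # first number the other side puts on the table.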
Congratulations! You have completed Unit 1: Common cognitive biases.
Dive deeper
Extra materials if you want to learn more
Have you ever taken a personality test such as the Myers-Briggs Type Indicator? You answer a list of questions, and in the end you get four letters that seem to describe you to a “T”. Here’s the problem: these letters apply to lots of people. But like weekly astrological predictions, when we read these general descriptions confirming our own personality or experiences, we don’t realize that they would seem equally real to everybody. And as we keep reading, we keep confirming – ignoring imprecisions and delighting in the accuracies. Personality tests are fine when they’re just a fun activity, but we can run into issues when we use them as a hiring and promotion tool. Read more at the New York Times.
The sequence 2, 4, 8 follows a rule. Can you figure out what it is? Try to outsmart the confirmation bias with this demo from the New York Times (a toy version of this kind of rule-guessing game appears in the sketch at the end of this list).
“We all have a few people in our social media networks who share ridiculous things [… b]ut most of us have also encountered well-informed, sane people who share articles that are blatantly incorrect propaganda. Why does this happen?” Read more in Jeff Stibel’s “Fake news: How our brains lead us into echo chambers that promote racism and sexism” (USA Today).
Warren Buffett is one of the most successful investors in history. His secret? He “acknowledges that even his decisions could be swayed by [confirmation bias] – an important first step – and then gives voice to opinions that contradict his own.” Read more about how to think (and invest) like Buffett from Forbes.
“It’s easy to assume that presenting factual information will automatically change people’s minds, but messages can have complex, frustrating persuasive effects […] It turned out that climate change skeptics – whether politically conservative or liberal – showed more resistance to the stories that mentioned climate change. Climate change themes also made skeptics more likely to downplay the severity of the disasters. At the same time, the same articles made people who accept climate change perceive the hazards as more severe.” Read more from Professor Ryan Weber’s “Extreme weather news may not change climate change skeptics’ minds” at The Conversation.
Should your social security number influence how much you’d pay for a bottle of wine? Of course not! But behavioral economist Dan Ariely shows otherwise in a study where he auctioned off everyday items to students. Watch Dan’s explanation of the study here to learn how the anchoring bias can influence consumer decisions.
Often, the anchoring effect is discussed in the context of numbers: in our video, we saw how judges’ sentencing decisions could be influenced by even random numbers (like a roll of the dice). But we can anchor on ideas too. The podcast DDx explores what this can mean for doctors in their episode “Anchoring bias and the frequent flyer” featuring Dr. Gita Pensa.
The San Francisco Police Department will now limit their release of mug shots to cases that require public assistance or public warning. Why this shift? Every person who is arrested has their photo taken during the booking process. When released, these photos often remain in the public domain… even when charges are dropped or the individual is found not guilty. This can lead to a type of availability bias called an illusory correlation.
Vaccines have eradicated diseases. No scientific data shows a connection between vaccination and autism. So why is fear of autism motivating some parents’ decisions not to vaccinate their children? Read Sarah Watt’s article “5 cognitive biases that explain why people still don’t vaccinate” at Forbes.
There’s a common belief that women in their 30s stop working in order to start families. However, a global survey by the ICEDR found that men and women around 30 were leaving work at the same rates, and for the same reasons: to work for a competitor. Read more here.
Dr. Harrison Alter suspected pneumonia when Blanche Begaye arrived at the emergency room with breathing trouble even though tests didn’t reveal the typical symptoms. It wasn’t until Begaye was admitted that Alter realized his mistake: she was suffering from aspirin toxicity, not pneumonia at all! The culprit behind his mistake? Availability bias. Learn more at The New Yorker.
Want more examples of the Availability Bias in action? Business Insider sifts truth from availability here (featuring Professor Adam Grant).
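Finally, for the 2, 4, 8 puzzle mentioned above, here is a toy Python version of that kind of rule-guessing game. The hidden rule below is an assumption made for this sketch (it is not necessarily the rule in the New York Times demo); the point is that testing only sequences that fit your hypothesis can never falsify it:

    # Toy rule-guessing game in the spirit of the 2, 4, 8 puzzle.
    # Hidden rule (an assumption made for this sketch): the numbers simply increase.
    def hidden_rule(seq):
        return all(a < b for a, b in zip(seq, seq[1:]))

    # Your hypothesis after seeing 2, 4, 8: "each number doubles".
    confirming_tests    = [(3, 6, 12), (5, 10, 20), (1, 2, 4)]   # all fit the doubling idea
    disconfirming_tests = [(1, 2, 3), (2, 4, 7), (8, 4, 2)]      # deliberately break it

    print("Tests that fit my hypothesis:")
    for seq in confirming_tests:
        print(f"  {seq} -> {hidden_rule(seq)}")   # every one passes, so doubling *feels* confirmed

    print("Tests that could falsify it:")
    for seq in disconfirming_tests:
        print(f"  {seq} -> {hidden_rule(seq)}")   # some non-doubling sequences pass too

Every “confirming” test passes, so the doubling hypothesis feels proven – yet the disconfirming tests reveal that plenty of non-doubling sequences satisfy the rule, which is exactly what a confirming-only strategy never discovers.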