Bias on the brain: A Yale psychologist examines common ‘thinking problems’

In her new book, “Thinking 101: How to Reason Better to Live Better,” Woo-kyoung Ahn explores so-called “reasoning fallacies” and how they affect our lives.

Woo-kyoung Ahn (Credit: Studio DUDA Photography)

The sometimes counterintuitive ways that our brains work can raise big questions. Why is it that we procrastinate, dragging our feet when we know we will regret it later? Why are misunderstandings and miscommunications so common? And why do people often turn a blind eye to evidence that contradicts their beliefs?

For Yale’s Woo-kyoung Ahn, the John Hay Whitney Professor of Psychology, a better understanding of these kinds of questions is crucial.

In her new book, “Thinking 101: How to Reason Better to Live Better,” Ahn explores the ins and outs of so-called “reasoning fallacies” — or, as she describes them, “thinking problems” — and how they affect our lives, from how we view time to why we stereotype and profile other people. Her work examines the inherent complexities in the science of how we think and ultimately offers strategies for overcoming these reasoning fallacies.

The book is based on lessons Ahn teaches in her undergraduate course “Thinking,” which has become one of Yale’s most popular courses.

In an interview with Yale News, Ahn discusses the nuances of these “thinking problems” and the steps we should take to reduce biases. The interview has been edited and condensed.

The book is based on your popular course, “Thinking.” What inspired you to create the class itself — and to condense a semester-long course into a book?

Woo-kyoung Ahn: I have been teaching for over 30 years, and I have always covered some of this material in various courses; I was also teaching a seminar and an upper-level psych course on thinking. But in 2016, I felt that it was time to disseminate this content more broadly for those who are not majoring in psychology.

The inspiration is this: it’s now quite well known that people commit a variety of reasoning fallacies, but they have been mostly discussed in the context of behavioral economics, or, to put it differently, psychology in the economic context. For example, people talked about the “negativity bias” — the bias to overweigh negative information compared to positive [information] — in terms of loss aversion in transactions or endowment effects. But I believe these irrational behaviors influence us in many other everyday situations. And I wanted to teach that rationality matters not just in dealing with money [and other material things] but much more broadly.

Another inspiration is that there has not been much discussion about what to do next once we notice that we commit thinking biases. Merely recognizing them isn’t enough; it’s like one can recognize that they have insomnia, but that’s not enough to cure insomnia. So, I’ve provided as many actionable strategies as possible.

Which concepts did you choose to highlight in the book — and how did you decide to focus on them?

Ahn: In the course I cover many topics, like creativity, moral judgments, and effects of language on thoughts, among many others — which I might, or might not, cover in a book at another time. But to be honest, I really have no good answer as to why these were chosen; I could have written a completely different book with the same title! One thing that I tried was not to cover too much in a single book. There are now websites with titles like “61 Cognitive Biases.” I just don’t want to overwhelm the readers. I also cover more technical issues in the course, [like] models of causal learning or mathematical proofs for irrational choices, which I don’t think were needed for a book written for the general public.

One valuable insight that became a running theme for the book is that these thinking “problems” aren’t really about what is wrong with us. I’ve mentioned this issue in my course at times, but I became more convinced about it while writing the book.

How would you define or diagnose these “thinking problems”?

Ahn: The kind of bad thinking that I care most about is unfair thinking. We can be unfair to ourselves and also to others when we are inconsistent, biased, overconfident, or underconfident.

For example, I don’t think the well-known confirmation bias is necessarily a thinking problem. 

It refers to a tendency to confirm what we hypothesize or what we believe. That is, we tend to search for evidence that supports what we believe or interpret evidence to fit with what we know. That sounds pretty bad, but it is actually a quite adaptive mechanism. For instance, you go to a Stop & Shop three miles away from your house, and you find that their apples are good. Next time you want apples, you could try a different supermarket, but if your goal is just to get good apples, you might as well go to the same supermarket again. That is, when the goal is to survive, it’s a better bet to continue with what you know than to explore other possibilities.

The confirmation bias becomes a thinking problem when it makes us draw conclusions that are unfair to ourselves or others. For example, let’s say a CEO of a company hires only white people for their top executive positions. They all do a reasonably good job, so now the CEO believes that race matters and continues hiring only white people. However, this CEO hasn’t even checked what would have happened if non-white people had been hired for those positions.

The confirmation bias can even hurt those who commit it. An experiment I conducted illustrates this. In the experiment, participants took a saliva test. Half the participants were told that the test results indicated they had elevated genetic risk for major depressive disorder, and the other half were told that they did not. Then, we asked all participants about the symptoms of depression they had experienced in the past two weeks. These participants were randomly assigned to one of the two conditions, so there’s no reason why one group would have been more depressed over the previous two weeks than the other.

But those who learned that they had genetic risks for major depression reported that they were significantly more depressed, and their average score was higher than what would be clinically considered mild depression.

One recurring theme of the book is how we might better communicate ideas, especially when there is evidence that using metrics — like data, statistics, et cetera — may be ineffective. How do you think public messaging campaigns can best incorporate some of these ideas and concepts?

Ahn: I think public messaging campaigns should acknowledge how we’re wired. For instance, when a charity organization asks for donations, they shouldn’t use just statistics or abstract data, but also an anecdote of a person who suffers from the issues that they are concerned about. In my recent study, we presented highly disgusting pictures of COVID-19 to the participants: pictures of COVID toes, burials of those who died of COVID, et cetera. As we know, politically conservative people were generally less willing to comply with the [Centers for Disease Control and Prevention] guidelines than politically liberal people, at least at the time of the study. However, when presented with these vivid examples of those who suffer from COVID-19, they became much more willing to comply with the guidelines.

You seem to be a big proponent of promoting greater dialogue between people, in that doing so will help combat many of these cognitive biases we have.

Ahn: Again, I tried to underscore that those who commit cognitive biases are not bad people; these errors are part of our highly adaptive cognitive mechanisms. Let’s return to confirmation bias again to illustrate this one more time. My favorite example is what happened when my son was four years old. He asked me why a yellow traffic light is called a “yellow” light. I didn’t understand the question but was patient enough to tell him that it’s called a yellow light because it’s yellow. Then he told me it’s [not yellow, but] orange. I said no way. He insisted that I just look at it, so I did. And it is orange.

I had no ulterior motive to call it a yellow light or to interpret the amber color as being yellow, but because everybody called it a yellow light, I saw it as yellow all my life until he pointed it out. What I committed was confirmation bias; I interpreted the color of a traffic light in light of what I already believed. We do this every moment of our lives. Thus, we shouldn’t think that those who disagree with our views are a different kind of people; they are just seeing the world from their own point of view.

Of course, there are people with ulterior motives and self-righteous people, but if we are to start a dialogue, we should first recognize [the biases] we all share.

Also, because these biases are essential components of how we survive, they are not easy to counteract. So, sometimes, I do present actionable strategies — don’t guess what others might like, ask! — but in some cases, we may just need to try to focus on solving problems at hand rather than trying to change others.
