Information Literacy

In This Chapter

Learning Objectives

  • Understand how our existing knowledge and mental shortcuts are likely to affect our research process
  • Identify strategies for overcoming our cognitive biases

Summary

Cognitive biases are shortcuts that help us deal with the huge amount of information in our environments, but they can hurt our research processes. Confirmation bias means we are more likely to search for and notice information that confirms our existing beliefs. Motivated reasoning means we think more critically about information that challenges our beliefs than we do about information that confirms our beliefs. A mindset based on curiosity and openness will help us mitigate our cognitive biases and see that changing our minds based on new information isn’t the end of the world.

“The brain is a machine for jumping to conclusions.” ~Daniel Kahneman[1]

Every human is inundated with information every moment of their life. Not just the research information we talk about in this book, but all sorts of information. All the light that hits our eyes and tells us where things are in the room, every noise we hear, every clue from our environment. It is not possible to think critically about every piece of information that comes our way, and we don’t need or want to.

We have all developed mental shortcuts to deal with this information overload. Researcher Daniel Kahneman describes the situation this way: we have two systems for processing information. System 1 runs effortlessly and involuntarily, while System 2 takes more effort to use.

Let’s see if we can catch these systems at work. Try out this problem:

A bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball. How much does each one cost on its own?

Many people get this answer wrong because they use a mental shortcut without even realizing it. (The correct answer is worked out below.) To get to the right answer, we have to resist the tempting, easy answer offered by System 1 and engage the careful thinking of System 2.
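To see why, call the price of the ball b. The problem's two conditions then become an equation we can solve step by step:

b + (b + 1.00) = 1.10
2b = 0.10
b = 0.05

So the ball costs 5 cents and the bat costs $1.05. The tempting System 1 answer, a $1 bat and a 10-cent ball, fails the second condition: that bat would cost only 90 cents more than the ball.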

System 1 runs automatically; it is in charge most of the time, and it is the system that helps us make quick judgements about the vast majority of input coming into our brains. System 2 takes effort to use and is only activated occasionally, but it can do much more complicated things. The fact that we use System 1 most of the time isn't a problem; it's our brains working as they should, keeping us from expending energy needlessly on things that are unlikely to be important. We're not bad people because we jump to conclusions much of the time. But the effects of System 1 extend into our research activities in ways that can lead us to make mistakes.

Cognitive Biases

There are many different mental shortcuts our brains take when System 1 is activated; we call these cognitive biases. Cognition refers to a variety of thinking processes, such as attention, perception, memory, and reasoning, that we use to gain knowledge and understanding. “Cognitive biases” does not refer to biases toward a particular opinion; rather, it refers to built-in tendencies that all of our brains share as we try to make sense of the world. There are a lot of different cognitive biases, but here we will mention just two that are quite likely to impact our research activities.

Confirmation Bias

Confirmation bias relates to the information we look for, notice, and remember. It makes us more likely to notice information that confirms our existing opinions and beliefs. If we’re scanning a list of search results, the sources that support our existing beliefs are more likely to jump out at us; they are familiar and require less energy to process. Similarly, we are less likely to notice results that run counter to our existing ideas.

But the part of confirmation bias that is most likely to affect our research is how it shapes our choice of search terms. We can’t help but choose search terms based on what we are already thinking. And it’s not surprising that if you search for why Marvel is better than DC you will find info that supports this claim. Google (or your search engine of choice) tries its hardest to match the words in the search box with words on websites. The sources that most closely match your search terms are also very likely to share the opinion that was embedded in your search terms.

Similarly, if you search is lab testing on animals wrong, you get very different search results than you would if you search benefits of animal testing.

But even when we don’t type value words like better, wrong, or benefits into the search box, some words are more associated with one position than another and are likely to return results associated with only one position. For example, compare the results you get for the search illegal aliens voter fraud with those for immigrant voting rights. Both searches involve the concept of people born outside a country participating in elections, but the pictures you get from the results are very different.

Sometimes people exploit this property of search terms to steer you toward a particular conclusion. If someone online is telling you to “do your own research” while at the same time repeating words and phrases that are only used by proponents of one side of an issue, they don’t really want you to find a variety of perspectives, just the one associated with those terms.

Motivated Reasoning

Motivated reasoning relates to how we think about a piece of information: we readily accept information that supports our existing beliefs without much thought, but we expend more effort thinking critically about information that challenges our existing beliefs or behaviors. Examples of this are everywhere, but my favorite is a study that asked people to read and evaluate a research paper suggesting that drinking coffee has negative health effects. Coffee drinkers were much more likely than non-coffee drinkers to question the validity of the research.

This is a mental shortcut that saves a lot of time and mental energy. Many things we believe don’t need to be reexamined on a regular basis, and it would be a big energy drain if we reexamined them anyway. Generally, we all believe that brushing our teeth is a good idea, and there’s not much to be gained by critically examining yet another piece of evidence that says so. So we skip over thinking about information that matches up with what we already believe.

On the other hand, when those coffee drinkers were given information that suggested their habit might be bad for their health, their only choices were to accept the information (and either deal with an uncomfortable idea or try to break the habit), or find a reason to reject the information. The easier choice in that situation was to work a little harder to find a reason to discredit the information.

These coffee drinkers aren’t bad people, and they weren’t consciously deciding how critically to think about the research in front of them. These shortcuts are just part of being human, and there’s no need to attempt the impossible task of avoiding them entirely. In fact, it is very hard to notice motivated reasoning in ourselves precisely because it is an automatic process. But motivated reasoning does have big implications for research, because it tends to reinforce what we already believe.

What Can We Do About Cognitive Biases?

In the video below, Julia Galef mentions several things that can help us overcome our cognitive biases and improve our judgement.

[Video: Julia Galef on strategies for overcoming our cognitive biases and improving our judgement]

The first step is one you’ve already done: become aware that these cognitive biases exist. Just knowing that motivated reasoning is a thing won’t magically prevent your brain from taking this shortcut, but you can’t compensate for it until you know it exists. This is part of increasing your metacognitive awareness.

Be kind to yourself and others during the research process. Galef mentions the strong tendency in our society to criticize people who change their minds. We seem to have adopted the strange idea that revising your positions is a sign of weakness, that there is nothing worse than being wrong. We can change this perception by giving each other space to grow and revise our ideas and by expressing approval when people demonstrate a willingness to revise their opinions after careful consideration of new information. We don’t need to tie our self-worth or the worth of others to how right or wrong we or they are on a particular issue.

Galef also mentions that feelings play a big part in how we deal with new information. Notice your feelings as they come up during your research. In the chapter SIFTing Information, we talk about how having a strong positive or negative emotional reaction to a piece of information is an important cue to stop and check that piece of information. Matthew Inman, creator of The Oatmeal comic strip, has an entertaining assessment of our emotional reactions to new information (10-minute read):

A cartoon of two birds on a branch. One says to the other, “You’re not going to believe these things I tell you.”
George Washington’s Teeth, The Oatmeal, https://theoatmeal.com/comics/believe_clean

Be aware of your feelings as they come up, but don’t let them have the final say. As Galef suggests, try to cultivate feelings of curiosity, openness, and groundedness.

Another practice that can help us overcome our cognitive biases is actively seeking out disconfirming ideas. Researcher Sonke Ahrens[2] suggests that instead of asking whether a source will reinforce the position we hold, we try asking whether it is relevant to the topic. If you are someone who struggles to find something to write about, this shift in perspective can have an added benefit: working through the tension between the ideas you started with and the sources that challenge those ideas will give you things to write about.

“Do you yearn to defend your own beliefs?  Or do you yearn to see the world as clearly as you possibly can?” ~Julia Galef

 

Reflection & Discussion Question 1: Taking Stock of What You Already Know

Let’s consider what you already know about your wicked problem. You may be surprised at how little or how much you already know, but either way you will become more aware of your own background on the topic, and therefore more aware of what direction your cognitive biases might nudge you in.

Construct a chart using the following directions (a made-up sample appears after the list):

  • In the first column, list what you know about your topic.
  • In the second column, briefly explain how you know this. (Heard it from a friend or family member, read it in a book, saw it on a blog, etc.)
  • In the last column, rate your confidence in that knowledge on a scale of 1 (least confident) to 10 (most confident).
  • Look over your chart and compare columns 2 and 3. Select three rows and for each one, write a reflection on whether the source of information justifies your confidence level.
  • Underline any information that you think might need to be checked or that you would like to find additional sources on.
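For example, with an invented topic, a couple of rows might look like this:

What I know | How I know it | Confidence (1 to 10)
Recycling keeps most plastic out of landfills | Saw it in a social media post | 7
My city collects recycling every two weeks | Personal experience | 10

In the first sample row, a social media post probably doesn’t justify a confidence of 7; that is exactly the kind of mismatch the reflection step is meant to surface.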

Reflection & Discussion Question 2: Mindmapping

Confirmation bias can be hard to overcome because by definition it makes us less likely to even notice certain kinds of information. In this exercise we’ll try to find some of our blind spots by creating and sharing mindmaps.

  1. Create a mindmap of your wicked problem. Include as many different aspects of the problem as you can: different issues, possible solutions, challenges, anything that comes to mind.
  2. Trade papers with a partner. Review their mindmap for aspects of your wicked problem you may have overlooked and add any aspects that you thought of that aren’t there.
  3. Trade papers with another pair, and repeat the process.
  4. Get your original paper back and consider the additions made by your classmates. What was added that you hadn’t thought of? Did anything about the additions surprise you?

Reflection & Discussion Question 3: Loaded Search Terms

In the confirmation bias section above we saw examples of search terms that return a limited range of viewpoints. For most topics it is possible to find “loaded” search terms that have this effect.

  • What search terms related to your wicked problem are likely to return one particular viewpoint?
  • What neutral search terms can you think of related to your wicked problem? Are any search terms really neutral?
  • Should we always avoid loaded search terms, or are there times when they’re ok to use?

Reflection & Discussion Question 4: Thesis Statements

Have you ever had an assignment that required you to come up with a thesis statement before proceeding with your research?  It’s a common approach. Often what gets lost in these assignments is that the goal is to test the thesis, not justify it. But many of our mental shortcuts nudge us towards maintaining our existing positions.

  • Think about how you usually approach assignments. Are you more likely to start with a thesis statement or more likely to start with a question? What do you like about your approach?  What do you not like about it?
  • In the past how have you handled it when you encountered information that argued against your thesis statement?  Do you think you will handle it differently based on the ideas in this chapter?  Why or why not?
  • What kinds of assignments or instructions would make you more likely to be open to revising your original position or thesis statement?

Reflection & Discussion Question 5: Algorithms & Cognitive Biases

In the Introduction chapter we mentioned some ways that algorithms determine what information we see online.

  • What connections do you see between our cognitive biases and an information environment driven by algorithms?  In what ways do they work together?
  • What actions do you take or can you think of to break out of the “filter bubbles” and “echo chambers” we find ourselves in?

 


  1. Content and quotes in this section are from Daniel Kahneman's 2011 book, Thinking, Fast and Slow.
  2. I highly recommend Sonke Ahrens' 2017 book, How to Take Smart Notes.

License


Tackling Wicked Problems Copyright © by Members of the TWP Community is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
