Getting Fast Thinkers to Slow Down
Talking students through how the brain works—its shortcuts and tendency to draw incorrect conclusions based on limited information—can help them study and learn better.
Try solving this simple problem with your students: A bat and a ball together cost $1.10. The bat costs a dollar more than the ball. How much does the ball cost?
Most will respond quickly that the ball must cost 10 cents. That’s incorrect. The ball costs 5 cents.
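To see why the intuitive answer fails, it can help to walk students through the quick algebra: if the ball costs x, the bat costs x + $1.00, so x + (x + $1.00) = $1.10, which gives 2x = $0.10 and x = $0.05. The reflexive answer of 10 cents would make the bat $1.10 and the total $1.20.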
“We know a significant fact about anyone who says that the ball costs 10 cents,” writes Nobel Prize–winning psychologist Daniel Kahneman in his seminal 2011 book, Thinking, Fast and Slow. “That person did not actively check whether the answer was correct.” We also know they relied on instinct to rapidly deliver an answer—what Kahneman calls fast thinking.
At any given moment, our brains encounter an immense number of stimuli to which we react intuitively—like deciding in an instant whether someone’s tone denotes anger or confusion, or quickly understanding simple sentences as we read. These moments of fast thinking are near-automatic thought processes that require very little effort.
This type of instinctive, reactive thinking is essential for survival, but an overreliance on fast thinking can also lead to errors and bias, Kahneman contends. Slow thinking, by contrast, is analytical and deliberate, like when students raise their hand instead of impulsively calling out an answer, or pause to solve a math problem that requires some degree of computational work, like 15 x 42.
But overconfidence has a tendency to creep in during learning, tricking students into believing that their first reflexive response is correct. Our brains have to work harder and expend more energy to think slowly, one of the main reasons that fast thinking can become a go-to cognitive reflex. In the rush to answer a question or solve a problem, fast thinking can lead students astray, causing them to reach conclusions based on incomplete information or make snap judgments influenced by cognitive bias.
Teaching students how the brain is wired for decision-making is an important part of creating conscious learners who can develop nuanced opinions and make smart, informed choices, says Renee Hobbs, a researcher and professor of communication studies at the University of Rhode Island. Educators can help students learn how to navigate the cognitive tricks our minds can play on us while instilling a sense of intellectual humility—an underemphasized life skill that combats overconfidence and encourages curiosity and vigilance when faced with new information or uncertainty. “That awareness propels intellectual curiosity while also leaving them with an appreciation for what they don’t know,” Hobbs says.
Here are four strategies to help students understand the benefits of slowing down their thinking:
Identify stealthy brain tricks: Despite having limited knowledge of a subject, kids often trick themselves into overestimating their mastery of concepts and ideas—a tendency known as the Dunning-Kruger effect. Students’ first instincts about what they do or don’t know are often wrong: guessing that the ball costs 10 cents without slowing down to check their work, for instance. The problem is that, on top of being incorrect, the mind tricks them into feeling confident that they’re right.
Before exams, Woo-Kyoung Ahn, a psychology professor at Yale University, directs her students to explain aloud concepts they’ve learned to a friend or family member, as if the other person has never encountered this information. That process of explanation can open students’ eyes to what they know and what they don’t, Ahn says.
“In an experiment, researchers asked, ‘Do you know how a toilet works, how a helicopter works?’ and so on. Of course, subjects can’t build one from scratch, but they know there’s a propeller at the top,” she explains. So the subjects respond affirmatively and provide a brief description. But when they’re next asked to write out a detailed explanation of how these things actually work, the exercise can be quite clarifying. “That’s enough for them to realize they didn’t quite understand what they thought they did,” Ahn says.
Before an exam, students may feel like they understand all the material because they skimmed their notes and highlighted key terms, but when they’re asked to articulate exactly what they claim to know, they come up short.
Making students aware of the Dunning-Kruger effect—in addition to creating more opportunities for students to actively demonstrate their learning and incorporating occasional checks for understanding throughout the unit—provides kids with a more nuanced view of what they know, as well as what they only sort of know or don’t know at all. This helps them identify the areas where they should slow down and focus, before and during test prep.
Connect to students’ own lives: Being able to recognize examples of cognitive bias in their own lives, Hobbs says, puts students on a path toward making better, more strategic choices.
Instead of providing examples from your own adult life, Hobbs suggests, have kids come up with examples of cognitive bias—digital amnesia, in-group favoritism, and authority bias, for example—“from their own lives and write those down as stories,” she says. “Those are going to be super-powerful for helping students build that awareness where they understand ‘Here’s how my brain is biased to work, and here’s how I experience that in daily life.’”
Middle school history teacher Jordan Mattox uses the Gulf of Tonkin incident to illustrate real-world confirmation bias, the propensity of people to seek out information or evidence that confirms their own opinions, beliefs, and values. “The lesson guides students through how President Johnson looked for a way to justify his invasion plans,” Mattox writes. “Students connect Johnson’s actions to modern examples of confirmation bias by looking at Russian involvement in the 2016 election through social media.” Students are then encouraged to reflect on how confirmation bias can, and has, affected them while using social media.
Instill healthy skepticism: Teaching students to exercise a degree of caution when approaching new information online—instead of accepting everything they read as fact—is important, explains Julie Coiro, an associate professor of education at the University of Rhode Island. To strengthen her students’ abilities to critically analyze and parse information during online research before jumping to quick conclusions, Coiro asks them to consider the following prompts:
- What is the purpose of this site?
- Who created the information at this site, and what is this person’s level of expertise?
- Where can I go to check the accuracy of this information?
- Why did this person or group put this information on the internet?
In addition to explaining the concept of echo chambers—environments where a person encounters only information or perspectives that mirror and support their own—and their power to perpetuate misinformation and warp people’s perspectives, middle school history teacher Chris Orlando suggests that students slow down and ask the following questions when consuming information:
- Does the source give only one perspective of an issue?
- Is that perspective primarily supported by rumor or partial evidence?
- Are facts ignored whenever they oppose that viewpoint?
Hold each other accountable: People—not just kids—have a tendency to forget where they learned something as soon as they learn it, which is known in psychology as source monitoring bias. “We stumble upon a piece of media content, take it in, and completely forget where we got it from,” Hobbs explains. “Then we come up with an idea and we think it’s our own, but we’ve just forgotten where we read it the first time.”
In our media-saturated culture, this has serious negative impacts, and in the classroom it can even lead to inadvertent plagiarism. While educators can recommend that students pay more attention to the source in the first place, that won’t solve the problem entirely.
To encourage slow thinking when someone shares a fact they’ve read, heard, or seen somewhere, Hobbs suggests that students ask the person, “Where did you hear that from?” in an attempt to connect the piece of information to its original source, instead of simply accepting it without question. There’s a big difference between something a student read in a New York Times article and something they picked up from a conspiracy theory video on YouTube.
“It’s important for students to learn that these cognitive biases aren’t things that we have to figure out on our own—we can help each other,” Hobbs says. “Rely on other people to help you slow down by asking you questions about the basis of your beliefs, or how and why you came to have certain attitudes and values.”