Think For Yourself

Fallacies in Your Own Thinking

Ages 10–14 · 25 min read · Intermediate

Here's the uncomfortable truth: the most dangerous fallacies aren't the ones other people use on you. They're the ones you use on yourself — without realising it.

We all have biases that warp our thinking. Recognising them in yourself is the hardest and most valuable form of critical thinking.

The Biases You Don't Know You Have

Confirmation Bias

The big one. You naturally seek information that confirms what you already believe and ignore or dismiss information that contradicts it. If you think vaccines are dangerous, you'll notice every negative story and dismiss every positive study.

Everyone has this. Everyone. The only defence is to actively seek out information that challenges your beliefs.

The Dunning-Kruger Effect

People who know very little about a subject tend to overestimate how much they know — partly because they lack the knowledge needed to recognise their own mistakes. People who know a lot tend to underestimate their relative expertise. The result: the loudest voice in the room is often the least informed.

Availability Bias

You judge the likelihood of events based on how easily you can recall examples. Plane crashes are memorable, so people fear flying — even though driving is 100 times more dangerous per kilometre (BITRE).

Sunk Cost Fallacy

"I've already watched 2 hours of this terrible movie, so I should finish it." The time already spent is gone — it shouldn't influence whether you continue. Yet we keep investing in failing decisions because of what we've already put in.

The Backfire Effect

When confronted with evidence that contradicts a deeply held belief, some people end up holding their original position even more strongly. Challenging someone's core beliefs can make them dig in deeper rather than reconsider.

What To Do About It

  1. Actively seek disagreement. Read sources you disagree with. Follow people with different views.
  2. Ask: "What would change my mind?" If the answer is "nothing," you're not thinking — you're believing.
  3. Be comfortable saying "I don't know." It's one of the most honest and underrated phrases.
  4. Check your confidence. Are you sure because of evidence, or because of feeling?

Tonight's Question

"What's something you believe strongly? Now honestly ask: what evidence would change your mind? If the answer is 'nothing,' what does that tell you?"

This is genuinely hard. But it's the heart of critical thinking.

The Belief Challenge

  1. Each person writes down one strong belief (about anything — food, politics, school, etc.).
  2. Now each person must research the STRONGEST argument AGAINST their own belief.
  3. Present the counter-argument to the family.
  4. Discuss: did researching the other side change your view at all? Even slightly?
  5. It's okay if it didn't — but the practice of looking is what matters.

Go Further

  • Test: Take the "Implicit Association Test" at implicit.harvard.edu — discover biases you didn't know you had.
  • Book: You Are Not So Smart by David McRaney (2011) — entertaining guide to self-deception.
  • Practice: For one week, before sharing any opinion, ask yourself: "What is this based on? Have I checked?"
  • Question: Is it possible to be completely unbiased? Or is the goal to be aware of your biases?

What We Simplified

  • The backfire effect may be less common than initially thought. Recent research suggests most people DO update their beliefs when presented with strong evidence. The backfire effect is real but not universal.
  • Confirmation bias has positive aspects. It helps us maintain coherent worldviews and make quick decisions. The issue is when it prevents learning.
  • Self-awareness has limits. You can't fully eliminate your biases through awareness alone. Systems and institutions (peer review, editorial standards) exist to catch biases that individuals miss.

Sources

  • Kruger, J. & Dunning, D. (1999). "Unskilled and Unaware of It." Journal of Personality and Social Psychology, 77(6), 1121–1134.
  • McRaney, D. (2011). You Are Not So Smart. Gotham Books.
  • Nickerson, R.S. (1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology, 2(2), 175-220.
  • Wood, T. & Porter, E. (2019). "The Elusive Backfire Effect." Political Behavior, 41, 135-163.
