Book Freak #198: How We Know What Isn’t So

The Fallibility of Human Reason in Everyday Life

Get How We Know What Isn’t So

Cornell psychologist Thomas Gilovich examines the cognitive, social, and motivational processes that lead us to believe things that simply aren’t true — revealing that our false beliefs aren’t products of irrationality, but of flawed rationality applied to incomplete information.

Core Principles

We See Patterns in Randomness

Our brains are pattern-recognition machines that often work too well. We see meaningful clusters in random data, believe in “hot hands” in basketball when the streaks are statistically normal, and find significance in coincidences that are mathematically inevitable. The clustering illusion makes us trust our intuitions about randomness when we shouldn’t.
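The claim that streaks are "statistically normal" is easy to check for yourself. Here is a minimal simulation sketch (not from the book): in 100 fair coin flips, a run of five or more identical outcomes turns up almost every time, even though most of us would read such a streak as meaningful.

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(0)
trials = 10_000
# Count how many random 100-flip sequences contain a streak of 5 or more.
hits = sum(
    longest_streak([random.random() < 0.5 for _ in range(100)]) >= 5
    for _ in range(trials)
)
print(f"{hits / trials:.0%} of random 100-flip runs contain a 5+ streak")
```

Run it and the answer comes out well above 90 percent: long streaks are what randomness looks like, which is exactly why our intuitions about it mislead us.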

Confirmation Bias Shapes Everything

When examining evidence, we see what we expect to see and conclude what we expect to conclude. Information consistent with our existing beliefs is accepted at face value; evidence that contradicts it is scrutinized and discounted. Worse: for conclusions we want to be true, we ask “Can I believe this?” — but for unwelcome conclusions, we ask “Must I believe this?”

We Overestimate Agreement

The false consensus effect leads us to overestimate how much others share our beliefs. Because we associate with like-minded people and disagreement often stays hidden, we don’t subject our beliefs to healthy scrutiny. This social bubble reinforces false beliefs and makes them feel like common sense.

We’re Better at Generating Than Evaluating

Humans are extraordinarily good at generating ideas, theories, and explanations that sound plausible. We are far less skilled at rigorously testing them. We prefer black-and-white thinking over shades of gray, and we’ll always be tempted to hold oversimplified beliefs that feel satisfying even when reality is more complex.

Try It Now

  1. Identify a belief you hold strongly. Now ask yourself: “What evidence would convince me this is wrong?” If you can’t name any, that’s a warning sign.
  2. Think of a recent “streak” or “pattern” you noticed — in sports, luck, or daily life. Consider: Could this be random variation that I’m interpreting as meaningful?
  3. Notice the next time you encounter information that supports your existing view. Pause and apply the same critical scrutiny you’d use for information that contradicts it.
  4. Ask someone you trust but who thinks differently: “What do you believe about X that I probably don’t?” Listen without defending.
  5. Before sharing a surprising “fact” today, ask yourself: “Did I verify this, or did I believe it because I wanted it to be true?”

Quote

“For desired conclusions, we ask ourselves, ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’”


Book Freak is published by Cool Tools Lab, a small company of three people. We also run Recomendo, the Cool Tools website, a YouTube channel and podcast, and other newsletters, including Recomendo Deals, Gar’s Tips & Tools, Nomadico, What’s in my NOW?, Tools for Possibilities, Books That Belong On Paper, and Book Freak.

02/27/26

© 2022