Cornell psychologist Thomas Gilovich examines the cognitive, social, and motivational processes that lead us to believe things that simply aren’t true — revealing that our false beliefs aren’t products of irrationality, but of flawed rationality applied to incomplete information.
Core Principles
We See Patterns in Randomness
Our brains are pattern-recognition machines that often work too well. We see meaningful clusters in random data, believe in “hot hands” in basketball when the streaks are statistically normal, and find significance in coincidences that are mathematically inevitable. The clustering illusion makes us trust our intuitions about randomness when we shouldn’t.
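The book's point about streaks can be checked directly. Here is a minimal simulation sketch (the `longest_streak` helper and the 100-flip/10,000-trial parameters are illustrative choices, not from the book) showing that long runs of identical outcomes are routine in purely random coin flips:

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(42)  # fixed seed so the sketch is reproducible
trials = 10_000

# Count how many random 100-flip sequences contain a streak of
# five or more identical flips in a row.
hits = sum(
    longest_streak([random.randint(0, 1) for _ in range(100)]) >= 5
    for _ in range(trials)
)
print(f"{hits / trials:.0%} of random 100-flip sequences contain a 5+ streak")
```

Running this shows such streaks appear in the vast majority of sequences. Seeing a five-game "hot hand" therefore tells you very little by itself; the clustering illusion is mistaking this baseline randomness for a meaningful pattern.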
Confirmation Bias Shapes Everything
When examining evidence, we see what we expect to see and conclude what we expect to conclude. Information consistent with our existing beliefs is accepted at face value; evidence that contradicts it is scrutinized and discounted. Worse: for conclusions we want to be true, we ask “Can I believe this?” — but for unwelcome conclusions, we ask “Must I believe this?”
We Overestimate Agreement
The false consensus effect leads us to overestimate how much others share our beliefs. Because we associate with like-minded people and disagreement often stays hidden, we don’t subject our beliefs to healthy scrutiny. This social bubble reinforces false beliefs and makes them feel like common sense.
We’re Better at Generating Than Evaluating
Humans are extraordinarily good at generating ideas, theories, and explanations that sound plausible. We are far less skilled at rigorously testing them. We prefer black-and-white thinking over shades of gray, and we’ll always be tempted to hold oversimplified beliefs that feel satisfying even when reality is more complex.
Try It Now
Identify a belief you hold strongly. Now ask yourself: “What evidence would convince me this is wrong?” If you can’t name any, that’s a warning sign.
Think of a recent “streak” or “pattern” you noticed — in sports, luck, or daily life. Consider: Could this be random variation that I’m interpreting as meaningful?
Notice the next time you encounter information that supports your existing view. Pause and apply the same critical scrutiny you’d use for information that contradicts it.
Ask someone you trust but who thinks differently: “What do you believe about X that I probably don’t?” Listen without defending.
Before sharing a surprising “fact” today, ask yourself: “Did I verify this, or did I believe it because I wanted it to be true?”
Quote
“For desired conclusions, we ask ourselves, ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’”
Thomas Gilovich, How We Know What Isn’t So