Imagine someone described as quiet, bookish, and passionate about social justice. Is this person more likely to be a bank teller, or a bank teller who is active in the feminist movement? If you picked the second option, you're in good company. Most people do. But you'd be wrong — and the reason why reveals something important about how our minds handle probability.

This mistake is called the conjunction fallacy, and it's one of the most well-documented errors in human reasoning. It shows that our brains don't naturally think in terms of mathematical likelihood. Instead, we think in terms of stories. And the better a story fits, the more "true" it feels — even when the math says otherwise.

The Representativeness Heuristic: When Stereotypes Hijack Probability

The example above comes from a famous 1983 study by psychologists Daniel Kahneman and Amos Tversky. They introduced "Linda" — described as a bright, outspoken philosophy major concerned with discrimination and social justice — and asked participants which was more probable: that Linda is a bank teller, or that Linda is a bank teller and active in the feminist movement. Over 80% of respondents chose the second option.

The reason is something called the representativeness heuristic. When we assess probability, we often don't calculate — we compare. We ask, "How well does this description match my mental image of that category?" Linda's description matches "feminist bank teller" far better than plain "bank teller." So our brains label it as more likely. The stereotype becomes a substitute for actual probability.

But here's the mathematical reality that never changes: a specific subset of something can never be more probable than the larger category it belongs to. Every feminist bank teller is already a bank teller. So the group "bank tellers" will always be at least as large as "bank tellers who are feminists." The representativeness heuristic tricks us into ignoring this basic logical constraint, because matching a profile feels like evidence.
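The inequality is easy to check with a quick simulation. The sketch below uses made-up rates for a hypothetical population (the 5% and 10% figures are arbitrary assumptions, not data from the study); whatever numbers you pick, the conjunction can never come out ahead, because every feminist bank teller is counted among the bank tellers:

```python
import random

random.seed(0)

# Invented rates, purely for illustration: suppose 5% of people are
# bank tellers, and 10% of bank tellers are active feminists.
P_TELLER = 0.05
P_FEMINIST_GIVEN_TELLER = 0.10

trials = 100_000
tellers = 0
feminist_tellers = 0
for _ in range(trials):
    if random.random() < P_TELLER:
        tellers += 1
        if random.random() < P_FEMINIST_GIVEN_TELLER:
            feminist_tellers += 1

# The conjunction is a subset of the category, so this always holds.
assert feminist_tellers <= tellers
print(f"bank tellers:          {tellers / trials:.3f}")
print(f"feminist bank tellers: {feminist_tellers / trials:.3f}")
```

Rerun it with any rates you like; the assertion never fails, because the second count is tallied inside the first.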

Takeaway

When a description seems to fit a category perfectly, treat that feeling as a warning, not confirmation. The better something matches a stereotype, the more likely you are to overestimate its probability.

Narrative Coherence: Why Good Stories Feel More True

The conjunction fallacy doesn't just exploit stereotypes — it exploits our deep love of coherent stories. When you add a specific detail to a scenario, you're not just adding information. You're adding narrative. And our minds are wired to find coherent narratives more believable than incomplete ones, even when the added detail mathematically reduces the probability.

Think of it this way. "A massive flood somewhere in North America" is a vague claim. "An earthquake in California causing a massive flood" is more specific, and less probable, since it requires two events instead of one. Yet when Tversky and Kahneman asked people to rate such forecasts, the second scenario was judged more likely. The earthquake gives the flood a cause, a chain of cause and effect. It becomes a story. And stories feel real in ways that bare statistical claims do not.

This is a fundamental tension in how we process information. Plausibility and probability are not the same thing. A scenario can make perfect narrative sense — each detail logically following the last — while being far less likely than a simpler alternative. Every added detail, no matter how fitting, is an additional condition that must be true. Each one multiplies the ways the prediction can fail. Our storytelling instinct obscures this, because in stories, more detail means more richness. In probability, more detail means more risk of being wrong.
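The arithmetic behind "each detail multiplies the ways the prediction can fail" is just multiplication. A minimal sketch, with invented probabilities for each added detail (the events and numbers below are assumptions for illustration, and independence is assumed for simplicity), shows the scenario's likelihood shrinking with every condition:

```python
# Hypothetical probabilities for each condition added to a forecast.
# Under independence, the conjunction's probability is the product of
# its parts, so every extra detail can only shrink the total.
details = [
    ("earthquake hits", 0.10),
    ("flood follows", 0.30),
    ("dam fails", 0.20),
]

running = 1.0
for name, p in details:
    running *= p
    print(f"+ {name:16s} -> scenario probability {running:.4f}")
```

Three modest-sounding details, each individually plausible, leave the full scenario with well under a one percent chance.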

Takeaway

A story that makes sense is not the same as a prediction that's likely. Every detail you add to a scenario is another way it can turn out to be false — no matter how well those details hang together.

Probability Discipline: Training Your Intuition to Respect the Math

Knowing about the conjunction fallacy doesn't automatically fix it. Even Kahneman himself admitted that his intuitions still pull toward the wrong answer. The key isn't eliminating the instinct — it's building a habit of checking it. When a specific, detailed claim feels obviously true, that's the moment to slow down and ask: "Am I evaluating probability, or am I just rating how good this story sounds?"

One practical technique is what you might call the subset test. Whenever you're comparing two possibilities and one is a more specific version of the other, flag it. "Will this company go bankrupt?" versus "Will this company go bankrupt because of a scandal?" The second is at most as likely as the first, never more. If your gut says otherwise, your gut is responding to narrative coherence, not mathematical reality.
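One hypothetical way to make the subset test mechanical is to write each scenario as the set of conditions that must all hold. If one set strictly contains the other, the larger set describes the more specific, and therefore never more probable, scenario (a sketch, with invented condition labels):

```python
# Encode scenarios as sets of required conditions. A strict superset
# means "same scenario plus extra conditions," which can only be
# equally likely or less likely.
def subset_test(a: set, b: set) -> str:
    if a < b:
        return "B is a specific version of A: P(B) <= P(A)"
    if b < a:
        return "A is a specific version of B: P(A) <= P(B)"
    return "neither is a subset; the test does not apply"

bankrupt = {"company goes bankrupt"}
bankrupt_scandal = {"company goes bankrupt", "scandal triggers it"}
print(subset_test(bankrupt, bankrupt_scandal))
```

The point is not the code but the habit: list what each claim requires, and check whether one list is the other plus extras.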

This kind of discipline matters well beyond psychology experiments. It applies to medical diagnoses, legal reasoning, political predictions, and everyday judgments about people. Any time you hear a detailed, compelling scenario presented as the most probable outcome, you have a reason to be skeptical. The world is messier and more random than our stories suggest. Simpler claims deserve more respect than elaborate ones — not because simplicity is always right, but because it carries fewer built-in assumptions that can each go wrong.

Takeaway

When a detailed prediction feels more convincing than a simple one, pause and apply the subset test. If one scenario is a specific version of the other, it cannot be more probable. Use that rule as a guardrail against your own storytelling instincts.

The conjunction fallacy teaches us something humbling: our sense of what's probable is deeply entangled with our sense of what makes a good story. These are different skills, and they often point in opposite directions. Recognizing this gap is the first step toward more careful thinking.

Next time a detailed scenario strikes you as obviously likely, let that confidence be your cue to pause. Ask whether you're calculating or narrating. The math doesn't care how compelling the story is — and that's exactly why it's worth consulting.