Harrison Stoneham
The Hidden Cost of Certainty

I used to think conviction was the most important trait in investing. Find an idea you believe in, bet big, and hold on.

Then I started paying attention to what actually separates people who build lasting wealth from people who blow up.

It wasn’t conviction. It was doubt.

The Paradox of Certainty

There’s a strange pattern in investing: the people who sound most confident often perform the worst.

Think about who gets invited on TV. Who writes the boldest headlines. Who builds the biggest followings. It’s always the person with the most certain prediction. “The market WILL crash.” “This stock WILL 10x.” “Inflation WILL destroy your savings.”

Certainty sells because uncertainty is uncomfortable. We’re wired to crave answers. Someone who says “I don’t know” doesn’t get retweeted. Someone who says “It depends” doesn’t get booked for interviews.

But here’s the thing: the world doesn’t care about our need for certainty. Markets are complex systems driven by billions of decisions made by people with different goals, different information, and different time horizons. Pretending you can predict what happens next isn’t confidence—it’s delusion dressed up as insight.

The Weather Forecaster Problem

Meteorologists have gotten remarkably good at short-term forecasts. A 3-day forecast today is as accurate as a 1-day forecast was in 1980. But here’s what’s interesting: they’ve also stopped pretending they can predict much further out.

Modern weather apps show you probability ranges. “40% chance of rain.” “High between 72 and 78°F.” They’ve learned that communicating uncertainty is more honest—and more useful—than false precision.

Finance went the opposite direction.

Every January, Wall Street analysts publish their year-end S&P 500 targets. They give you a single number, like 5,200, with no error bars and no probability distribution. Just pure, unearned certainty.

The track record? Abysmal. These predictions have essentially zero correlation with actual outcomes. But they keep publishing them because certainty is what sells. Nobody wants to read “the market could be anywhere from down 15% to up 25%,” even though that’s far closer to the truth.

What Certainty Actually Costs

When you’re certain about something, you stop looking for evidence that you’re wrong.

This is the real cost—not that you might be wrong (everyone is wrong sometimes), but that certainty blinds you to the warning signs that would let you adjust.

Consider Long-Term Capital Management. The fund’s partners included Nobel laureates, and its models had mathematically “proven” that the strategies couldn’t fail. Those models put the odds of the portfolio blowing up at essentially zero, something like a once-in-a-billion-years event.

In 1998, the fund lost $4.6 billion in four months and nearly brought down the global financial system.

The math wasn’t wrong, exactly. The assumptions underneath the math were wrong. But certainty in the models prevented anyone from questioning the assumptions until it was too late.

Or take Kodak. They invented the digital camera in 1975. They knew—with certainty—that film was superior. Higher quality. Better colors. Established infrastructure. They were so certain that digital was a toy that they buried their own invention.

In each case, certainty wasn’t a strength. It was a vulnerability. It created a blind spot exactly where they needed to see most clearly.

The Superforecasters

In 2011, the U.S. intelligence community ran a tournament. They wanted to know: who makes the best predictions?

They tested thousands of people—academics, analysts, regular citizens—on hundreds of geopolitical questions. Will North Korea test a nuclear weapon? Will the Euro survive? Will there be a military conflict in the South China Sea?

The winners weren’t the experts you’d expect. They weren’t former CIA analysts or international relations professors. They were a ragtag group of curious amateurs who shared a particular mindset.

What made them different?

They held their beliefs loosely. They updated constantly. They thought in probabilities, not certainties. They sought out information that challenged their views rather than confirmed them.

The researcher Philip Tetlock called them “superforecasters.” Their distinguishing trait wasn’t intelligence or domain expertise. It was intellectual humility—the willingness to say “I was wrong” and adjust.

The worst forecasters? People with grand theories and strong convictions. They were more confident, more articulate, and more consistently wrong.

The Surgery Paradox

Here’s something counterintuitive from medicine: surgeons who express doubt before operating tend to have better outcomes than surgeons who express certainty.

It seems backwards. Wouldn’t you want the confident surgeon?

But doubt serves a function. The surgeon who says “this could be complicated” is mentally preparing for complications. They’re thinking through contingencies. They’re staying alert for things that might go wrong.

The surgeon who says “this will be routine” is priming themselves to miss the unexpected. They’ve already decided how the operation will go, which makes them slower to react when it doesn’t.

Doubt isn’t weakness. It’s preparation.

Process vs. Outcome Conviction

I’m not saying you should never have conviction. Some situations require commitment. But the question is: what kind of conviction?

There’s a difference between process conviction and outcome conviction.

Process conviction means you’re confident in how you make decisions. You trust your framework. You’ve tested your approach. You know it works over time even if it fails in any given instance.

Outcome conviction means you’re confident about what will happen. You think you know the future. You’ve attached your ego to a specific result.

The first kind of conviction is useful. The second kind is dangerous.

Consider a poker player. A great poker player has complete confidence in their strategy—when to raise, when to fold, how to size bets. But they have zero confidence in any individual hand. They know they’ll lose plenty of hands. That’s fine. Over thousands of hands, the edge plays out.

An amateur does the opposite. They have no consistent strategy, but they get emotionally attached to individual hands. They can’t fold pocket aces even when the board screams danger. Their outcome conviction overrides good process.

If your process is sound, you can be wrong about outcomes and still do fine. If your conviction is attached to specific outcomes, a single wrong call can devastate you.
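
To put rough numbers on the poker analogy (the 55% edge below is illustrative, not from the article), here is a minimal simulation sketch in Python: a strategy that wins only 55% of even-money hands still loses close to half of the individual hands it plays, yet comes out ahead almost without fail once thousands of hands accumulate.

```python
import random

def play_hands(num_hands: int, win_prob: float = 0.55, seed: int = 1) -> tuple[int, int]:
    """Play num_hands even-money hands with a small edge (win_prob > 0.5).
    Each hand risks one unit. Returns (hands_won, net_units)."""
    rng = random.Random(seed)
    hands_won = sum(rng.random() < win_prob for _ in range(num_hands))
    net_units = hands_won - (num_hands - hands_won)
    return hands_won, net_units

if __name__ == "__main__":
    # A 55% edge still loses roughly 45% of individual hands,
    # but over thousands of hands the process pulls ahead.
    for n in (20, 200, 20_000):
        won, net = play_hands(n)
        print(f"{n:>6,} hands: won {won:,} ({won / n:.0%}), net {net:+,} units")
```

The short sessions can go either way depending on the seed; the 20,000-hand run ends up ahead essentially every time, which is the whole point of process conviction.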

The Practical Application

So how do you actually build this into how you invest and make decisions?

1. Separate what you can know from what you can’t.

Some things are knowable: what a company earns, what you’re paying for it, how much cash it has. Other things are unknowable: what the economy does next year, when the next recession hits, whether a specific catalyst will materialize.

The mistake is treating unknowable things like knowable things. Forecasting unknowable things with false precision is how people blow up.

2. Size your bets by your uncertainty.

If you’re 90% confident, you can bet meaningfully more than when you’re 60% confident. The problem is that most people bet the same size regardless of confidence level, or worse, bet biggest on their most uncertain ideas because those have the highest upside. (There’s a rough sizing sketch at the end of this list.)

The rule should be: the less certain you are, the smaller the bet.

3. Keep a decision journal.

Write down what you believe and why before you make big decisions. Be specific about your confidence level and what would change your mind.

Then go back and read it. You’ll be humbled by how often your certainty was misplaced. More importantly, you’ll start noticing patterns: what you’re reliably wrong about, where your blind spots are. (A bare-bones entry format also follows this list.)

4. Seek out the strongest opposing argument.

Before committing to a position, find the smartest person who disagrees with you and try to understand their reasoning. Not to debunk it—to genuinely understand it.

If you can’t articulate the opposing view charitably, you don’t understand your own position well enough.
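
The article doesn’t prescribe a formula for point 2, but one common way to make “size your bets by your uncertainty” concrete is a scaled-down Kelly calculation. This is only a sketch, and it leans on strong assumptions: the payoff is roughly symmetric (modeled here as even-money), your stated confidence is an honest win probability, and you still cut the result in half because self-reported confidence tends to run high.

```python
def kelly_fraction(confidence: float, payoff_ratio: float = 1.0) -> float:
    """Full-Kelly fraction of bankroll for a bet believed to win with
    probability `confidence`, paying `payoff_ratio`-to-1 when it wins.
    Returns 0.0 when the edge is negative (i.e., don't bet at all)."""
    edge = confidence * payoff_ratio - (1.0 - confidence)
    return max(edge / payoff_ratio, 0.0)

def position_size(confidence: float, bankroll: float, kelly_multiple: float = 0.5) -> float:
    """Bet size in currency, scaled to half Kelly to allow for overconfidence."""
    return bankroll * kelly_fraction(confidence) * kelly_multiple

if __name__ == "__main__":
    # Lower confidence -> smaller bet; at a coin flip the sketch bets nothing.
    for conf in (0.90, 0.75, 0.60, 0.50):
        size = position_size(conf, bankroll=10_000)
        print(f"{conf:.0%} confident -> bet {size:,.0f} out of a 10,000 bankroll")
```

The exact formula matters less than the shape of the output: the bet shrinks quickly as confidence falls, which is what “the less certain you are, the smaller the bet” looks like in code.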
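
For the decision journal in point 3, the format matters far less than capturing, up front, your confidence and what would change your mind. Here is a bare-bones sketch of one way to do it; the field names and the JSON-lines file are illustrative choices, not anything the article specifies.

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import date

@dataclass
class DecisionEntry:
    """One journal entry: the belief, the confidence, and the tripwire."""
    decision: str            # what you're doing
    thesis: str              # why you believe it
    confidence: float        # your honest estimate, 0.0 to 1.0
    would_change_mind: str   # evidence that would falsify the thesis
    logged_on: str = field(default_factory=lambda: date.today().isoformat())
    outcome: str = ""        # filled in later, at review time

def append_entry(entry: DecisionEntry, path: str = "decision_journal.jsonl") -> None:
    """Append the entry as one JSON line so later reviews are easy to script."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(entry)) + "\n")

if __name__ == "__main__":
    append_entry(DecisionEntry(
        decision="Buy position X at 3% of the portfolio",
        thesis="Earnings are stable and the current price implies no growth",
        confidence=0.65,
        would_change_mind="Two consecutive quarters of margin compression",
    ))
```

Reading the file back a year later is where the humbling part happens.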

The Paradox Resolved

Here’s the strange thing: accepting uncertainty actually makes you more effective, not less.

When you stop pretending you know what will happen, you start preparing for what might happen. You build in room for error. You diversify. You keep reserves. You stay humble about predictions while staying confident in your process.

The old weather forecasters lost credibility by being overconfident. The new ones gained trust by communicating uncertainty honestly.

Investors can learn the same lesson. The goal isn’t to predict the future with certainty. It’s to make decisions that work out across many possible futures.

That requires holding your beliefs lightly. Updating when evidence changes. Betting small on uncertain things and big on more certain things.

The world rewards survival. And survival rewards doubt.


The best investors aren’t the ones with the strongest opinions. They’re the ones who hold their opinions most lightly.