How possible is it that everything we think we know is wrong?

Why “what if we’re wrong?” is a feature of thinking, not a bug

Big-Picture Framing
How possible is it that everything we think we know is wrong? This question sits at the heart of epistemology—the study of how we know what we know—and it quietly shapes how we learn, lead, and make decisions. Instead of treating it as a purely abstract fear, you can use it as a practical lens: our beliefs are like maps, not the territory, and every map leaves things out. In this post, we’ll explore how knowledge can be wrong yet still useful, when it’s likely to be overturned, and how to live productively with uncertainty. Along the way, you’ll get a mental toolkit for questioning assumptions without falling into paralysis or cynicism.


Why this question matters more than it seems

On the surface, “What if everything we know is wrong?” sounds like late-night dorm room philosophy.

But underneath, it’s a question with real leverage:

  • It shapes how humble we are about our beliefs.
  • It changes how aggressively we look for better ideas.
  • It influences how dogmatic or flexible we become in work, relationships, and politics.

Think of your worldview as the operating system of your mind. If you assume your OS is flawless, you stop updating it. If you assume it’s totally broken, you never use it. The sweet spot is realizing: it works well enough to run, but it’s full of bugs and patches we haven’t discovered yet.


Three ways our knowledge can be “wrong” yet useful

Most of the time, our beliefs aren’t 100% true or 100% false. They’re more like blurry photos: recognizable enough to be useful, fuzzy enough to mislead in the details.

1. Incomplete maps

A city map can be “wrong” because it doesn’t show every tree, alley, or café—but it still helps you navigate.

A lot of science, strategy, and everyday common sense works this way:

  • Newtonian physics is “wrong” at very high speeds or tiny scales, yet it’s perfectly good for bridges and buildings.
  • Your model of a colleague is incomplete, but still useful for collaborating day to day.

2. Distorted lenses

Sometimes the lens itself bends reality.

Cognitive biases, cultural narratives, and personal experiences act like filters that:

  • Highlight some information
  • Hide or distort other parts

Confirmation bias, for instance, makes us notice evidence that supports our view and ignore what doesn’t. The result: what we “know” is part reality, part projection.
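The filter effect can be made concrete with a toy simulation (everything here is illustrative, not data from a study): if you register every confirming observation but only a fraction of the disconfirming ones, your perceived support for a belief inflates even when the real evidence is perfectly balanced.

```python
import random

random.seed(0)

# Toy model of confirmation bias. All numbers are illustrative.
# The world sends mixed evidence: each piece either supports the
# belief (True) or contradicts it (False), 50/50.
evidence = [random.random() < 0.5 for _ in range(10_000)]

# A biased observer notices every confirming piece, but each
# disconfirming piece only 30% of the time.
noticed = [e for e in evidence if e or random.random() < 0.3]

true_support = sum(evidence) / len(evidence)      # close to 0.50
perceived_support = sum(noticed) / len(noticed)   # noticeably higher

print(f"actual support:    {true_support:.2f}")
print(f"perceived support: {perceived_support:.2f}")
```

The gap between the two numbers is the "part projection" in what we think we know: nothing false was ever observed, yet the conclusion drifts away from reality purely through selective attention.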

3. Broken foundations

This is the scary version: the core assumption is wrong.

  • Ulcers were long blamed on stress and spicy food—until H. pylori bacteria reframed the story.
  • “Scientific” claims about race and gender were built on biased assumptions and bad data.

When foundations are wrong, we don’t just need a tweak; we need a rebuild. That’s rarer, but hugely consequential.


Real-world example: when “settled” knowledge shifts

For much of the early 20th century, many physicists believed the universe was static and eternal. Textbooks and expert consensus agreed.

Then Edwin Hubble’s observations of receding galaxies showed the universe is expanding. Within a few years, a central belief flipped:

  • The static-universe model wasn’t just incomplete—it was wrong at the core.
  • Cosmology had to be rewritten; whole research programs shifted.

A business version looks like this:

  • A company “knows” customers choose them for low prices.
  • A deeper analysis shows the real driver is trust and ease, not price.
  • Strategy changes: less discounting, more service and reliability.

In both cases, not everything was wrong. But a few key false assumptions quietly warped dozens of smaller beliefs.


The practical limits of skepticism

So what about radical doubt—ideas like “we might be in a simulation” or “maybe all of reality is an illusion”?

A few things to notice:

  • It usually doesn’t change your next move. Whether this is a simulation or not, you still need to eat, sleep, pay rent, ship products, and show up for people. Gravity “feels” the same either way.
  • Decisions run on relative, not absolute, certainty. You almost never get 100% proof; you make bets based on the best available model. Radical skepticism doesn’t give you a better model—it just tries to blow up all models at once.
  • Sustained hyper-skepticism is paralyzing. If you refuse to act until you’re absolutely sure, you never act. Productive doubt asks, “What’s good enough to move forward, and where would being wrong hurt most?”

So it’s useful to occasionally entertain radical doubt to stay humble—but day to day, the more practical stance is:

“Assume reality is roughly what it seems, stay alert for corrections, and design your life so you can update without collapsing.”


So how possible is it that everything is wrong?

Short answer:

  • It’s very unlikely that everything we think we know is wrong in the sense of “completely false.”
  • It’s almost certain that everything we think we know is incomplete, simplified, or partly distorted.

Gravity still makes apples fall. Chairs still hold you up. Your best friend probably still likes you. Everyday regularities in the world are powerful evidence that our maps, while imperfect, connect to something real.

The real danger is not “everything is wrong,” but:

  • We’re more wrong in complex, abstract domains (future predictions, human motivation, macroeconomics, long-term strategy).
  • We’re often more confident than our evidence justifies.

A more productive version of the question is:

Where am I most likely to be significantly wrong—and what’s the cost if I am?

From there, you can:

  • Hold beliefs with different “temperatures”
    • Cold (firm): “Gravity exists.”
    • Warm: “This business model will scale.”
    • Hot (tentative): “This new technology will definitely change everything.”
  • Design for being wrong in small ways
    • Run experiments instead of grand bets.
    • Pilot new ideas on small scales before rolling them out.
  • Seek disconfirming evidence on purpose
    • Ask: “What would I need to see to change my mind?”
    • Invite people who disagree with you and actually listen.

Instead of fearing that everything we know is wrong, treat “I might be wrong” as a tool: a way to debug your thinking, update your models, and make better calls under uncertainty.


Bringing it together

The possibility that “everything we think we know is wrong” isn’t a glitch in the system—it’s a reminder that we move through life with maps, not the territory. Our maps work just well enough to get things done, and just poorly enough that we need to keep revising them. If you treat your beliefs as version 1.0 software—useful but buggy—you become humbler, more curious, and more effective.

If this kind of question sharpens how you think, consider following QuestionClass’s Question-a-Day at questionclass.com. One good question a day is a surprisingly powerful way to keep updating your mental maps.


Bookmarked for You

Here are a few deeper dives if this question hooked you:

The Scout Mindset by Julia Galef – A practical guide to seeing the world as it is, not as you wish it to be, and updating your beliefs without ego getting in the way.

The Black Swan by Nassim Nicholas Taleb – Explores how rare, unexpected events expose the hidden fragility and overconfidence in our assumptions.

The Structure of Scientific Revolutions by Thomas S. Kuhn – A classic on how entire paradigms of “settled knowledge” periodically get overturned.


QuestionStrings to Practice

QuestionStrings are deliberately ordered sequences of questions in which each answer fuels the next, building a ladder of progressively deeper insight. Use this one to stress-test your own “knowledge” and decide where to update it next:

Assumption Stress-Test String
For when you want to see where you might be most wrong:

“What am I most certain is true about this situation?” →
“What evidence actually supports that, and how strong is it?” →
“What would I need to see to admit I’m wrong?” →
“Where would it be most costly if I were wrong?” →
“What small experiment or conversation could I run this week to test it?”

Try weaving this into your decision-making, 1:1s, or journaling. You’ll be surprised how often “solid knowledge” turns out to be an educated guess—one you can quickly improve.


In the end, this question doesn’t paralyze you; it trains you to be a better map-maker—one who can act decisively today while staying ready to rewrite the legend tomorrow.
