After I published The end of not knowing, my dad wrote to me:
I don’t know how it deals with those - perhaps like your father - who will happily reinvent the truth, whether seriously or for a laugh. Not sure there aren’t many others with far greater power who will not only invent a truth but find others to not only endorse it, but believe it.
It was a good thought. My earlier essay argued that we’ve lost the habit of uncertainty. We don’t linger in doubt long enough to imagine, to test our own thinking. But Dad’s note pointed out that something else rushes in to fill the vacuum when uncertainty disappears: invention.
And not always the fun kind.
Perhaps we’ve moved from a world of shared not-knowing to one not of a single certainty, but of competing certainties. Search engines erased factual doubt. AI can smooth over logical uncertainty. Instead of bringing us closer to the truth, we’ve multiplied realities. The problem isn’t ignorance - it’s conviction.
Reinventing the truth
My dad is a storyteller. He’ll tweak a memory, stretch a punchline, shift a detail. Sometimes for effect, sometimes for mischief. Sometimes because he’s forgotten the original. Everyone in the family knows, and everyone (sometimes) plays along. More often we groan because we’ve heard it before. That kind of truth-bending is social, generous, very human.
But scale that instinct for tall tales and it becomes something else. When millions of people bend the truth at once, it stops being play and becomes persuasion. Algorithms reward conviction over accuracy. As the apocryphal Mark Twain quote goes:
A lie can travel halfway around the world while the truth is still putting on its shoes.
We don’t share uncertainty anymore. But we do share the act (if not the facts) of belief.
In The end of not knowing, I said uncertainty matters because it can teach humility. But humility is hard to sustain. Doubt requires effort. Certainty - true or false - feels better.
When the world feels unstable, confident simplicity is a comfort. We cling to whatever feels coherent with our worldview. And when those beliefs, perhaps especially the false ones, fuse with identity, correction feels like an attack.
We don’t debate to learn anymore. We defend our stance to preserve our identity.
Truth becomes a matter of allegiance, not evidence.
Machines that mirror us
AI didn’t invent this pattern, but it does mirror it. Systems designed to “find truth” are built to deliver a satisfying answer, not a shrug and an “I don’t know.” They try to collapse complexity into something that’s plausibly coherent.
This is an example of machines mimicking our own bad habits. They sound sure, even when they shouldn’t be. They make stuff up, rather than admit they don’t know. They give us back our confidence, polished up and packaged.
AI isn’t lying because it’s malicious. When it lies, it does so with conviction - because that’s what we asked it to do. AI is learning that, for humans, certainty sells.
The rediscovery of honest doubt
Maybe the next step after trying to rediscover uncertainty is to rediscover honesty.
That means admitting that not all of our “knowledge” is equal. Truth isn’t a possession, it’s a practice - and one that has proven fragile, dependent on curiosity and care.
I don’t think better tools will save us, however tempting that hope might be. We need something simpler, and much, much rarer.
Moral doubt.
That’s a willingness to pause, and ask, “What if I’m wrong?”
If enough lies get told, and enough of them get believed, everyone ends up knowing differently.
The danger isn’t so much that the truth can be destroyed. That might trigger a search in its absence. The danger is that, for many, it can be comfortably replaced.
Our challenge isn’t to rebuild certainty. It’s to rebuild a space where truth and doubt can coexist. Where it’s OK for us to look at each other and say, “I don’t know. Let’s find out together.”
