The system builds what the system builds
Conway's Law, and the structures we find so difficult to escape
We all know the quote. Widely misattributed, endlessly recycled.
Those who cannot remember the past are condemned to repeat it.
It’s deployed as a warning. A call to learn. To notice, and to break the pattern.
I don’t think we ever do. Not really.
We repeat it constantly. We create the crisis. And then - and I think this is the part that gets left out - we fix it.
It’s not clean. It has a cost. There’s likely significant suffering along the way. But humans, faced with a crisis of their own making, are - at the truly critical point - remarkably ingenious. Remarkably dedicated. We find a way through. And we emerge having changed something structural about the way we operate.
We don’t learn from history. We live it again, solve it again, in some new form.
What’s curious about today is that the “new form” is moving faster than any previous version. And we have, in extraordinary detail and in a hundred ways, already imagined what that might look like.
Science fiction has spent decades drawing us maps of all the ways things might fail. Systems amplifying the worst of us. Institutions too rigid to adapt to change. Technology outpacing wisdom. We read all those stories. They were thrilling, cautionary, resonant.
Then we built the things anyway.
Not out of ignorance. That’s what makes it interesting. Out of something harder to name.
In 1967, a computer scientist named Melvin Conway made an observation. Something simple, obvious in retrospect. The kind of thing that becomes a general “law”.
Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure.
If your teams don’t talk to each other, your software won’t either. The architecture of what you build reflects the architecture of how you’re arranged.
And, while it’s usually expressed in the context of software development, it applies beyond it. Conway’s Law is a law about what complex systems reflect back about the structures that make them.
Social media. Platforms built by teams optimized for engagement. Inside companies measured on growth. Inside economies that treat growth as unqualified virtue.
The platforms didn’t fracture communities and amplify outrage through malice. They did so because the structures of the organizations that built them optimized for that result. The product reflects the organization. The organization reflects its incentives. The incentives reflect something deep - perhaps unexamined - about how we agree to measure progress.
That’s Conway’s Law at societal scale.
Climate change is the same pattern, older and slower. It’s not a failure of knowledge. The science has been clear for decades. But our economies and governments are built on an assumption that carbon extraction and prosperity are inseparable. That assumption holds even when individuals, even those inside the institutions, know better.
The system builds what the system is.
There’s a foundational assumption that shapes both of these. Growth is always good. More is always the right way to go. Progress is expansion. And it’s so deeply embedded in our institutional structures that it shapes the output of everything those institutions touch - whether we choose it or not.
AI is reflecting these patterns as well. I observe it in my professional world and in the broader cultural conversation.
Organizations are expending a great deal of energy around AI adoption. Companies, governments, schools, media. Mobilizing to “do AI.”
Because “do AI” is, at its core, a growth assumption.
The belief may not be stated so plainly, but it is structurally present. AI adoption leads to more. More efficiency. More output. More competitive advantage. More revenue. The same foundational logic is just being applied to a new capability.
But the mobilization is around an imperative that hasn’t been properly examined. What is “do AI”? What are we actually organizing toward, does AI help us get there, and does it get us there faster?
That gap is a Conway’s Law problem.
Any institution that isn’t clear what it’s optimizing for can’t give clear guidance for any system - human or artificial - to act coherently. And AI doesn’t resolve that ambiguity. It inherits it. It generates output - fast, confident, at potentially massive scale - that reflects and amplifies the institution’s unexamined assumptions back into the world.
It’s the fracturing logic of social media. But faster.
It’s the growth-at-all-costs assumption. Embedded in the models we use to accelerate.
Following the science-fiction maps we drew.
When people talk about the possibility of an AI crash - a real conversation with varying predictions - I think this underlies the anxiety. It’s not about a technical failure of the models. It’s that we don’t even know what we’re building toward. And technology this fast and this capable will surface that uncertainty in ways we’re not prepared for, and may not be able to contain.
Conway’s Law is a description, not an unavoidable imperative.
There’s a corollary - the Inverse Conway Maneuver. You can deliberately restructure an organization to produce different outputs. If you change how people communicate, change the underlying structural rewards, it will change what the system builds. It works. It’s been done.
Humans, more broadly, have changed our structures before. The ozone layer. Public health transformations like the creation of the UK’s NHS. Moments where crisis became legible enough, a desire for action concentrated enough, that the pressure to reorganize overcame the powerful inertia of existing structures.
We repeated history.
We created the crisis.
And then we fixed it. At cost, with difficulty, and emerged with something structurally different on the other side.
The question that doesn’t have an answer yet is what kind of crisis AI might be. Will it become something legible enough and concentrated enough to mobilize structural change? Or will it follow the social media pattern: diffuse and gradual? Damage accumulating slowly enough that it embeds itself so deep in our structures that change becomes infinitely more difficult.
I have faith in humans. I doubt institutions - their ability to act before a crisis rather than inside it.
Maybe that’s the deal. Maybe it always will be.
We don’t learn. We repeat. We solve. We move on. And the question is whether we find the way through before or after the most difficult parts.
Further reading:
Kobetz, R. What The System Rewards. Defining Experience, Feb 2026.
Stoermer, T. Understanding Santayana’s Warning: The Price of Forgetting the Past. Tad Stoermer’s Resistance History, Nov 2025.
Conway’s Law. Wikipedia.
Article photo by Jessica Kantak Bailey on Unsplash.
