Design systems are over. Product context is the work.
Design systems aren't obsolete - but their scope no longer matches the work.
Design systems are over - at least as far as we’ve learned to define them.
And the name has always been kinda iffy. It undersells the work involved, and muddies expectations about what the output is. The foundations are still essential, but we’re wrapping them in a scope that doesn’t match their role - especially as AI reshapes how we build products.
The terminology we’ve used to describe our work no longer fits what the work has become.
It’s always been a bad name
It’s been a fairly consistent undercurrent in conversations I’ve had for years: “design systems” is not a great term.
We place “design” front and center, even though the primary consumers are often engineers. It implies visual polish rather than deep production infrastructure. It suggests a static system, not a set of evolving, critical decisions. None of that reflects the complex labor involved - or the value delivered.
Teams have learned to work around this, translating and explaining the term in order to justify the work.
We lived with it while the system’s job was primarily standardizing UI. It becomes a liability when outputs are no longer mediated solely by humans - when reviews are less exhaustive and quality assurance is a partnership with AI.
Design systems solved yesterday’s scaling problem
Design systems emerged to solve very real issues:
Fragmented interfaces
Repeated implementation work
Fractured user experiences with inconsistent behavior
Design and engineering drifting apart
We provided a shared foundation: components, tokens, patterns, documentation. We built upon (or created) a shared brand language that teams could rely on as their organizations scaled.
That work still matters - more than ever. But the environment those systems operate in is changing, and our definitions aren’t keeping up.
AI doesn’t consume components - it consumes context
We’re starting to talk about AI using design systems as inputs: feed the model components, tokens, guidelines, and generate output.
That’s incomplete framing.
AI doesn’t effectively recognize and implement components in isolation. What it consumes - and amplifies - is context:
Which decisions are encouraged, and which are merely permitted
Where the system is strict and where flexibility is encouraged
How to handle accessibility tradeoffs
Which interaction patterns are preferred - and why
Tone and voice changes in different moments
Historical exceptions and the justifications behind them
This isn’t context that’s cleanly surfaced in a component library. It lives all around it - in our documentation, in related guidelines, in Slack discussions, and in decisions we forgot to codify.
People are better at navigating ambiguity than machines, so those gaps were survivable when we were the only bottleneck.
Now that machines are producing at scale, that no longer works.
The risk of accelerated drift
If we don’t have strong product context, AI creates divergence rather than coherence.
Prompts are local decisions
Outputs are reasonable in isolation
Product drift at scale quickly moves from subtle to structural
We can often identify this by instinct: AI-generated UI feels superficially correct, but something is off. AI follows the easily visible rules and misses the invisible constraints.
Design systems aren’t about how things look; they’re about how we propagate our decisions.
Product context is broader than we’ve allowed systems to be
If a design system is the central foundation, product context is the structure that’s built upon and around it.
Product context includes:
Visual and technical foundations (tokens, components, layouts)
Interaction models and behavioral patterns
Content principles, tone and voice, and language constraints
Accessibility decisions and requirements
Governance and review expectations
Regulatory boundaries and corporate risk tolerance
Historical precedent - especially why exceptions exist
This context is still usually fragmented: owned by different teams, sometimes living in references outside the company, unevenly documented, and enforced socially.
AI shrinks the margin for error that made that fragmentation tolerable.
A role shift, not a repudiation
So this is where the work begins to change.
In AI-assisted delivery pipelines, the most valuable contribution isn’t another component (I’d argue that’s been true even before AI acceleration!). It’s making implicit context explicit and operational.
And that reframes the roles and responsibilities of design system teams.
Maintain intent, not artifacts
Define boundaries rather than enforce consistency
Publish usable, machine-readable context alongside human documentation (see the sketch below)
It’s less about expanding control than about expanding clarity.
AI needs better constraints, not more pixels.
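What could “usable, machine-readable context” look like in practice? Here’s a minimal sketch in TypeScript - the component, field names, and guidance are all hypothetical, invented for illustration rather than drawn from any real system - showing how a component entry might carry not just tokens, but the decisions, boundaries, and precedent around them.

```typescript
// Hypothetical sketch: a component "context manifest" that an AI tool could
// consume alongside the component itself. All names and fields are illustrative.

interface ComponentContext {
  component: string;
  tokens: Record<string, string>;          // visual foundations
  usage: {
    preferred: string[];                   // what the system encourages
    permitted: string[];                   // what it merely allows
    forbidden: string[];                   // hard boundaries
  };
  accessibility: string[];                 // non-negotiable requirements
  tone?: string;                           // voice guidance for this context
  precedent?: { exception: string; rationale: string }[]; // why past exceptions exist
}

const destructiveButtonContext: ComponentContext = {
  component: "Button",
  tokens: {
    background: "color.action.destructive",
    text: "color.text.on-destructive",
  },
  usage: {
    preferred: ["Use for irreversible actions only, paired with a confirmation step"],
    permitted: ["Standalone use in admin tooling where confirmation exists upstream"],
    forbidden: ["Never use as the primary action in onboarding flows"],
  },
  accessibility: ["Contrast ratio >= 4.5:1", "Action must be reachable by keyboard"],
  tone: "Direct and calm; avoid alarmist copy",
  precedent: [
    {
      exception: "Billing cancellation uses a neutral button style",
      rationale: "A past review required reduced visual pressure on cancellation",
    },
  ],
};

export default destructiveButtonContext;
```

The schema itself isn’t the point. The point is that the “why” travels with the “what”, in a form both a human reviewer and an AI pipeline can consume.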
The scope failed, not the name
My title is sharp because I want to drive to a pragmatic conclusion.
Design systems aren’t obsolete. They’re foundational. But you create a foundation to support something larger.
We can’t continue to conflate design systems with component libraries. If we do, we’ll underinvest in the context AI needs to strengthen our products - and it may erode them instead.
Our work has grown. Our responsibility has expanded.
The opportunity is bigger than the name we’ve been using.
Product context is the work.
Further reading:
Opperman, L. Design Systems vs. AI: will the robots take over? Medium, Jan 2024.
Teich, D. “‘The Alignment Problem,’ Linking Machine Learning and Human Values.” Forbes, Oct 2020.
