What my parents got right
The AI race won't be decided by chips or models. It might be decided by philosophy classes.
My parents never told me what to study. They encouraged me to learn. To follow what made me curious, to ask questions, and to study what I had a passion for. The skills, the career, the shape of my working life - these would all follow. They were so confident in this that it barely needed to be stated. I have a degree in Politics, and I've worked in technology for most of my life.
I was lucky. Not everyone got that advice. Or had the privilege to be able to take it. For a generation of students who came of age after 2008, there was immense pressure in the opposite direction. It felt, at the time, like responsible advice.
The bait
The financial crisis didn't just destroy jobs. It destroyed confidence in the future. And when confidence collapsed, pragmatism filled the void. A consistent message emerged from the wreckage.
Be practical.
Choose a degree that leads somewhere. Don’t study philosophy, history, or literature. Those are self-indulgent luxuries. Choose STEM. Get some credentials that will pay off. It’s not a question of what you want to learn, it’s about what the market will reward.
It wasn’t a cruel argument. More like the responsible one. Tuition costs were rising - often catastrophically fast - and the job market for new graduates was brutal. What kind of parent watches their child take on serious amounts of debt while encouraging them to study the humanities? What teacher points a student toward a philosophy degree when it’s engineering graduates who get hired? The logic was real.
Perhaps with the best of intentions, a generation has been steered away from exactly the kind of education that teaches you how to think. They've been pushed toward the kind that teaches you what to do. Being curious became less important than being certified. Exploratory and humanistic study was deprioritized. And this tendency has stuck. Humanities enrollments have kept dropping, even as the economy recovered.
The cohort most affected is now in their late twenties and early thirties. They carry student debt that they were told was an investment - one that was supposed to pay off right now, in what should be the most productive years of their careers. They hold the credentials they were told would protect them.
And they are about to discover the full weight of what’s been building quietly for the last few years.
The switch
AI arrived, and reversed the skills map.
There’s an observation from a widely shared essay that Matt Shumer, CEO of Otherside AI, wrote for the non-technical people in his life.
“The people most likely to thrive are the ones who are deeply curious, adaptable, and effective at using AI to do the things they actually care about.”
That’s not a description of someone with a vocational degree. It’s a description of someone who got a liberal arts education.
AI is an amplifier for people who know how to think. People who can construct a coherent argument, or recognize when one is weak. People who read critically, are comfortable with ambiguity. People who will change their mind when evidence shifts, and ask sharper questions on the next go around.
Writing a good prompt for an LLM isn’t simply a technical skill - it’s also rhetoric. AI rewards the ability to clearly articulate what you want, to think through problems from multiple angles, and to refine an idea through iteration. Those are skills you might build while arguing in a philosophy tutorial, writing a history thesis, or through a close reading of a difficult novel.
And the “practical” skills we pushed a generation toward are exactly what's being automated fastest. They're procedural, technical, and executable. Previous waves of automation displaced specific categories of work. People could retrain and fill employment gaps elsewhere. AI is different. It isn't automating one category while leaving others untouched. It's more general, and it improves at everything simultaneously. Whatever you retrain for, it's getting better at that, too.
It’s a bitter irony for that post-2008 cohort. They took on serious debt in order to be practical. To be hireable. They pursued an education based on specific skills and credentials the market valued. It’s left them exposed to the very disruption they were trying to guard against.
They can’t go back and choose differently. The debt is real. Their degree is real. The skills that might have served them best were the ones they were firmly talked out of.
An unexpected generation
Older workers are often seen as struggling to keep up. They're not tech-savvy. They're too fixed in their ways. That can harden into something uglier - age discrimination hidden by terms like “culture fit”. Younger workers, the thinking goes, are fluent in the tools of the current moment, so they're better positioned for a technology-defined future.
This time, we might have that backwards.
Workers in their fifties and sixties - quietly screened out of interview processes - were educated before the intense vocational focus. They studied more broadly. They have decades of domain expertise, and the judgment that comes from the lessons of a long career. They’ve developed comfort with ambiguity. Perhaps through disposition. As likely through long experience of managing genuinely uncertain situations.
Those aren’t soft skills.
In an AI-augmented environment, they are the skills that we need to use a powerful tool well.
This isn’t nostalgia. It isn’t an argument against younger workers, many of whom - of course - have these capacities regardless of their formal education. But the generation most written off by the technology industry may have, in this moment, the right skills to make AI more powerful. Curiosity. Critical distance. The ability to direct and interrogate, not simply execute.
Optimizing a generation for execution over understanding wasn’t only an error in career advice. It’s a cultural one, too. Nations are competing to lead an AI-defined century, and the same logic applies with even higher stakes.
The race we’re actually running
Global AI competition is usually told as a hardware story. It's about chips. Compute clusters. Which country has the most advanced infrastructure? Which research lab will make the next big leap forward? Should Nvidia sell intellectual property to Dubai? Does Taiwan's chip industry make it more critical for the US to defend?
The story is one of a race between technology industries, and national investment budgets.
But what if the real bottleneck isn’t the tool?
What if it’s having enough people who can use it well?
AI is as powerful as the quality of thought that directs it. A system that can synthesize, analyze, construct an argument, and generate new approaches is most valuable in the hands of people who can do that too. At a national scale, the country that navigates the AI transition most successfully might not be the one with the most advanced models. It might be the one with the most broadly educated, critically thinking population.
That’s a political issue.
Some governments are investing in the kind of education that develops these capabilities. Others are doing the opposite. Cutting university funding and attacking institutions associated with critical inquiry. Framing the teaching of analytical complexity as something that’s ideologically suspect. In the United States, there is policy-level hostility to exactly the kind of education we need to defend. Whatever the political logic, the strategic consequences for AI readiness aren’t difficult to trace.
China is a different but related challenge. Its graduates have technical depth, and elite institutions develop genuine analytical rigor. But critical thinking exercised inside an ideologically constrained society is different from critical thinking freely applied. Productive AI collaboration requires the ability to question assumptions, or pursue genuinely unwelcome conclusions. Those might not flourish in societies where some questions are off-limits. Having the skills and being permitted to deploy them are two different things.
Societies investing in curiosity, breadth, the freedom to think carefully and openly, may be building the most durable AI advantage. Societies dismantling those capabilities - in the name of practicality, cost efficiency, or ideological conformity - may be making a strategic error of historic proportions. Ironic, when they’re doing it in the name of being competitive.
A recoverable capacity
Curiosity isn't a fixed trait. Critical thinking is something you practice. The habits of mind that a broad education develops can be built in other ways. They atrophy without use. They can also be strengthened through exercise.
AI could even be part of that recovery. Not as a replacement for education, but to encourage some of the habits of mind that make education valuable. Engaging with these tools to do more than just automate tasks. Using them as thinking partners to bounce ideas off. That’s a form of practice in itself. Technology that seems to be closing doors may also, quietly, be opening one.
My parents didn't give me a career strategy. They encouraged a disposition. A belief that understanding things deeply is worthwhile, regardless of its market value. A love of learning for its own sake. That it's also turning out to matter enormously in this economic moment is a happy coincidence. It always had value.
The philosophy class, the history seminar, the literature essay finished at 2am with a conclusion you weren't entirely happy with. Those aren't luxuries.
Not everyone was that lucky. A generation was told, through care and in completely understandable circumstances, that a love of learning for its own sake wasn’t worth the money.
It was worth it all along. Recognizing that - for individuals, and for societies trying to understand what kind of AI transition to aim for - may be the most urgent thing.
Further reading:
Burnett, D. Graham. Will the Humanities Survive Artificial Intelligence? The New Yorker, Apr 2025.
Diedrich, G. Essential Intelligence: Why The Age Of AI Still Needs The Humanities. Forbes, Aug 2025.
Speri, A. ‘I wish I could push ChatGPT off a cliff’: professors scramble to save critical thinking in an age of AI. The Guardian, Mar 2026.
Article photo by Elijah Crouch on Unsplash.
