AI Changes NOTHING About What Students Need to Learn

The disruptive technology of our age will change many things. What schools teach kids should not be one of them.


What does AI mean for what schools should teach? The mantra of our age is “AI changes everything.”

I think that’s wrong. Profoundly wrong. In fact, when it comes to what schools should teach, it’s fairer to say, “AI changes nothing.”

Don’t misunderstand me. Obviously, AI is reshaping the economy, the workforce, and the production of 21st century staples like TikTok videos and country songs.

But AI can do its thing without necessarily changing what students should learn.


We’re constantly told by a parade of tech bros, education impresarios, executives, and politicos that “the age of AI” demands less in the way of traditional academics and more focus on “soft skills” like “communication, problem-solving, and collaboration.”

The World Economic Forum’s explainer, “Why AI makes traditional education models obsolete – and what to do about it,” urges educators to ditch “specialized knowledge” and “embrace the ‘how to think’ model.” Harvard’s Howard Gardner predicts that, by 2050, children will need just a few years of “reading, ’riting, ’rithmetic, and a little bit of coding” because “most cognitive aspects of mind . . . will be done so well by large language machines and mechanisms that whether we do them as humans will be optional.” Instead of academics, OpenAI CEO Sam Altman advises students to pursue a “deep familiarity with the tools” and “sort of evolve yourself with technology.” Economist Tyler Cowen says “the curriculum itself is now radically obsolete” and filled with “wasteful instruction.”

TL;DR: Academic content is out; skills and learning “how to think” are in. If this all feels familiar, it should. While the tool may be new, the advice isn’t.

The purveyors of 21st Century Skills have spent decades insisting that it’s foolish for students to spend so much time learning academic content when there are so many more valuable things for students to learn. As Ken Kay, former president of the Partnership for 21st Century Skills, explained in “The Seven Steps to Becoming a 21st Century School or District,” what really matters are “the 4C’s”—“Critical thinking, Communication, Collaboration, [and] Creativity.”

Two decades ago, in a 2006 TIME Magazine cover story on “How to Build a Student for the 21st Century,” Deborah Stipek, the then-dean of Stanford University’s esteemed Graduate School of Education, mocked the idea that students should still learn South American geography, Civil War battles, or the periodic table of elements. Why? As she put it, “You can look it up on Google.”

Harvard’s Tony Wagner has gone even further, declaring, “Knowledge has become a commodity. Free like air, like water . . . available on every internet connected device. There is no longer a competitive advantage in knowing more than the person next to you because they’re going to Google it and figure it out just in time.”

In a new book on reviving the liberal arts, Angela Bauer, the provost at Texas Woman’s University, urges a “seismic shift” in which colleges “transition from a content-based curriculum” and “knowledge-based tests” to “experiential learning theory.” Though she offers all manner of “life skill” alternatives to content-based learning (ranging from ethics to teamwork to cultural competence), it’s never quite clear what—if anything—Bauer thinks today’s graduates should actually know.

That’s a common malady. Indeed, those intent on demoting academic content never offer much more than vacuous, hand-waving incantations as to what students should learn instead. I can’t help but think of the century-old Cardinal Principles of Secondary Education, issued back in 1918 by the National Education Association, which tagged academics as just one of seven priorities for schools—while elevating more “modern” pursuits like “health,” “worthy home-membership,” and the “worthy use of leisure.” That exercise was prompted by early 20th-century industrialization: The commission of experts wrestled with what students needed to know in an era of factory production and world-changing technologies like cars, planes, and radios. They concluded it was less literature and more life skills.

Sam Altman, CEO of OpenAI, shown here shaking hands with Lee Jae-yong, executive chairman of the Korean electronics company Samsung, is likely right that artificial intelligence is reshaping business and industry and will continue to do so. But he's almost certainly wrong about the technology's transformative effects on learning.

While each subsequent technological era has yielded similar exhortations, the calls took on newfound fervor in the digital age. Back in 1989, former Assistant Secretary of Labor Arnold Packer, a Johns Hopkins scholar and co-author of the hugely influential “Workforce 2000” report, took to the Washington Post to argue it was silly for students to still “cut up frogs” in biology when “workers need to know digital technology.”

Instead of anatomy, what did Packer want students to learn? He raved about a program whose students “use computers and videodiscs to learn about photocopiers and fax machines and about telephones that are used in complex conference calls.” Of course, he wrote, the goal wasn’t simply “learning how to operate a fax machine” but to master “the higher skills that will enable [students] to operate tomorrow’s office equipment; in other words, they are learning to learn.” It’s never quite clear why he thought using a fax machine helps students “learn how to learn” but dissection does not.

Look, I have no longitudinal data here, but I strongly suspect that graduates who were literate, numerate, and modestly knowledgeable about science fared better over the past 35 years than those with even a dazzling mastery of photocopiers and faxes.

I don’t mean to pick on Packer. He’s had lots of company over the years. In 2000, the federal 21st Century Workforce Commission published “A Nation of Opportunity: Strategies for Building America’s 21st Century Workforce,” which identified the three “hottest” jobs in IT: Technical Support, Database Administration, and Computer Programming. Umm, whoops. Those “hot” 21st century jobs weren’t ultimately so hot. In computer programming, the number of positions plunged 60 percent between 2001 and 2019—and that’s before AI started to wreck house. And, in a bit of unfortunate timing, the Commission’s report was immediately followed by a wave of offshoring that gutted U.S.-based technical support. Today, tech support pays a bit under $30 an hour, and not many would consider it a growing field. Turns out it’s hard to predict the shape of the future workforce or the skills graduates will need a decade or two hence.

But workforce projections are cool and, since it takes decades to see if the prognosticators are right, no one’s ever held accountable for being wrong. Meanwhile, talk of a “new workforce” is an excuse to dream up fun lists of nifty “new skills,” which is more appealing than struggling to do better at teaching the old, boring ones. “New skills” make for exciting grant applications, buzzy headlines, and inspiring keynotes. Trying to help kids master geometry, geography, or biology? Not so much.

A quarter-century ago, Marc Prensky, the man who coined the term “digital native,” explained, “The single biggest problem facing education today is that our Digital Immigrant instructors . . . are struggling to teach a population that speaks an entirely new language.” You see, wrote Prensky:

[A]fter the digital “singularity” there are now two kinds of content: “Legacy” content (to borrow the computer term for old systems) and “Future” content. “Legacy” content includes reading, writing, arithmetic, logical thinking, understanding the writings and ideas of the past, etc – all of our “traditional” curriculum . . . Some of it (such as logical thinking) will continue to be important, but some (perhaps like Euclidean geometry) will become less so, as did Latin and Greek. “Future” content is to a large extent, not surprisingly, digital and technological.

In practice, making room for “future” content means “legacy” academics get demoted. Expectations decline, rigor gets dismissed as outdated, and the focus drifts from knowledge-rich instruction. This is why the proponents of “new skills” have consistently short-changed students, and why they’re at risk of doing so once again in the age of AI.




First, keep in mind that none of the new skills are especially new. Critical thinking? Collaboration? Communication? If you think these weren’t important for personal and professional success before the digital age, you’re nuts. I mean, some of the most wildly successful books of the past century (like How to Win Friends and Influence People, published in 1936, or The Power of Positive Thinking, published in 1952) covered precisely these skills and how to practice them. The so-called 21st century skills aren’t actually all that new.

Second, all the paeans to photocopiers and Google elide a simple truth: Students can’t think deeply about nothing. Skills are not a replacement for knowledge; they should be complementary. It’s tough to think critically or communicate incisively if you’re just “thinking about thinking” or “communicating about communicating” (or “learning how to learn” about conference calls). These skills are all worth developing, but only if there’s an objective. I mean, there’s nothing about studying “legacy” content—literature, history, math, science—that should get in the way of students learning empathy, collaboration, and problem-solving. Hell, these subjects are rife with opportunities to practice and master those skills.

So, then, how should we prepare students for the “age of AI”?

Here’s a hot take: Give students a robust, content-rich education. Make it rigorous and engaging. Teach reading, writing, math, literature, history, geography, science, world languages, and the arts. Teach Civil War battles, Euclidean geometry, dissection, the periodic table, and much else. Sure, cultivate useful skills. But job one for schools should be teaching a broad base of knowledge that will prepare students to be autonomous, thoughtful adults, no matter what the workforce actually looks like in 2046 (when today’s 4th graders turn 30).

Ultimately, the assertion that AI makes knowledge less valuable is more talking point than truism. As Ohio State’s Michael Clune aptly observed recently in The Atlantic, AI requires students to “analyze its written responses,” identify “inaccuracies,” “integrate new information with existing knowledge,” “envision new solutions,” “make unexpected connections,” and “judge when a novel concept is likely to be fruitful.”

Guess what? All those tasks depend on knowledge. You can’t identify inaccuracies, integrate new information, envision new solutions, make connections, or judge concepts absent baseline understanding. Clune quotes sociologist Gabriel Rossman, who notes, “Careful use of AI helps me at work, but that is because I completed my education decades ago and have been actively studying ever since.”

Leveraging AI’s vaunted capabilities requires deep, fluid knowledge. You want AI to help plan a manned mission to Mars? Great. You better know enough about orbital dynamics, mass-thrust ratios, material strength, atmospherics, and nutrition to ask the right questions. You want AI to help pen a country song? You’re well-served by being versed in lyrics, melody, editing, and cultural touchpoints.

Students have studied literature, history, languages, geography, geometry, and chemistry for centuries through all manner of innovations (including the steam engine, factory, airplane, transistor, and personal computer). Why? Because this is the corpus of knowledge that, when taught responsibly and well, helps students understand their humanity and their world. This is how schools prepare responsible citizens, productive adults, and autonomous human beings. Advances in technology, even one as staggering as AI, don’t change that. This is a timeless lesson—one we’re apparently obligated to learn time and again.

Frederick Hess is an executive editor of Education Next and the author of the blog “Old School with Rick Hess.”
