First, keep in mind that none of these “new” skills is especially new. Critical thinking? Collaboration? Communication? If you think these weren’t important for personal and professional success before the digital age, you’re nuts. I mean, some of the most wildly successful books of the past century (like How to Win Friends and Influence People, published in 1936, or The Power of Positive Thinking, published in 1952) covered precisely these skills and how to practice them. The so-called 21st century skills aren’t actually all that new.
Second, all the paeans to photocopiers and Google elide a simple truth: Students can’t think deeply about nothing. Skills are not a replacement for knowledge; they should be complementary. It’s tough to think critically or communicate incisively if you’re just “thinking about thinking” or “communicating about communicating” (or “learning how to learn” about conference calls). These skills are all worth developing, but only if there’s something substantive to apply them to. I mean, there’s nothing about studying “legacy” content—literature, history, math, science—that should get in the way of students learning empathy, collaboration, and problem-solving. Hell, these subjects are rife with opportunities to practice and master those skills.
So, then, how should we prepare students for the “age of AI”?
Here’s a hot take: Give students a robust, content-rich education. Make it rigorous and engaging. Teach reading, writing, math, literature, history, geography, science, world languages, and the arts. Teach Civil War battles, Euclidean geometry, dissection, the periodic table, and much else. Sure, cultivate useful skills. But job one for schools should be teaching a broad base of knowledge that will prepare students to be autonomous, thoughtful adults, no matter what the workforce actually looks like in 2046 (when today’s 4th graders turn 30).
Ultimately, the assertion that AI makes knowledge less valuable is more talking point than truism. As Ohio State’s Michael Clune aptly observed recently in The Atlantic, AI requires students to “analyze its written responses,” identify “inaccuracies,” “integrate new information with existing knowledge,” “envision new solutions,” “make unexpected connections,” and “judge when a novel concept is likely to be fruitful.”
Guess what? All those tasks depend on knowledge. You can’t identify inaccuracies, integrate new information, envision new solutions, make connections, or judge concepts absent baseline understanding. Clune quotes sociologist Gabriel Rossman, who notes, “Careful use of AI helps me at work, but that is because I completed my education decades ago and have been actively studying ever since.”
Leveraging AI’s vaunted capabilities requires deep, fluid knowledge. You want AI to help plan a manned mission to Mars? Great. You better know enough about orbital dynamics, thrust-to-weight ratios, material strength, atmospherics, and nutrition to ask the right questions. You want AI to help pen a country song? You’re well-served by being versed in lyrics, melody, editing, and cultural touchpoints.
Students have studied literature, history, languages, geography, geometry, and chemistry for centuries through all manner of innovations (including the steam engine, factory, airplane, transistor, and personal computer). Why? Because this is the corpus of knowledge that, when taught responsibly and well, helps students understand their humanity and their world. This is how schools prepare responsible citizens, productive adults, and autonomous human beings. Advances in technology, even one as staggering as AI, don’t change that. This is a timeless lesson—one we’re apparently obligated to learn time and again.
