Xander Veridaze

The illusion of learning: how ChatGPT exposed the hollow core of higher education

Students aren’t cheating — they’re cooperating with a system that stopped valuing learning long ago. AI didn’t disrupt education. It revealed the emptiness at its core.


There’s a strange irony in watching professors panic about ChatGPT. The real scandal isn’t that students are cheating. It’s that no one seems to notice they’ve stopped learning.

Cheating presumes there’s a correct process being subverted—a shared understanding of what learning should look like. But what if the process itself was always hollow? What if the problem isn’t that students are bypassing the system, but that the system is designed in a way that makes bypassing it indistinguishable from success?

ChatGPT didn’t destroy higher education. It revealed just how much of it had already become a ritual of paperwork. Assignments are completed, boxes are ticked, metrics are met—but at the end of it, has anything meaningful actually taken place? If a student can generate a passable essay in sixty seconds with a prompt and a chatbot, perhaps the scandal isn’t that they can cheat, but that it doesn’t matter.

Education, at its best, is supposed to be transformative. It’s meant to change not just what we know, but how we think—how we reason, interpret, question, and imagine. But this is slow, difficult work. It can’t be rushed. It can’t be faked. And it certainly can’t be fully captured by outputs that can be gamed with the right tools. What AI exposes is that much of what we called ‘learning’ had already been reduced to signal processing—producing the right shape of answer in the right format at the right time.


For years, universities have sold a dream of intellectual growth while often delivering bureaucratised credentialism. The lecture becomes a slideshow. The seminar becomes a box to tick. The essay becomes a performance. And now, the performance can be outsourced. AI is not cheating the system. It is cooperating with it—too efficiently. It is doing what the system rewards, only faster and with fewer illusions about depth.

The humanities are hit hardest. Philosophy, literature, history—once the bedrock of a reflective education—are increasingly treated like antiquated hobbies. Basket-weaving with footnotes. In a world obsessed with utility, their value is hard to quantify. So students play the game. They outsource the boring parts, skip the unreadable texts, and let the machine stitch together plausible reflections on ideas they’ve never actually engaged with. It works. That’s the problem.

Some argue this is a failure of ethics. But what if it’s a failure of structure? Of design? If the goal is a degree, not understanding, then ChatGPT is the perfect tool. It optimises for results. So do the students. And the institutions, frankly. The metrics don’t track insight. They track throughput. Completion. Satisfaction. Employability. We trained the system to care more about the certificate than the substance. The result is predictable.

And yet, the deeper crisis is not institutional but philosophical. We used to think of education as a moral good. A way of cultivating judgment, humility, and discernment. But somewhere along the way, it became a transaction. The classroom turned into a marketplace. The student became a consumer. And now the consumer has found a cheaper supplier.

Tools like ChatGPT are not inherently corrosive. They are mirrors. They reflect the priorities of the culture that adopts them. If students use AI to write essays, it’s worth asking: what did we teach them to value? What kind of learning were we modelling? Were we cultivating inquiry, or compliance? Curiosity, or conformity?

It’s not hard to see where this goes. We will soon have generations of graduates who can navigate systems, mimic competence, and automate credibility—but who struggle with unstructured thought. Who can speak fluently in citations and jargon, but are uneasy with ambiguity. Who expect knowledge to be frictionless, and forget that thinking is supposed to be hard.

The panic over AI is not a technological crisis. It is an epistemological one. A slow and quiet unravelling of meaning. The kind of crisis that doesn’t scream until it’s too late. And when it does, we’ll realise the machines didn’t kill learning.

We did. ChatGPT just showed us the corpse.

Further reading

Excellent Sheep by William Deresiewicz
A damning look at how elite education produces conformists rather than thinkers — and why intellectual courage is quietly disappearing from campus.

The Coddling of the American Mind by Greg Lukianoff & Jonathan Haidt
An essential read on how emotional safety culture and institutional fragility have undermined resilience, curiosity, and honest discourse.

The Shallows: What the Internet Is Doing to Our Brains by Nicholas Carr
A sobering account of how digital tools rewire our cognition — trading depth and reflection for speed and distraction.

Weapons of Math Destruction by Cathy O’Neil
How algorithmic systems reward superficial metrics, amplify injustice, and erode the idea of merit — especially in education.

Shop Class as Soulcraft by Matthew B. Crawford
A philosophical defence of manual competence and the slow, embodied forms of knowing that academia now treats as obsolete.


