AI isn’t undermining university, it’s exposing it
The crisis in universities didn’t start with ChatGPT; it began when the acquisition of knowledge became a transaction. To tackle it, universities should rediscover their foundational purpose: intellectual innovation, says Eliza Filby
Universities are at a tipping point with AI. One recent survey found that 88 per cent of students have used Gen AI for written work – up from 53 per cent last year. There is growing resentment among students who aren’t using it but receive lower marks than those who do. Universities are scrambling to respond, drafting guidelines, while some parents are hiring lawyers to defend students caught up in AI misconduct procedures. Tutors have become AI detectives, trying to identify the students who outsmart detection software.
“First we had to be mental health professionals, now we’re supposed to be LLM experts,” one junior academic told me. “When do we actually get to teach?” Older professors often don’t realise how serious the issue is, while younger staff, often on precarious short-term contracts, are too time-poor and powerless to push back. Many are even quietly using AI in the marking process just to keep up.
This didn’t start with ChatGPT. Over a decade ago, students were already submitting downloaded essays. When I was an academic, I remember being asked to quietly pass an international student who was paying big sums in fees but whose English was so poor it was clear they hadn’t written the paper they submitted.
This current crisis reflects the long-term shift in the student-university relationship since tuition fees were introduced. Once knowledge became transactional, students began treating degrees like commodities. With rising costs came pressure to secure top marks – by any means. Knowledge for its own sake has been replaced by credentialism.
The student as customer
The shift from student to customer has been gradual but real. I remember staff meeting discussions devoted to how we could boost our student survey rankings. What worked better — free beer or pizza? And as customer satisfaction became king, so did grade inflation. In 2018, 30 per cent of students were awarded a first-class degree. Back in 2011, it was just 16 per cent. Professors found themselves confused not just by their students’ attitude, but by their declining authority. Where had deference gone?
Then came Covid. Reduced contact hours, strikes, rising debt and a more challenging job market drove growing dissatisfaction. One Higher Education Policy Institute (HEPI) survey found that 69 per cent of students still had remote lectures in 2024. Now graduates face an AI-driven job market with a falling degree premium. We’ve left a generation underprepared for the future and overcharged for what’s past.
So, what next? First, educators need to stop thinking of AI as just a threat and start recognising it as the reality of knowledge acquisition in the 2020s. Assume everyone is using it.
Secondly, AI is a starting point, not the end. Some lecturers are rightly encouraging students to interrogate it: What did it miss? What biases appear? How could better human input improve its output? This is the critical thinking universities claim to teach.
Finally, we need to recognise that outdated formats – the 2,000-word essay or the dissertation – are no longer fit for purpose. Why not revive oral exams, the standard before paper tests took over in the 19th century? In a world of automation, unleashing human spontaneity is key. I work with companies every day, and they tell me the same thing: they want people who can talk, think, listen and persuade – skills that are increasingly lacking in a digitally driven existence. We must not allow these oral and rhetorical skills to become the preserve of those who attended private schools, where debating clubs, performance and public speaking are part of the cultural fabric. These abilities shouldn’t be a privilege; they should be an educational standard.
PhDs are still orally defended. I vividly remember my viva – justifying every word before two experts. No AI could prepare me for that. It gave me a voice. And what’s more powerful in education than that?
One under-remarked aspect of the AI debate is how much we focus on the intelligence of the models. The real revelation may not be the brilliance of machines, but the banality of much of what humans do. This is our opportunity to question the very foundations of how we acquire, assess, teach and democratise knowledge. These are issues that universities, historically at the epicentre of intellectual innovation, have always tackled. They must do so again.
If universities embraced such an approach, they wouldn’t just improve the quality of education; they’d prepare students for a job market driven increasingly by AI knowledge rather than by university accreditation.
And maybe, just maybe, they’d start to repair the deep sense of disillusionment that now haunts the system.
Eliza Filby is the author of Inheritocracy: The Bank of Mum and Dad