Students Challenge AI-Driven Teaching at University of Staffordshire
Learners claim a computing course was largely delivered via AI-generated slides and narration, sparking integrity and learning quality concerns
Students at the University of Staffordshire have voiced serious concern after discovering that a government-funded coding course was delivered in large part by artificial intelligence, including AI-generated presentation slides narrated by a synthetic voice.
The group of forty-one learners, enrolled on a coding apprenticeship module in cybersecurity and software engineering, told university staff they felt “robbed of knowledge and enjoyment” when the course turned out to rely on generic, error-strewn materials rather than engaged human teaching.
One student confronted a lecturer during a recorded session, stating: “If we handed in stuff that was AI-generated we would be kicked out… but we’re being taught by an AI.” The Guardian reviewed the course materials and used AI-detection tools which indicated a “very high likelihood” that substantial parts of the teaching content were AI-generated.
According to the students, the signs were immediate: inconsistent switching between British and American English, suspicious file names, voice-over accents shifting mid-presentation, and trivial or irrelevant content, including references to U.S. legislation.
The university has defended the course by saying that “academic standards and learning outcomes were maintained” and that AI tools may support teaching, but cannot replace scholarly expertise.
Its published policy prohibits students from using AI to outsource their work, even as a newer framework permits academic staff to use digital automation in their delivery.
One student described the situation as feeling “like a bit of my life was stolen,” given that he had committed to a career change and believed he would receive meaningful human instruction rather than automated content.
The incident highlights a growing tension within higher education over the role of AI in teaching.
While artificial intelligence tools are increasingly used to generate content and assist educators, independent research suggests AI-prepared materials often fall short of promoting deep learning or critical thinking.
For students who expect human-led instruction, the replacement of lecturers with automated content raises questions of fairness, transparency and value.
At the University of Staffordshire, although the final session was delivered by a lecturer rather than through AI-generated materials, the earlier materials remain in use and several students continue to seek remedial support.
The case may prompt wider examination of how universities balance efficiency with educational integrity in the age of artificial intelligence.