Universities give up using software to detect AI in students' work

Author: RNZ
Publish Date: Tue, 30 Sept 2025, 2:33pm
The University of Auckland does not endorse AI-detection tools. Photo / John Weekes

By John Gerritsen of RNZ

Several universities have given up using software to detect the use of artificial intelligence in student work.

Massey University made the decision recently, and the University of Auckland and Victoria University do not use such software either.

For Massey, it followed a decision to stop using automated systems to monitor for cheating in online exams after a major tech failure last year.

It told RNZ detection was unreliable and it let students use AI responsibly in much of their work anyway.

One of the presidents of Massey’s Tertiary Education Union branch, Dr Angela Feekery, told RNZ academics had not used AI detection consistently.

Some used the results as a guideline but others would accuse students of cheating if the tool suggested their work consisted of more than a certain percentage of AI-generated content.

“There’s been a lot of research coming out basically saying that AI detection doesn’t work overly well. There’s a lot of tools that students can use to check if their work is going to be detected by AI and they can fool it anyway,” she said.

“Pretty much a decision’s been made to turn it off because it’s ineffective.”

Feekery said there were other ways to spot AI use, such as checking a document’s version history to see if it was created in two minutes rather than over several days, or simply using professional judgment.

“I’ve been teaching for 25 years. I’ve been marking student writing for years. I know what it looks like, and it’s not what they are submitting now. In many of the cases, when you’ve got students who can write better than I can in first year, there is an issue.”

Feekery said academics were still trying to figure out the best ways to assess students in the age of AI.

“We don’t have the solution yet, but there’s certainly a lot of conversation around it and students are at the centre of those conversations. I can hand on heart say that the student experience is at the centre of the conversations we’re having around this,” she said.

University of Auckland graduate teaching assistant Java Grant was organising a conference on AI for the Tertiary Education Union next month.

He said Massey’s decision made sense from a technical standpoint.

“It’s really hard to differentiate what might be generated by an AI tool, unless there’s some telltale signs, something like ‘I can’t answer this because I’m a large language model’,” he said.

He said many academics were choosing forms of assessment where AI could not be used.

Academics are choosing to use forms of assessment where AI cannot be used. Photo / 123rf

“There is so much sensitivity around falsely accusing students of using the tools and so currently the best solution that we’ve found at the course level, with instructors and tutors working together to think about how we might make sure students are learning the content, we’ve personally gone to in-person, on-paper tests, but it’s increased the workload hugely.”

University of Auckland computer science senior lecturer Dr Ulrich Speidel said relying on student honesty for remote assessments was open to abuse.

“Absolutely nothing stops them from having a second device floating around or a friend or a helper. With exams like this I would probably look at, depending on the class and the background and the demographics of the class, I would look at probably between 30 and 60% of the class availing themselves to illicit help,” he said.

Speidel said the figure was based on his experience and on research.

However, he said automated monitoring of digital exams could be hacked, as could supposedly secure on-campus digital exams.

Speidel said Auckland debated the use of automated AI detection several years ago and decided it wasn’t worth it because it could not definitively prove whether a student had used AI for their work.

Massey University said its online assessments such as online essay submission or quizzes were not scrutinised.

“These are part of a wider assessment approach that ensures that students’ work is appropriately validated at key points in their study,” it said.

“The impact of Generative Artificial Intelligence [GenAI] means that all universities are reviewing their approaches to assessment. Like many others, Massey no longer uses AI detection as significant concerns have been raised about the reliability of the approach.

“Rather than using unreliable detection tools, the university is prioritising preventative measures such as secured assessments for those assessments where GenAI is not allowed. As part of this process, Massey is currently undertaking a process of consultation to determine future approaches to the delivery of examinations.”

Massey said students were permitted to use AI in all assessments, except those that could be secured in ways that prevented, rather than detected, AI use.

These included laboratory and studio-based activities, oral assessments and examinations.

“Turning away from detection does not mean we are simply delegating thinking, reasoning and rigorous academic practice to AI. Rather, it signals that we recognise our environment is shifting, and we must adapt accordingly.

“We are working to develop AI literacies across the university so we can effectively support students to use AI as part of their academic toolkit, ensuring they engage with it in ways that are ethical, learning-centred, and uphold academic integrity.”

Approaches to AI

How the eight universities approach online exam security and detection of AI in student work:

Auckland

  • Uses online invigilation for remote exams.
  • Does not endorse AI-detection tools.

AUT

  • Does not run remote, online examinations.
  • Unclear whether it uses AI detection software for student work.

Waikato

  • Conducts some exams online and some remotely.
  • Uses an AI-writing detection tool.

Massey

  • Offers remote, online open-book assessments and tests without automated monitoring.
  • Does not use software to check for AI use in student work.

Canterbury

  • Uses monitoring tools for online assessments.

Lincoln

  • Uses videoconferencing technology to monitor remote online exams.
  • Uses software to check for AI use in student work.

Victoria

  • Seldom uses digital exams and does not use online proctoring.
  • Does not use AI detection.

Otago

  • Has very few digital exams.
  • Uses plagiarism detection software, but RNZ understands some academics do not use its AI detection function.

- RNZ
