More than 500 students in higher education institutions were found to have used Artificial Intelligence (AI) in an unauthorised manner in their graded coursework last year.
RTÉ's This Week programme asked every university how many students it had caught using artificial intelligence in an unauthorised manner in the academic year 2024-2025.
However, many of the country's largest universities, including UCD, UCC, Maynooth and UL, do not distinguish between unauthorised use of AI and other forms of plagiarism, meaning the true number of students using AI to cheat is likely significantly higher.
The Higher Education Authority does not require institutions to record unauthorised use of AI by students in graded coursework separately from other forms of cheating. However, it points out that this does not alter universities' responsibility to ensure academic integrity.
University of Galway's Academic Integrity Officer Dr Justin Tonra said the university was seeing an increase in cheating because of generative AI.
"The reality is that the introduction of generative AI has changed the landscape of higher education," he said.
"We are seeing an increase in the amount of cheating that is facilitated or enabled by that technology.
"That is a challenge to us and it's one that we can't shy away from," he added.
The University of Galway detected 224 cases of unauthorised AI use in graded coursework last year.
"Putting it in the context of the 20,000 students in the university, that number is not terribly high," Dr Tonra said.
"But we do also recognise that is likely underreported because any of the research in academic integrity shows that academic misconduct is generally underreported," he said.
Dr Tonra said that the figure does represent an increase in cases of generative AI misconduct at the University of Galway compared to the previous year.
According to the National Academic Integrity Network, AI detectors are not recommended for use in detecting generative AI in students' work and may lead to false positives.
The AI Advisory Council agrees that detection methods do not work.
TU Dublin found that 71 students had used AI in an unauthorised way, while the National College of Ireland found that 68 students had misused AI in an assessment.
Trinity College Dublin had asked for an extension, but has not yet responded to a freedom of information request.
Among institutions that did respond, the number of unauthorised AI uses was relatively small compared to the entire student body.
For example, just over 5,000 students completed courses at Dundalk IT last year, and 43 were found to have used AI inappropriately.
The Royal College of Surgeons has a similarly sized student body, and 36 of its students were found to have used AI in an unauthorised way in an assessment.
Students who were caught using AI in an unauthorised way appeared to be dealt with on a case-by-case basis, depending on the college they attended and the type of assessment (essay, exam, presentation, etc).
Across its faculties, TU Dublin found cases of AI misuse last year, but the consequences depended on the faculty and the type of assessment.
At Mary Immaculate College, students whose coursework was confirmed to contain unauthorised AI use received an F grade; they were permitted to resit, but their grade was capped.
In St Patrick's Carlow College, 46 students were found to have used AI in graded material in the academic year 2024-2025.
The majority faced a grade penalty as a result, although other consequences recorded included failing the assignment and resubmitting with marks capped.
Students at the National College of Ireland who were repeatedly found to have used AI risked suspension for the remainder of the academic year.
Among the bigger universities that did not hold centralised records on AI misuse, many said that each school had discretion to deal with plagiarism cases as it saw fit.
In a statement to This Week, the Department of Higher Education said that there are bodies in place that support institutions in "addressing the challenges of academic misconduct including those exacerbated by AI tools".
When students misuse AI
The University of Galway relies on lecturers' subject expertise to detect suspected cases of AI use, followed by a one-on-one conversation with the student in question.
This is labour-intensive work, but Dr Tonra said the alternative is "not really conceivable".
"Academic integrity is so important; it is worth the investment of time and labour," he said.
Students suspected of using AI to cheat are also told how they can avoid academic misconduct in the future.
Dr Tonra said the low number of repeat misconduct cases at the university shows that this method is more effective than a more punitive approach.
He said the University of Galway wants to detect cases of AI misuse to ensure the grades awarded to students are authentic.
"It's crucial that the graduates that we send out in the world are able to do what they're supposed to do otherwise the awards that we give to students will lose their value," he explained.
"When a student cheats, they're not only cheating themselves but they're also disadvantaging their peers in their class by getting an advantage that they don't deserve," he added.
Emma Muldoon Ryan, Vice-President of Academic Affairs at AMLÉ, the national students' union, said that students need clarity on when and where it is appropriate for them to use AI.
"Some methods of assessment are outdated and the concern for cheating does feed into that," she said.
She added that students also know that jobs they take after university may require them to be familiar with AI tools.
"There's also students who see the sustainability issues within AI.
"Most are looking for adoption in a controlled manner," Ms Muldoon Ryan said.
Dr Tonra said that the University of Galway recognises that many lecturers want to use generative AI in their teaching and assessment of a course.
Professor of Computer Science Michael Madden teaches AI to his students at the university, but he said he does not do so until they are at a relatively senior level in their studies.
"When students understand principles well and they understand what they want to achieve and the capabilities of the programming languages and how they could approach and design and manage things, then we can see, 'Ok, then how can you use generative AI to improve your productivity?'" Dr Madden said.
"The whole goal of any assignment is that these are artefacts that the students generate that reflect the student understanding, so whenever a student tries to bypass that and just produce the output without having the understanding, then that's obviously a problem.
"Right now, that's happening in universities across the world with tools like ChatGPT and others.
"The source of the solution isn't the problem, it's the lack of learning by the student," he added.
As AI becomes another digital tool in many people's professional lives, students may need to know how and when to use it.
Students have mixed feelings on using AI
Seán de Búrca, Education Officer at the University of Galway, said that the absence of a clear AI policy for students "keeps coming up".
"What we see is in some modules, students are told to use AI other ones, they're told, can't use it," he said.
"The result there is that students can end up using AI in the module where they're not supposed to," Mr de Búrca said.
Students at the University of Galway said they had different experiences of using AI, depending on what they were studying.
A student of mathematics and computer science said she was encouraged to innovate with AI by some lecturers and told to steer clear by others.
Other students were warned against using AI and told they would be caught if they did.
A commerce student said that he wanted AI to be taught in his course because he would have to use it when he joined the workforce.
A psychology student said that she feels the sustainability questions around AI are not discussed enough.
"Nobody's talking about the harm that it's doing to the environment and about like ethics," she said.
"Especially in psychology, you have to have real social skills and you have to provide ideas that are individual for every patient and every human being."
A student of English and journalism said that he was given the option to do an assignment with an AI app but the assignment "felt pointless" as generative AI apps are designed to be easy to use.
Another student agreed and said it was hard to "wrap the head around" how he could be graded on something that is meant to be as user-friendly as possible.
When the students were asked if they wanted to use AI when they began working, most said no.
One commerce student, however, said that he did because it would allow him to do better work.
"For universities trying to push students away from AI, it'd be like a trade school trying to push a carpenter away from a saw," he said.
"It's going to be what's required in the future," he added.