Universities in South Africa must get a handle on AI

Durban University of Technology lecturer Siphumelele Zondi says universities in South Africa should create policies regarding their students’ use of artificial intelligence (AI).
Speaking to 702, he said the policies would need to balance preventing students from using AI to cheat with allowing them to use it for appropriate purposes.
“Universities need to create AI policies, because at this present moment it’s being treated as plagiarism,” said Zondi.
He explained that some students attempt to use AI to write papers for them, which constitutes plagiarism because they present an AI model's output as their own work.
“There are also good ways of working with AI; maybe universities need to determine what those are,” he added.
He suggested that one such beneficial use could be letting students use AI to assist them during the ideation stage of their assignments, rather than using it to write their entire papers.
“After the ideation process, then go and do the research themselves and then go figure out how they can alter whatever AI would have given to them as an idea or in the ideation process,” he said.
Zondi warned students to be careful when using the technology, as it tends to make things up and present them as facts.
“I always caution students to be careful because AI can have hallucinations,” he said.
“Because AI would seldom say ‘I do not know this thing’, it will make up something.”
Zondi also warned that AI will get smarter, and if universities can’t control the technology’s use, they risk allowing students who have gained little knowledge of their subject matter to graduate.
“They use AI to pretend that they have knowledge of a particular thing when they’re writing a paper, then go out into the world without much knowledge,” he said.
“They could go into a profession where they could be dealing with people’s lives.”
A significant threat to degree and diploma credibility

According to Andy Carolin, an associate professor at the University of Johannesburg’s Department of English, universities’ ability to detect AI in students’ work may not be as good as it appears.
He said AI detectors, such as Grammarly or Scriber, only provide a surface-level indication of possible AI use.
Carolin warned that many students don’t even attempt to engage with their course’s prescribed material and instead opt to go straight to AI.
“I think that we’re much closer to a crisis than many of us are willing to acknowledge, and it’s worth pointing out that this is not fear-mongering,” he said.
“I have no doubt students use ChatGPT to submit written work, but what sets the large language model apart from others is that there is just no way to prove it.”
Carolin said many students are graduating with degrees without ever having been taught or assessed on critical thinking.
“We risk an ever-increasing number of students who hold certificates that fraudulently certify their mastery of skills and content knowledge that some may have only barely attempted,” he said.
Carolin suggested that having examinations written in person at dedicated venues is one way to maintain academic integrity.
Many tertiary institutions have not reverted to these examination formats since the Covid-19 pandemic.