
Another South African lawyer caught using AI has landed in big trouble

A case involving a junior advocate who used an artificial intelligence (AI) tool called Legal Genius has been referred to the Legal Practice Council for investigation after the chatbot hallucinated case law.

The case concerned the urgent release of a refining licence linked to a disputed business sale of Rappa Resources to Northbound Processing.

Northbound Processing brought an urgent application against the South African Diamond and Precious Metals Regulator to release the licence, saying it would suffer devastating commercial consequences otherwise.

Specialist digital, data, and technology law firm Michalsons Giles Inc said the case was significant for its outcome and because it highlighted the consequences of relying on fictitious authorities created by AI tools.

Michalsons associate, Refilwe Motsoeneng, said the case confirmed that once a licence is issued and communicated, it cannot be withheld without a proper legal process, which is important for managing risk in regulated sectors.

“The case provides clarity on how regulators must act once they make and communicate a decision,” said Motsoeneng.

“Unless a decision is formally challenged in court and set aside, it remains valid and must be acted on. This is important for any business that depends on government licences or approvals to operate.”

However, more interesting to the broader South African legal profession was that this was one of the first South African judgments to deal directly with the misuse of generative AI in legal proceedings.

“It reflects a growing trend in local and international courts. Judges are warning legal professionals about the risks of relying on unverified AI-generated content,” said Motsoeneng.

“This case reinforces the ethical duty to independently verify all legal sources before citing them. Even unintentional AI ‘hallucinations’ can cause reputational harm and lead to professional misconduct complaints.”

Judge confronts junior counsel

Screenshot of the Legal Genius website

In his ruling, acting judge DJ Smit reported that he had given Northbound’s legal team an opportunity to explain two non-existent cases that appeared in their heads of argument.

A junior advocate responded that “confusion arose from short-form citations used during drafting” and that different citations were “initially intended for inclusion in support of the relevant legal propositions”.

However, after the junior counsel removed the two offending paragraphs Smit had identified, an attorney acting for the opposition noted that at least two more incorrect citations remained.

When the judge asked the junior counsel directly whether these were AI hallucinations, he admitted using Legal Genius, a tool that claims to be "exclusively trained on South African legal judgments and legislation".

He also said the urgency of the application and the unavailability of the junior counsel who had initially drafted the heads of argument had caused severe time pressure.

The junior counsel apologised unreservedly and accepted full responsibility for the mistakes, but emphasised that there was no intent to mislead the court.

He also said the senior counsel's oral arguments did not rely on the non-existent cases and that no one was prejudiced by the errors in Northbound's heads of argument.

Northbound’s senior counsel, Arnold Subel, also apologised unreservedly for the oversight on behalf of the legal team. He said it was inconceivable that the citations had been identified with an AI tool.

He said he had an experienced legal team, which included two competent junior counsel, upon whom he believed he could, and indeed did, rely.

Subel said he only did a “sense-check” on Northbound’s heads before they were filed and did not have sufficient opportunity to check the accuracy of the citations.

He said he considered the propositions to which they related to be trite law that did not even require case law references.

While Smit ruled in favour of Northbound, he referred the case law hallucinations included in the heads of argument to the Legal Practice Council for investigation.

“In my view, it matters not that such cases were not presented orally, but were contained in written heads of argument,” Smit stated.

“Written heads are as important a memorial of counsel’s argument as oral argument and, for purely practical reasons, are often more heavily relied upon by judges.”

AI just a tool

Johannesburg High Court

Motsoeneng said the case was a reminder that large language models and other AI-powered services were merely tools.

“They cannot replace proper legal research or professional oversight. Courts expect high standards, especially in urgent applications where they rely on written submissions,” she said.

“Heads of argument are not secondary; they form part of the record and carry weight.”

This is the third publicly known case in South Africa where legal practitioners were found to have used large language models (LLMs) to help write court papers.

The first happened in 2023, when legal representatives for a sectional title scheme relied on ChatGPT to argue that bodies corporate could not be sued for defamation.

Because the technology was still relatively new, the legal team got away with a punitive costs order and was not referred to the Legal Practice Council.

The second occurred in 2025 in an appeal brought by Umvoti Mayor Godfrey Mavundla against a ruling regarding his suspension. Mavundla alleged his suspension was decided during an unlawful council meeting.

Judge Elsje-Marie Bezuidenhout found that only two of the nine cases cited in papers put before the court actually existed, adding that the citation for one of them was incorrect.

She concluded that Mavundla’s law firm had likely used generative AI to source the fake legal citations, describing the action as “irresponsible and downright unprofessional”.

As a result, she referred the matter to the Legal Practice Council for possible further action.

Judge Smit endorsed the reasoning in the Mavundla case and said that South African courts must bear in mind the provisions of Article 16(1) of the Code of Judicial Conduct.

“This provision obliges a judge with clear and reliable evidence of serious professional misconduct or gross incompetence on the part of a legal practitioner to inform the relevant professional body of such misconduct,” Smit said.
