South African lawyers use ChatGPT to argue case — get nailed after it makes up fake info

Lawyers arguing a case at the Johannesburg regional court have been called out in a judgement for using fake references generated by ChatGPT.

“The names and citations are fictitious, the facts are fictitious, and the decisions are fictitious,” said the judgement.

According to the Sunday Times, the judgement also saddled the lawyers’ client with a punitive costs order.

“When it comes to legal research, the efficiency of modern technology still needs to be infused with a dose of good old-fashioned independent reading,” said Magistrate Arvin Chaitram of the situation.

The case in question involved a woman who was suing her body corporate for defamation.

The counsel for the trustees of the body corporate argued that a body corporate could not be sued for defamation. Counsel for the plaintiff, Michelle Parker, responded that earlier judgements had already answered the question; they had just not had time to access them.

Magistrate Chaitram postponed the case to late May to give both parties ample time to source the information they needed to prove their cases.

In the two months that followed, the lawyers involved in the case tried to track down the judgements the plaintiff’s counsel had referred to.

Instead, they found that although ChatGPT had referred to actual cases and given real citations, the citations related to different cases than the ones named.

Additionally, these cases and citations were not at all appropriate to defamation suits between body corporates and individuals.

It was then admitted that the judgements had been sourced “through the medium of ChatGPT.”

Chaitram ruled that the lawyers had not intended to mislead the court — they were “simultaneously simply overzealous and careless.”

This meant no further action was taken against the lawyers beyond the punitive costs order.

“The embarrassment associated with this incident is probably sufficient punishment for the plaintiff’s attorneys,” said Chaitram.

ChatGPT’s made-up cases

South Africa is not the first country where lawyers have uncritically relied on ChatGPT to supplement their work.

In the US, lawyers behind a court brief full of false case citations from ChatGPT were fined $5,000 (R93,045) last month.

US District Judge P. Kevin Castel found the lawyers — Steven Schwartz and Peter LoDuca — and their firm had “abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question.”

Schwartz and LoDuca had filed the brief on behalf of Roberto Mata — a client who claimed he had been injured on a 2019 flight from El Salvador to New York. Mata’s suit was thrown out of court, too, as it was filed too late.

Castel also ordered the lawyers to send a transcript of the hearing, and Castel’s opinion, to each of the judges falsely identified as authors by ChatGPT.

“The Court will not require an apology from respondents because a compelled apology is not a sincere apology,” added Castel.

“Any decision to apologise is left to respondents.”

