In a brief July 10 decision in Lloyd’s Register Canada Ltd. v. Choi, 2025 FC 1233, Federal Court Justice Simon Fothergill ordered the respondent, Munchang Choi, to pay the applicant, Toronto-based Lloyd’s Register Canada Ltd., $500 in all-inclusive costs after Choi, a former Lloyd’s Register employee, cited a non-existent, apparently AI-generated case, which he characterized as an error.
“The undeclared use of AI in the preparation of documents filed with the Court, particularly when they include the citation of non-existent or ‘hallucinated’ authorities, is a serious matter,” wrote Justice Fothergill.
“Removal of the abusive Motion Record from the Court file,” he added, “is necessary to preserve the integrity of the Court’s process and the administration of justice.”
Although the applicant, Lloyd’s Register, did not request costs on a solicitor-client basis, Justice Fothergill referred to the Federal Court of Appeal’s 2004 decision in NM Paterson & Sons Ltd. v. The St. Lawrence Seaway Management Corp., 2004 FCA 210 to conclude that a “party who assists the court in ensuring the orderly administration of justice should not have to suffer costs.”
Lloyd’s Register Canada is the Canadian branch of U.K.-based Lloyd’s Register, a global professional services company specializing in engineering, certification and compliance services for the maritime and offshore industries.
According to facts detailed in the decision, Choi also relied on AI-generated research in the underlying Canada Industrial Relations Board (CIRB) case, Choi v. Lloyd’s Register Canada Ltd., 2024 CIRB 1146.
Choi worked as a senior marine surveyor for Lloyd’s Register until March 2023, when his employment with the company was terminated. He filed a complaint with the CIRB after facing disciplinary action, including a written warning and termination, which he claimed were reprisals for raising safety concerns, refusing unsafe work and filing grievances.
The CIRB dismissed Choi’s reprisal complaint as untimely and provided guidance on using AI for legal submissions, stressing the need for parties to verify AI-generated content for accuracy and reliability. CIRB vice-chairperson Jennifer Webster, who presided over the case, noted that Choi had misrepresented more than 30 legal authorities and principles in his reply submissions to the board.
In a Federal Court action challenging the CIRB decision, Lloyd’s Register Canada applied pursuant to Rule 74 of the Federal Courts Rules, SOR/98-106, for an order removing the motion record filed by Choi in March of this year from the court file on the grounds that it was scandalous, frivolous, vexatious and otherwise an abuse of process.
Choi had cited the case Fontaine v. Canada, 2004 FC 1777, which does not exist, in his motion record. He told the court that he had intended to refer to Fontaine v. Canada (Attorney General), 2012 ONCA 206 and had made a mistake in writing the citation.
Describing himself as a self-represented litigant with mental health issues, he asked the court not to remove his motion record from the court file and to refrain from imposing punitive measures.
But the explanation garnered little sympathy from the court or the applicant.
“The Respondent’s explanation for citing a non-existent case that was apparently generated by AI is unsatisfactory and cannot be accepted by the Court,” wrote Justice Fothergill.
“The Applicant,” he added, “says that the explanation provided by the Respondent for citing a non-existent case defies belief.”
While the decision is unfortunate, it appears reasonable given the factual context, said Jennifer Leitch, executive director of the National Self-Represented Litigants Project (NSRLP), based at the University of Windsor Faculty of Law, and associate director of the Ethics, Society & Law program at Trinity College at the University of Toronto.
Considering the challenges associated with self-representation, it’s not surprising that people would turn to AI for assistance, she told Law360 Canada in an email.
The NSRLP estimates that about 50 per cent of civil cases in Canada now involve self-represented litigants (SRLs), with the trend driven in large part by spiralling legal costs.
But the issue of AI hallucinations in legal research goes beyond self-represented litigants, said Leitch.
“Lawyers, like the ones in the Ko v. Li decision, are also including hallucinated cases with less excuse,” she noted.
Ko v. Li, 2025 ONSC 2766, is a landmark Ontario Superior Court case highlighting the risks of AI hallucinations in legal practice.
“In this sense,” said Leitch, “I would note the difference between an SRL who relies on AI-generated misinformation inadvertently (and with little other choice) and one who is repeatedly and recklessly relying on it notwithstanding being aware of the problems.”
She said there is a great and growing need for public legal education around the use of AI so SRLs are not relying on misinformation, and hallucinated cases are not inadvertently finding their way into court materials. The NSRLP is now in the process of developing such an education program, she added.
Counsel for the applicant, Lloyd’s Register Canada Ltd., was Michelle Lahey, an associate with Halifax-based Cox & Palmer. She was not able to provide a comment prior to press time.
If you have any information, story ideas or news tips for Law360 Canada, please contact John Schofield at john.schofield1@lexisnexis.ca or call 905-415-5815.