Oksana Romanov
Think of generative AI for lawyers by analogy to content creation. Lawyers practising across different areas of law now have an opportunity to become content creators at another level. But this gift comes with a caveat: legal creators must be ethically and professionally responsible for the content they produce. Lawyers must curate legal outputs rather than delegate everything to AI.
My interest in generative AI stems from our shared duty as lawyers to maintain technical competence. While designing AI Hallucinations in Court: Canadian Case Law Update, an accredited introductory CPD course, I explored both the ethical considerations and practical applications of AI-enabled tools in the legal context.

Defining AI hallucinations
Simply put, AI hallucinations are errant, “non-existent or fake precedent court cases” or citations “fabricated by an AI platform” in response to a user prompt: Ko v. Li, [2025] O.J. No. 2197, at paras. 3 and 5, per Justice Fred Myers. In R. v. Chand, [2025] O.J. No. 2288, at para. 2, Justice Joseph Kenkel also described these errant cases as “fictitious.”
AI hallucinations can also be described as “false citations.” See Pacific Smoke International Inc. v. Monster Energy Co., [2024] T.M.O.B. No. 5211, at para. 16, citing Ghassai v. Industria de Diseno Textil, S.A., [2024] T.M.O.B. No. 5150, at para. 5.
They may also take the form of “non-existent judicial opinions with fake quotes and citations” created by an AI tool: Mata v. Avianca, Inc., 678 F. Supp. 3d 443, 448 (S.D.N.Y. 2023). In essence, these errant cases “mis-state or misrepresent the law to the court”: Ko v. Li, at para. 22. As courts and tribunals have emphasized, “[w]hether accidental or deliberate, reliance on false citations is a serious matter [see Zhang v. Chen, [2024] B.C.J. No. 305]”: Diseno Textil, at para. 6.
There are other definitions of AI hallucinations available. For example, the Ontario Bar Association offers one in its AI Glossary. You may benefit from adding this and other resources to your legal toolbox.
How to spot AI hallucinations
There are several ways to spot an AI hallucination in submissions. The list below is not exhaustive, but it is intended as a starting point.
1. The case is “on all fours” in terms of facts and legal issues. If it looks too good to be true, verify the citation.
2. The citation is either too generic or unconventional. Query it through one of the legal research databases available to you; a rough automated format check, sketched after this list, can help triage which citations to look up first.
3. The case is cited for a general proposition, with no specific paragraph listed. Find out what it stands for. As a best practice, case citations should include a pinpoint cite to the paragraph that illustrates the point being made. In Ontario, for instance, certain practice directions require counsel to hyperlink case law in electronically filed documents.
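For readers comfortable with a bit of scripting, the format check in point 2 can be partially automated. The Python sketch below uses a handful of illustrative citation patterns (my own simplified assumptions, not an exhaustive rule set) to flag strings that do not match a recognized Canadian citation convention. It cannot confirm that a case exists; it only triages which citations to query first in a legal research database.

```python
import re

# Illustrative patterns for a few common Canadian citation formats.
# These are simplified assumptions for demonstration, not a complete rule
# set: a format match means the citation is plausible, not that it is real.
CITATION_PATTERNS = [
    r"\d{4} [A-Z]{2,8} \d+",           # neutral citation, e.g., 2024 BCSC 285
    r"\[\d{4}\] \d+ S\.C\.R\. \d+",    # Supreme Court Reports
    r"\[\d{4}\] [A-Z]\.J\. No\. \d+",  # e.g., [2025] O.J. No. 2197
]

def citation_format_is_plausible(citation: str) -> bool:
    """Return True if the string matches one known citation pattern."""
    return any(re.fullmatch(p, citation.strip()) for p in CITATION_PATTERNS)

# Triage a draft's citations before verification.
for cite in ["[2025] O.J. No. 2197", "2024 BCSC 285", "Smith v. Jones, 123"]:
    if citation_format_is_plausible(cite):
        print(f"{cite}: format plausible - verify in a legal database")
    else:
        print(f"{cite}: unconventional format - query before relying on it")
```

Even a citation that passes such a check must still be pulled up and read in full: hallucinated cases routinely mimic valid citation formats.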
Best practices for using generative AI
Treat AI like an assistant. You may delegate tasks such as cite-checking, editing and preliminary research, but you would not delegate your responsibilities to someone untrained in the law, no matter how intelligent or enthusiastic they may be. AI-based tools must likewise be used ethically and responsibly: always supervise your AI assistant by verifying the final product. Ultimately, your professional reputation and your livelihood depend on the accuracy and integrity of your legal work.
Next-generation AI-powered legal tools provide legal research, document analysis, document drafting, practice tips and workplace integrations. These new technologies often incorporate built-in verification workflows that enable lawyers to trace the cited information, thereby enhancing the accuracy and reliability of their legal outputs.
Oksana Romanov, BA (Hons), MA (Comm), JD with Distinction, is a sole practitioner practising criminal law as the Law Office of Oksana Romanov.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.