Hallucinations

By Connie L. Braun

Law360 Canada (June 19, 2023, 12:17 PM EDT) --
In recent months, the world has seen a great deal of news and discussion about the use of generative artificial intelligence (AI) tools such as ChatGPT in various professions, including legal practice. A lawyer in New York, for example, found himself in trouble with the court after using ChatGPT to prepare a federal court filing in which the tool invented the cases cited in support of his argument. The lawyer had simply accepted the case citations and judicial opinions that ChatGPT produced.

Using standard legal resources, opposing counsel and the judge soon discovered that the cases and judicial opinions did not exist. When asked to provide copies of the cases, the lawyer was unable to produce them. His failure to verify and validate the faked cases against standard legal resources demonstrates laziness and a willingness to rely on what proved to be a mirage. In the world of ChatGPT, such fabricated output is known as a “hallucination”: confident, persuasive writing that is nonetheless inaccurate and misleading.

ChatGPT is built on deep-learning techniques, using complex algorithms and natural language processing. It is not, however, sentient; it cannot express emotions or motivations, and it is not designed to replace human writers. What it will do is generate text that is grammatically correct, appears contextually relevant and is engaging. And, as the lawyer in New York learned, inventive.

ChatGPT was introduced late in 2022 as one of a growing family of generative AI technologies. While a conversation with a tool like ChatGPT can feel normal and very real, the technology is surprisingly inaccurate, often inventing facts and citing sources that do not exist. Even so, this understanding has not stopped people of all ages, professions, and callings from experimenting with the technology, often insisting that it is reliable, dependable and unfailing.

ChatGPT and other similar AI tools are already being used in the practice of law. As with any tool, especially one that is relatively new, great care must be taken to ensure that it remains only a small part of the work being undertaken. Relying solely on such a tool to provide the truth, to supply an accurate interpretation or to support an argument is a serious mistake. Failing to validate facts, citations and commentary when the appropriate resources are available is unacceptable behaviour in the legal world. Humans need to review every word, sentence and argument that these tools generate, verifying authenticity.

Under the guidance of practising lawyers and other legal professionals, law students begin to hone their skills at conducting research, writing legal documents and developing legal arguments. Validating facts, citations and commentary is key to ensuring that the cases identified represent good law and support a client’s case. This kind of validation is among the earliest tasks law students learn and perform in summer jobs at firms or in the courts. It is this apprenticeship that goes a long way toward preparing students to practise law successfully and well.

Imagine, then, the individual who chooses to self-represent in court. It is risky, yet it happens with increasing frequency because the cost of hiring a lawyer is too high. Access to justice is difficult enough under these circumstances, even before considering whether such individuals can find legal resources or use them competently.

Some resources can help, though they may not be enough. In Alberta, individuals may take advantage of the services and collections of resources that the Alberta Court Law Libraries provide to all residents. These services include assistance in finding and researching cases, locating forms and precedents, and identifying legislation. Similar services are available in other jurisdictions throughout Canada. The Canadian Judicial Council provides guides to assist self-represented individuals. None of these services, however, provides legal advice.

For the self-represented individual, the temptation to use a tool like ChatGPT will be immense: the content it produces feels right, reads well and makes arguments that seem compelling. That is precisely what makes the “hallucinations” so seductive.

Going down this path is dangerous because individuals without legal training or knowledge of legal processes, let alone an understanding of the law, will want to believe what is presented to them, no matter how incorrect or inaccurate it may be. It does not take much imagination to see that many self-represented individuals will fail in their efforts because the arguments generated by a tool like ChatGPT are simply wrong.

The fact that the New York lawyer, who has the legal education and the ability to use the available resources competently, chose not to verify and validate the citations and judicial opinions shows poor judgment on his part. Ignorance is not an excuse in this situation. For the self-represented individual, stories like this should be a warning to proceed with caution. No one, legal professional or self-represented individual, can afford to be swayed by the inaccuracies and made-up facts that hallucinations present.

Lives may depend on it.

Connie L. Braun is a product adoption and learning consultant with LexisNexis Canada.
 
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada, or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.  

Photo credit / Marcio Binow Da Silva ISTOCKPHOTO.COM 

Interested in writing for us? To learn more about how you can add your voice to Law360 Canada, contact Analysis Editor Peter Carter at peter.carter@lexisnexis.ca or call 647-776-6740.