ChatGPT cannot be a ‘panacea’ for access to justice, LCO webinar speakers note

By Amanda Jerome

Law360 Canada (April 4, 2023, 1:45 PM EDT) -- The benefits and pitfalls of ChatGPT were highlighted in a Law Commission of Ontario (LCO) webinar, where speakers discussed the technology’s potential uses, ethical obligations and the impact on access to justice. However, as speaker Amy Salyzyn noted, ChatGPT cannot be a “panacea” for access to justice because “lawyering is more than just generating language.”

“There’s a human element, there’s an emotional intelligence element,” she stressed, noting that especially vulnerable people need a human lawyer.

The webinar, entitled “The ChatGPT Lawyer: Opportunities and Risks of Generative AI for the Legal Profession,” was hosted by the LCO in partnership with the University of Ottawa Centre for Law, Technology, and Society on March 8. The speakers included LCO counsel Ryan Fritsch; Salyzyn, an associate professor in the Faculty of Law, Common Law, at the University of Ottawa; Colin Lachance of Jurisage and AltaML; and Daniel W. Linna Jr., a senior lecturer and director of Law and Technology Initiatives at Northwestern Pritzker School of Law and McCormick School of Engineering. The panel was moderated by LCO executive director Nye Thomas.

Amy Salyzyn, University of Ottawa

ChatGPT, an AI-driven natural language processing tool, is being used in a variety of ways, Linna explained, noting it can “generate text in response to a prompt, revise writing, summarize text, generate ideas, answer questions in a convincing language — although it might not have a connection to the ground truth or to underlying facts, maintain a conversation,” as well as write and debug code.

He noted that “these large language models can be very helpful in analyzing, revising documents,” as well as “summarizing law, court opinions, contracts and other documents.”

ChatGPT has “use cases for legal research,” Linna added, noting it could be useful in the access to justice space.

Linna believes that the benefits of this technology can “transform the way we practice law.” However, he noted, people may question whether the technology will “eliminate” lawyers’ jobs.

“I think that’s the wrong question. The question is: when are we going to use these tools to transform the way that we think about legal services, legal systems, and rule of law in society?” he asked, highlighting “concerns about bias in these systems.”

“ChatGPT was trained on a huge corpus of language across the Internet. Some of that language is language that we don’t want [to be] part of our legal system,” he explained, noting that there needs to be “guardrails,” as well as further evaluation and testing.

These tools, Linna cautioned, can end up confined to the “norm in society,” and they should not work only for that norm.

“We want them to work for all groups across society,” he added, including in the development of a “chatbot that can be used for legal services.”

Fritsch believes there’s a “legitimate concern for access to justice in using these tools.”

Ryan Fritsch, Law Commission of Ontario

“Obviously, these tools are really set up because people have a lot of difficulty getting legal answers to everyday legal problems,” he said, pointing to Linna’s reflection that “these models are trained off language scraped off the Internet,” which “can reflect social biases.”

“There’s a danger, I think, in particularly vulnerable groups relying on these kinds of tools and the kinds of advice that they’re given,” he added, questioning how much of ChatGPT’s training “reflects the interests of women.”

“For example, if I asked ChatGPT what I can do if I’ve been a victim of assault, ChatGPT might not give me information that’s relevant to women; it might be generic information about, ‘well, you can go to the police and make a complaint.’ But if I’m a victim of domestic violence, I want to know where I can find a shelter. I want to know how I can extract myself from a violent relationship safely,” he explained, highlighting the gap between generic answers from ChatGPT and legitimate legal advice from a lawyer.

From a legal ethics standpoint, Salyzyn highlighted “professional obligations” that lawyers need to be aware of.

She noted that “lawyers have an obligation to provide competent legal services” and that a lawyer “simply opening up ChatGPT on their computer and using that to funnel legal advice directly to a client is going to get into a lot of trouble pretty fast.”

Confidentiality was another issue she emphasized, noting that information entered into ChatGPT is not private.

“Efficiency,” she noted, is a sometimes overlooked “ethical obligation for Canadian lawyers,” whose rules clearly state “an obligation to provide efficient legal services.” Salyzyn said there’s a potential “tipping point” at which lawyers could be obligated to use tools such as ChatGPT, if those tools become sufficiently reliable and help lawyers work faster.

“Lawyers are going to need to know which tools are reliable and which tools are good. If they’re going to start to incorporate that into their practice, they’re going to have responsibility over that. So, there’s a due diligence point there as well, that lawyers need to make sure that they know what they’re using when they’re using it,” she stressed.

Building on Salyzyn’s comments, Fritsch said there are also “corollary issues of fairness among all of the litigants, particularly self-represented litigants,” when it comes to the use of ChatGPT.

“I think there’s some concern that we could end up very soon with a set of two-tier tools: the free tools that are available online and tools that are paid that might be more tailored to addressing legal issues and providing better, more coherent and reliable answers,” he explained, noting that in such a scenario, self-represented litigants who could afford the better tools could show up to court “with better information, a stronger and possibly more compelling argument.”

The access to justice issue Fritsch sees in the ChatGPT conversation is making “tools available to all litigants.”

“There’s a massive access to justice gap in Canada. Millions of people cannot get answers to legal questions that arise in the everyday,” he explained, noting that while ChatGPT is “not as good” as a human, there’s a massive demand for this kind of technology and assistance.

ChatGPT, he noted, is “good at identifying and pointing you” in a general direction, which “has a lot of value for a lot of people.”

“Folks don’t know what their legal rights might be, they don’t know what ballpark they might be playing in with a particular concern. So, the fact that this thing is able to triage and act as a bit of an issue identification is really helpful,” he added.

Fritsch also sees ChatGPT being beneficial for “access to dispute resolution,” such as in the workplace accommodation context.

Fritsch concluded with “two points of warning.”

“The first is that the demand for tools like ChatGPT is inarguable and the need is also inarguable, but I don’t think we want to resile too far from ensuring that access to justice is both cheap, and fast, and good. We have to make sure that the advice that people are getting is still good and to a professional standard. Making tools like this available can’t be a tick box exercise,” he said.

The second, he noted, is the “danger to lay persons” using this technology “because ChatGPT is an information tool different from anything they’ve ever seen before.”

“When I go to Google and do a search for something, Google gives me results, not answers. And as a person, I can parse through those results, I can look at the context of who’s providing the information, who the author is. I can make choices about which information I rely on, and I can see related information. ChatGPT doesn’t give you results; it gives you answers. So, it doesn’t have context, it doesn’t automatically connect you to related areas,” he added, noting this “can result in people getting tunnel vision and over relying on the tool without knowing that there could be a broader issue there.”

If you have any information, story ideas or news tips for Law360 Canada, please contact Amanda Jerome at 416-524-2152.