AI use in Canadian courtrooms carries risk of errors, penalties: lawyers

By favofcanada.ca | December 31, 2025

In the past, if a client who usually preferred to communicate via short emails suddenly sent a lengthy message akin to a legal memo, Ron Shulman would suspect they’d received help from a family member or partner. Now, the Toronto family lawyer asks clients if they’ve used artificial intelligence.

And most of the time, he says, the answer is yes.

Almost every week, his firm receives messages written or driven by AI, a shift Shulman says he noticed in the last several months.

While AI can effectively summarize information or organize notes, some clients seem to be relying on it “as some sort of a super intelligence,” using it to decide how to proceed in their case, he said.

“That forms a significant problem,” since AI isn’t always accurate and often agrees with whoever is using it, Shulman said in a recent interview.

Some people are now also using AI to represent themselves in court without a lawyer, which can delay proceedings and escalate legal costs for others as parties wade through reams of AI-generated materials, he said.

As AI infiltrates more and more aspects of daily life, it is increasingly making its way into the courts and legal system.

Materials created with platforms such as ChatGPT have been submitted in courts, tribunals and boards across Canada and the United States in the last few years, at times landing lawyers or those navigating the justice system on their own in hot water over so-called “hallucinations” – references that are incorrect or simply made up.

In one notable case, a Toronto lawyer is facing a criminal contempt of court proceeding after including cases invented by ChatGPT in her submissions earlier this year, then denying it when questioned by the presiding judge. In a letter to the court months later, the lawyer said she misrepresented what happened out of “fear of the potential consequences and sheer embarrassment.”

AI hallucinations can come with a financial cost as well as a reputational one.

In the fall, a Quebec court imposed a $5,000 sanction on a man who turned to generative AI to help prepare his filings after parting ways with his counsel. Not long after, Alberta’s top court ordered additional costs of $500 against a woman whose submissions included three fake authorities, warning that self-represented litigants could expect “more substantial penalties” in the future if they didn’t abide by the court’s guidance on AI.

Courts and professional bodies in several provinces have issued guidelines on the use of AI, with some – including the Federal Court – requiring that people declare when they have used generative models.

Some lawyers who have used or encountered AI in their work say it can be a helpful tool if deployed judiciously, but when used improperly, it can compromise privacy, bog down communication, erode trust and rack up legal costs, even when no financial penalties are imposed.

Ksenia Tchern McCallum, a Toronto immigration lawyer licensed to practice in both Canada and the U.S., said she’s seeing more people come in with research or even completed applications done with AI that they then want her to review.

At other times, clients are using AI to “fact check” her, running documents she’s prepared through a platform, potentially exposing their personal information as well as undermining their confidence in her work, she said.

“It can put a lot of strain on client relations because if I’m instructing my client to do something and they’re second-guessing me or telling me, ‘Well, I don’t think I need to or why do I need to do this?’ and they’re fighting back … then how am I supposed to represent you and your best interest?” McCallum said.

“AI can scout the internet and tell you typically what’s part of this process, but my experience and my knowledge of what works and doesn’t work in these processes is what the AI is not going to be able to catch.”

Online forums for those dealing with immigration issues also encourage people to use AI to prepare filings and save on legal fees, she said.

“They submit that material, and then the court’s like, ‘OK, we see that you used AI, you didn’t disclose it. But not only did you not disclose it, you’re actually referring to cases that don’t exist, you refer to pathways that don’t exist, you’re citing law that’s not relevant,’” McCallum said.

“People are actually getting costs awarded against them because they’re coming to court self-represented, thinking that AI is going to draft these beautiful factums for them, but without knowing that this is not what’s supposed to happen.”


Trying to save money through AI can sometimes have the opposite outcome, said Shulman, the family lawyer.

A client recently sent over five or six pages of AI-written material on exclusive possession – the right of a married spouse to remain in the matrimonial home – essentially directing the firm to include it in court submissions, he said. The problem? The client wasn’t married, so it didn’t apply.

“You’ve just spent half an hour … of fees to read something (when) it’s no good to start with,” he said.

Shulman said he now has a basic disclaimer he gives clients, letting them know he has to read everything they send. He also encourages clients to ask him to explain legal concepts rather than turning to AI — or at least let him show them how to use AI more effectively.

There is an appetite for this kind of guidance and information, said Jennifer Leitch, executive director of the National Self-Represented Litigants Project, an organization that advocates for and develops resources for self-represented litigants.

The organization held a webinar last month to help those without a lawyer use AI appropriately and safely in their cases, drawing some 200 people, she said, adding more sessions are planned for the new year.

Leitch said she views it almost as a form of harm reduction: “People are going to use it, so let’s use it responsibly.”

Her advice includes checking any cases referenced by AI to make sure they exist and are quoted correctly, looking up the court’s guidance on AI and making sure to stay within the length limits for filings.

AI has the potential to improve access to justice by allowing people to tap into a wealth of information and assistance in organizing their case, but currently it’s “a bit of a Wild West,” particularly when it comes to reliability, Leitch said.

“For lawyers in law firms, there’s amazing AI programs that help with practice management, research, drafting, but they’re all sort of behind paywalls,” she said.

“But the stuff that’s out there open source is … less reliable, you run those risks of hallucination and mistakes, etc., that aren’t there in the programs behind the paywalls.”

Law firms will need to use some form of AI in order to be competitive, said Nainesh Kotak, a personal injury and long-term disability lawyer based in the Toronto area.

The key is having lawyers review and correct what AI produces, as well as ensuring compliance with privacy and data security rules and professional regulations, he said.

Ultimately, he said, AI is a tool, and it can’t replace legal judgment, ethical obligations and human understanding.
