Written By Benjamin Reingold and Stephen Burns
The rise of artificial intelligence (AI) has revolutionized industries, including the legal field. However, its benefits come with challenges, one of which is "AI hallucinations" — instances where an AI system generates false, misleading or fabricated information that appears credible. These hallucinations have begun to surface in Canadian legal proceedings, including in materials submitted to the Federal Court of Canada.
This blog is a timely update to our recent blog, Requirements and Guidelines From Canadian Regulators, Public Bodies and Courts for the Use of Artificial Intelligence.
Guidance from the Federal Court
As described in our prior blog, the Federal Court issued guidelines on May 7, 2024, requiring parties to inform the Court and each other if documents submitted to the Federal Court for the purpose of litigation include content created or generated by AI.
The Court thereafter issued Amended Consolidated Practice Guidelines on June 20, 2025, which refer back to the May 7, 2024, guidelines and caution parties that "[f]ailure to comply with the Notice may result in consequences for parties and/or counsel, including the imposition of an adverse cost award or an order to show cause why the party or counsel in question should not be held in contempt." Such consequences have manifested themselves in recent case law.
Hallucinations in Federal Court Jurisprudence
In Hussein v Canada, 2025 FC 1060, the Federal Court criticized the Applicant's counsel for submissions containing AI hallucinations. The problems came to light when the Court could not locate several authorities cited in the Applicant's materials and issued several directions requiring the Applicant to furnish the Court with a Book of Authorities.
Thereafter, Applicant's counsel revealed that they had used Visto.ai, a research tool designed for Canadian immigration and refugee law practitioners. Associate Judge Moore found that the AI tool had hallucinated two non-existent cases and mis-cited an existing case for a proposition that it did not stand for.
The Court recognized that the "use of generative artificial intelligence is increasingly common and a perfectly valid tool for counsel to use" but, after citing the May 7, 2024, guidelines, found that "its use must be declared and as a matter of both practice, good sense and professionalism, its output must be verified by a human."
In closing, Associate Judge Moore stated that "counsel’s reliance on artificial intelligence was not revealed until after the issuance of four Directions" which "amounts to an attempt to mislead the Court", and "that consideration should be given as to whether it would be appropriate to direct Applicant's counsel to pay any costs awarded on the motion personally".
In a follow-up decision, Associate Judge Moore ordered costs personally against Applicant's counsel in the amount of $100 (Hussein v Canada, 2025 FC 1138).
In another case, the Court ordered that a self-represented litigant's motion record be removed from the Court file for citing a fake case, holding that this sanction was "necessary to preserve the integrity of the Court's process and the administration of justice" (Lloyd's Register Canada Ltd v Munchang Choi, 2025 FC 1233).
Takeaways
The Hussein decision reports that a law-specific AI tool hallucinated case law. This finding, together with the fact that costs were awarded against the Applicant's counsel personally, underscores the importance of having a "human in the loop" to carefully check AI tool output.
Organizations should remain vigilant and mitigate the risks of relying on generative AI by: (i) ensuring all output is reviewed for accuracy; and (ii) implementing robust governance and risk management practices relating to the use of AI.
If you have any questions about how your organization may use and implement AI, we invite you to contact one of the authors of this article.
Please note that this publication presents an overview of notable legal trends and related updates. It is intended for informational purposes and not as a replacement for detailed legal advice. If you need guidance tailored to your specific circumstances, please contact one of the authors to explore how we can help you navigate your legal needs.
For permission to republish this or any other publication, contact Amrita Kochhar at kochhara@bennettjones.com.