Ad/Marketing Communication
This legal article/report forms part of my ongoing legal commentary on the use of artificial intelligence within the justice system. It supports my work in teaching, lecturing, and writing about AI and the law and is published to promote my practice. Not legal advice. Not Direct/Public Access. All instructions via clerks at Doughty Street Chambers. This legal article concerns AI Law.

Introduction and Fake Cases
Court users’ reliance on AI-generated “hallucinations”, or fake AI cases, is frequently reported and tracked on this site. I am also trying to track governmental use, AI bias/discrimination, and judicial reliance on AI in courtrooms. Given the increased use of generative AI, it was perhaps inevitable that a significant error involving fake cases would eventually influence a judicial order. For this reason, the case of Shahid v Esaam (2025) merits close attention. The decision can be read on FindLaw here, and my summary is below.
Background of Shahid v Esaam (2025)
The case began conventionally enough. The Wife petitioned to reopen her divorce proceedings and set aside the final judgment, arguing that service by publication was improper. She claimed that her former Husband had not exercised proper diligence to locate her before resorting to service by publication. Initially, the trial court denied the Wife’s petition to reopen the case.
Discovery of Fake Cases on Appeal
However, significant allegations emerged regarding the trial court’s decision, as its order relied upon fictitious case law:
“Wife points out in her brief that the trial court relied on two fictitious cases in its order denying her petition, and she argues that the order is therefore, “void on its face.”… Specifically…[husband’s attorney] … cited four cases: two were fictitious, likely generated by artificial intelligence (“AI hallucinations”), and the other two were irrelevant to the argument presented.”
The Husband did not address this issue in his brief; instead, he compounded the problem:
“Undeterred by Wife’s argument that the order (which appears to have been prepared by Husband’s attorney, …) is “void on its face” because it relies on two non-existent cases, Husband cites to 11 additional cites in response that are either hallucinated or have nothing to do with the propositions for which they are cited. Appellee’s Brief further adds insult to injury by requesting “Attorney’s Fees on Appeal” and supports this “request” with one of the new hallucinated cases.”
How the Trial Court Relied on Fake Cases
The appellate court then discussed how the trial court came to rely on these problematic citations, without independently verifying their authenticity:
“We are troubled by the citation of bogus cases in the trial court’s order. As the reviewing court, we make no findings of fact as to how this impropriety occurred, observing only that the order purports to have been prepared by Husband’s attorney, …. We further note that Lynch had cited the two fictitious cases that made it into the trial court’s order in Husband’s response to the petition to reopen, and she cited additional fake cases both in that Response and in the Appellee’s Brief filed in this Court.”
Appellate Court’s Decision and Penalty
The Court of Appeals concluded by vacating the trial court’s flawed order and remanding the case for a fresh hearing. Because of the reliance on fictitious case law, the appellate court determined it could not properly review the original decision. Moreover, recognising the severity of the misconduct, the appellate court imposed a $2,500 penalty against the Husband’s attorney, highlighting the seriousness of using fabricated legal citations. The superior court was specifically directed to conduct a new hearing on the Wife’s petition, giving her the opportunity to fairly present her case without reliance on non-existent authorities.
Comment
I note that the appellate court did impose a penalty on the attorney, but didn’t appear to take a similar approach, at least in this particular judgment, towards the trial judge who had relied on fictitious cases in a court order. It’s hard to say whether other appellate courts would respond in the same way, but I do wonder whether we’ll start to see more of these issues appearing in judicial decisions, especially where judges don’t have the benefit of legal submissions to assist with verification.
These kinds of cases raise some tricky questions. Should judges be held to the same standard as lawyers when it comes to checking sources? Will the growing use of AI in judgments lead to new expectations, or even new types of oversight? I’m not sure, but they’re questions I find myself coming back to as I keep an eye on how courts around the world are starting to engage with these tools.
I have added this case to my Judicial AI Use Tracker, which seeks to track how judges are using AI globally, so please do continue to send me any cases in which you learn AI is being used by the judiciary.
This article is part of my broader legal commentary available through my Substack newsletter.
Subscribing ensures you receive immediate updates, in-depth analysis, and exclusive legal insights as they are published.
➔ [Subscribe here to stay informed].
Editor’s Note (8 July 2025):
The title and wording in this post were updated to clarify that the court did not find the fictitious cases were AI-generated, but only acknowledged the possibility. I’ll leave those reading the full judgment to arrive at their own view.
