AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations)

AI hallucination cases can, and do, reach real courtrooms and have real consequences. However, there is some dispute about the correct phrasing of this phenomenon:

“…Although the term used in relation to erroneously generated references by AI is "hallucinations", this is a term which seeks to legitimise the use of AI. More properly, such erroneously generated references are simply fabricated, fictional, false, fake and as such could be misleading...”
JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976 (Federal Court of Australia, 19 August 2025)

AI Hallucination Cases Tracker

Ad/Marketing Communication

UK‑based legal commentary and comparative analysis of international case law on AI related legal issues. This AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations) forms part of lecturing/teaching law and writing/editing law articles/reports and is communicated solely in connection with promoting or advertising Matthew Lee’s practice. Not legal advice. Not Direct/Public Access. All instructions via clerks at Doughty Street Chambers.

19 January 2026 Update on AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations)

Thank you for your patience while I update the AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations).
The latest entries and analyses will be added shortly. In the meantime, enter your email below to be notified when the update is complete and to receive regular insights and case summaries.

What you can see

You will see two things below: the AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations) and a set of Charts/Diagrams. The tracker lists the case reference (neutral citation), a link to the public judgment, and a short extract from that judgment.

What the Charts Show

The charts below give a rough picture of the data I’m currently analysing on AI and non-AI fabricated/false citations. They offer early insights into some of the key questions I’m exploring, including:

  • Did the judge cite the fabricated or false citation within the judgment?
  • What types of AI and non-AI fabricated/false citations are finding their way into court?
  • Are these AI and non-AI fabricated/false citations referring to case law, legislation, or something else?
  • Who is discovering the AI and non-AI fabricated/false citations?
  • Is this mainly occurring with litigants in person (pro se) or with lawyers?
  • What reasons are given for the AI and non-AI fabricated/false citations?
  • What features are aggravating or mitigating?

What’s in the Vault

These visuals are powered by my private research, which is not included in the publicly available AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations) below. This private research contains deeper analysis, richer tagging, broader categories, detailed timelines and considerable detail on fabricated/false citations.

I do not publish the underlying databases or research publicly. If you wish to discuss access to my private research, please contact my clerks (details above).

Important Note

This tracker is part of my ongoing research. It takes time to build and is updated frequently as I work through new judgments and revisit historic ones. For further analysis of these cases, I would suggest reading my regular commentary on AI Hallucination (AI and non-AI fabricated/false citations) cases globally. The latest is available [here]. Most cases are from the USA, but this is an international issue.

The aim of this AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations) is to gather broad data about how hallucinated and fabricated citations, whether generated by AI or otherwise, and whether proven or merely alleged, are appearing in courts and reports. As set out above, I am interested in questions such as: Why is this happening? What types of hallucinations are occurring? What reasons are given? What mitigating or aggravating features are raised? And do judges themselves cite these errors in their authentic judgments? I want to understand the broader international picture to advance AI law, an important area of my legal practice in England and Wales.

Judgments do not all describe these issues in the same way, and they do not always fit neatly within a label. To spot patterns and make study easier, I group what is recorded, advanced, or alleged into broad categories. This means the tracker will not always match the exact wording of the judgment and it may be incomplete or contain errors. For the detail in any particular case, always check the official judgment via the hyperlink. If the hyperlink takes you to one of my legal articles on this blog, you will be able to access the judgment from that legal article.

I also use AI tools and other online resources to help organise and maintain this AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations). For that reason, and because the task involves categorising complex material, you should always verify details against the judgment itself. If you notice anything that looks inaccurate, please let me know and I will review and update.

How To Read the Columns and the Charts

Actor
Based on the judgment, I record whether it was a lawyer, a litigant in person, or another participant connected with the alleged error. Different judgments use different wording, so I seek to place them into consistent groups to make the data comparable.

Hallucination Type
I have grouped the most common types of AI and non-AI fabricated/false citations into eight categories, explained in detail in a separate article. Where something does not fit perfectly, I place it in the closest type so it can be counted and compared across judgments.

Reasons
This records reasons either found by the court or advanced by a party. They are grouped into broad categories such as “time pressure” or “lack of training” so that different explanations can be studied together. Always check the judgment for the precise detail, as reasons are often more complex than the summary category suggests.

Mitigation
This includes both mitigating factors the court accepted and those advanced by a party, such as “acted honestly” or “tried to put it right”. They are grouped so we can see the range of issues raised across cases and compare how courts respond. Specific mitigation is always more nuanced than the summary (always check the judgment for the full detail).

Aggravation
This includes aggravating factors noted by the court or advanced in submissions, such as “repeated incident” or “ignoring guidance”. They are grouped into categories for comparison. Again, for the exact findings, always refer to the judgment.

Cited Fake Case
A simple yes/no marker showing whether the judge cited a hallucinated case/fabricated citation or principle within the judgment itself. Why this issue matters is explained here.

Research Upload in Progress – Thank You for Bearing with Me!

This tracker is being updated in real time, so some parts may look unfinished for a few days; please check back frequently. There’s also plenty to explore in the meantime:

  • The original tracker: still live and available below!
  • What are AI Hallucination Cases and the 8 Types: if you’d like to understand what a legal hallucination actually is and the different ways it appears, take a look at the 8 types I’ve identified in this article. These will form the backbone of the upgraded tracker.
  • Weekly write-ups (International): every week I summarise the most recent AI hallucination cases. The latest is available [here]. Most cases are from the USA, but this is an international issue.
  • The UK Position: the UK incidents are reported separately here.
  • Popular Posts: read all about this project and popular posts/AI trackers here.
  • Check out the FAQ below for common questions.
  • Subscribe: I send regular updates from Substack here direct to your email (It’s FREE). Be the first to receive important updates in AI law.

Thank you for your patience – the full launch is coming very soon. In the meantime, dive into the existing resources and see how the field is already evolving.

Key Stats Line & Charts

Total entries: [TBC]. Last updated: [22 November 2025].

AI Hallucination Cases Tracker Under Construction

Date | Case | Judicial Quotes
May 2023 | TBC (Lawgazette note) | The Gazette understands that the following day, the LiP returned to court with four case citations, each backing the point they were trying to make. On closer inspection by the barrister, it transpired that one case name had simply been fabricated, while the other three were real case names but with the relevant cited passages being completely different to the judgment in each. For all four citations, the paragraphs quoted were completely fictitious, though appearing completely legitimate.
June 2023 | Scott v Federal National Mortgage Ass’n (Me. Supe | "The Court is aware of recent incidents in the legal community involving filings generated in whole or in part by artificial intelligence, such as ChatGPT, that incorporate case citations and quotations which do not, in fact, exist. In the Court's view, blind reliance on artificial intelligence does not excuse misrepresentation of the law to the Court. Although this is especially true for attorneys, who certainly ought to know better than to submit a filing without verifying citations, pro se litigants must be held to the same standard"
June 2023 | Mata v Avianca 22-cv-1461 (PKC) | "Many harms flow from the submission of fake opinions. The opposing party wastes time and money in exposing the deception. The Court's time is taken from other important endeavors. The client may be deprived of arguments based on authentic judicial precedents. There is potential harm to the reputation of judges and courts whose names are falsely invoked as authors of the bogus opinions and to the reputation of a party attributed with fictional conduct. It promotes cynicism about the legal profession and the…judicial system. And a future litigant may be tempted to defy a judicial ruling by disingenuously claiming doubt about its authenticity."
June 2023 | Parker v Forsyth N.O. | "The Plaintiff's attorneys used this artificial intelligence medium to conduct legal research and accepted the results that it generated without satisfying themselves as to its accuracy. As it turned out, the cases listed above do not exist. The names and citations are fictitious, the facts are fictitious, and the decisions are fictitious. The Plaintiff's counsel was constrained to concede as much." [87] A Chaitram, Regional Magistrate
July 2023 | Ex parte Lee
September 2023 | Ruggirello v. Lancaster | These and other fabrications within [name's] objections may be from [name's] imagination, a generative artificial intelligence tool's hallucination, both, or something else entirely. The Court need not speculate.
October 2023 | Thomas v Pangburn 2023 WL 9425765
October 2023 | Morgan v. Community Against Violence | "Although courts “make some allowances for the pro se Plaintiff’s failure to cite to proper legal authority,” courts do not make allowances for a Plaintiff who cites to fake, nonexistent, misleading authorities"
November 2023 | re Celsius Network LLC | There were no standards controlling the operation of the artificial intelligence that generated the Report. The Report contained numerous errors, ranging from duplicated paragraphs to mistakes in its description of the trading window selected for evaluation
November 2023 | Mescall v. Renaissance at Antiquity | Use of artificial intelligence to write pleadings is a novel issue, and appears to be untread territory in the Fourth Circuit. However, recent caselaw from outside of this jurisdiction supports the common-sense conclusion that the use of artificial intelligence creates challenges, raises ethical issues, and may result in sanctions or penalties when used inappropriately
November 2023 | Whaley v. Experian Information Solutions, Inc. | Plaintiff admits that he used Artificial Intelligence (“AI”) to prepare case filings....The Court reminds all parties that they are not allowed to use AI—for any purpose—to prepare any filings in the instant case or any case before the undersigned. ...Both parties, and their respective counsel, have an obligation to immediately inform the Court if they discover that a party has used AI to prepare any filing. Id. The penalty for violating this provision includes, inter alia, striking the pleading from the record, the imposition of economic sanctions or contempt, and dismissal of the lawsuit. Id.
November 2023 | People v Crabill
December 2023 | Harber v HMRC | “But that does not mean that citing invented judgments is harmless. It causes the Tribunal and HMRC to waste time and public money, and this reduces the resources available to progress the cases of other court users who are waiting for their appeals to be determined. As Judge Kastel said, the practice also "promotes cynicism" about judicial precedents, and this is important, because the use of precedent is "a cornerstone of our legal system" and "an indispensable foundation upon which to decide what is the law and its application to individual cases"
January 2024 | Vanguard Construction & Development Co. v 400 Times Square Associates, LLC (2025)
January 2024 | Will of Samuel
January 2024 | Park v Kim
February 2024 | Zhang v Chen | Masuhara J. held: "...Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to a miscarriage of justice..."
February 2024 | Kruse v Karlen
February 2024 | Moffatt v Air Canada
February 2024 | Smith v Farwell
March 2024 | Grant v City of Long Beach 96 F.4th 1255
March 2024 | US v Cohen
April 2024 | Plumbers & Gasfitters Union v. Morris Plumbing | The citation goes to a case of a different name, from a different year, and from a different circuit. Court staff also could not locate the case by searching, either on Google or in legal databases, the case name provided in conjunction with the purported publication year. If this is, as the Court suspects, an instance of provision of falsified case authority derived from artificial intelligence, Plaintiffs' counsel is on notice that any future instance of the presentation of nonexistent case authority will result in sanctions.
May 2024 | Dowlah v. Professional Staff Congress | "Plaintiff cites several nonexistent cases in his initial memorandum of law. In his reply brief he acknowledges that these citations were the result of research using "legal software applications" that deploy artificial intelligence. Plaintiff avers that he has an LLM (among other advanced degrees) but not much "legal expertise" and he apologizes for the fictitious precedents. We caution plaintiff that his pro se status does not excuse"
June 2024 | X BV in Z v. Tax Inspector | TBC - Translation Required
July 2024 | Zeng v. Chell | "It is unclear whether this is another instance of the use of artificial intelligence. Were it not for the plaintiff's pro se status and the proper judgment dismissing this case, a further inquiry would be appropriate"
July 2024 | Lakaev v McConkey | "When artificial intelligence is used to generate submissions for use in court proceedings, there is a risk that the submissions that are produced will be affected by a phenomenon known as "hallucination"."
July 2024 | Anonymous v. New York City Department of Education | one last matter remains for the Court to address. Defendants note that, at times, Plaintiff “cit[es] to and reli[es] on what appears to be non-existent legal authority.” Remand Opp. at 9. Having reviewed the case citations flagged by Defendants, the Court is likewise unable to locate them. Without question, it is improper and unacceptable for litigants – including pro se litigants – to submit “non-existent judicial opinions with fake quotes and citations.”
July 2024 | Handa v Mallick
July 2024 | Byrd v. The Villages of Woodland Springs Homeowners Association, Inc. | “We cannot tell from ... brief if he used ChatGPT or another artificial intelligence (AI) source to attempt to develop his legal citations. See David T. Laton, A Cautionary Tale of AI As A Research Tool for Lawyers, Prac. Law. 42, 43 (2024) (“ChatGPT currently lacks the ability to produce reliable and accurate results when given a legal query.”).”
August 2024 | Industria de Diseño Textil, S.A. v. Sara Ghassai | Whether accidental or deliberate, reliance on false citations is a serious matter [see Zhang v Chen, 2024 BCSC 285]. In the event the submissions resulted in whole or in part from reliance on some form of generative artificial intelligence, the Applicant is reminded of the importance of verifying the final work product prior to its submission to the Registrar
August 2024 | N.E.W. Credit Union v. Mehlhorn | "In its brief, ... points out that the cases cited by ... do not exist and speculates that Mehlhorn used an artificial intelligence program to draft her brief-in-chief. In her reply brief, ... does not respond to this assertion. Instead, she cites eight new cases, none of which were referenced in her brief-in-chief. It appears, however, that four of those cases are also fictitious. At a minimum, this court cannot locate those cases using the citations provided."
August 2024 | Dayal
August 2024 | Dukuray v. Experian Information Solutions | “The Court recognizes it is possible Plaintiff is not aware of the risk that ChatGPT and similar AI programs are capable of generating fake case citations and other misstatements of law. The Court also recognizes that it may be more difficult for a pro se litigant without access to computerized legal databases such as Westlaw or LEXIS to check the veracity of case citations generated by AI programs. Defendants have not sought sanctions against Plaintiff, and the Court does not believe any sanctions would be appropriate. Nevertheless, it is no more acceptable for a pro se litigant to submit briefs with fake case citations than it is for a lawyer to do so.”
September 2024 | Rule v. Braiman | In any future filings Plaintiff must include a full citation of any case cited. The Court will not consider any case that is only cited with the parties' names and the date of the case; every case cite must include a citation to the reporter or online legal service, like Westlaw or LexisNexis, or the court and court case number and docket number where the case may be found. The Court notes that “ChatGPT and similar AI programs are capable of generating fake case citations and other misstatements of law.”
September 2024 | Sala Primera del Tribunal Constitucional – Nota Informativa 90/2024
September 2024 | Mortazavi v Hamilton
October 2024 | Coulston & Ors v Elliott & Anor [2024] IEHC 697 | “When I asked the First Named Defendant where this argument came from, he confirmed that he had asked a friend to prepare the submissions. This has caused the court significant disruption, since I made it clear that no new issue should be raised in his submissions. When I asked him to explain the submissions, he said that he could not. It seems to me therefore that it is highly likely that one of two things happened, either the Defendants went to somebody who purported to be a lawyer or they used a generative AI program to generate the submissions.” (Paragraph 86) “… if they used a generative AI program, they have been fooled. Such programs often sound persuasive but can be fatally flawed. Sadly, in this case the argument was indeed fatally flawed. The general public should be warned against the use of generative AI devices and programs in matters of law…” (Paragraph 87)
November 2024 | Gauthier v Goodyear Tire & Rubber
December 2024 | Crypto Open Patent Alliance v Dr. Craig Steven Wright | “…referred to a series of authorities in support of arguments that reasonable adjustments should be made to enable a vulnerable litigant or witness to participate fairly in court proceedings. As COPA pointed out by reference to a series of examples, most of the authorities he has cited do not contain the passages attributed to them (or anything like those passages), and indeed most have nothing to do with adjustments for vulnerable witnesses. COPA suggested that it seems likely that they are AI “hallucinations” by ChatGPT (i.e. made-up references) rather than deliberately misleading inventions by Dr Wright. However, since the principles are clear and not in doubt, as set out above, it is not necessary to engage with his false citations any further.”
December 2024 | Al-Hamim v Star Hearthstone 2024 COA 128
January 2025 | Ms (Bangladesh) v SoS for Home Department | 13. We sought clarification regarding this citation and reference and asked for the relevant paragraph of the judgment being relied on. [counsel] was not able to specify this. [counsel] submitted that he understood, having used ChatGBT, that the Court of Appeal in Y (China) [2010] EWCA Civ 116 was presided by Pill LJ, Sullivan LJ and Sir Paul Kennedy. However, the citation [2010] EWCA Civ 116 did not point to the case of Y (China) but to R (on the application of YH) v SSHD. We raised concern about this and referred [counsel] to the recent decision of the President of King’s Bench Division in Ayinde [2025] EWHC 1383 (Admin) on the use of Artificial Intelligence and fictitious cases, and directed him to make separate representations in writing. 14. In his subsequent written representations, [counsel] clarified that Y (China) was a typological error and he sought to rely on R (on the application of YH) v SSHD [2010] EWCA Civ 116 where, when discussing the meaning of ‘anxious scrutiny’ in asylum claims…”
January 2025 | Mavundla v MEC | [90] In this age of instant gratification, this incident serves as a timely reminder to, at least, the lawyers involved in this matter that when it comes to legal research, the efficiency of modern technology still needs to be infused with a dose of good old-fashioned independent reading. Courts expect lawyers to bring a legally-independent and questioning mind to bear on, especially, novel legal matters, and certainly not to merely repeat in parrot-fashion, the unverified research of a chatbot.
January 2025 | Oliveira v Ryanair DAC [ADJ-00055225] | The wording of the submission would suggest that it was prepared by a representative of the Complainant but closer examination revealed that this was not the case. In his responding submission the Complainant described as “baseless” the Respondent position that the submission “may have been generated with the assistance of artificial intelligence (AI)”. However, on day 2 of hearing the Complainant acknowledged that he may have used AI and became defensive about his use. While I’m not particularly concerned about whether the Complainant used AI or not I am clear that parties making submissions to the WRC have an obligation to ensure that their submissions are relevant and accurate and do not set out to mislead either the other party or the Adjudication Officer. These submissions were rife with citations that were not relevant, mis-quoted and in many instances, non-existent. The Complainant wasted a considerable amount of time of the Respondent and the Adjudication Officer in seeking to establish the veracity or otherwise of legal citations.
January 2025 | Kohls and Franson v Ellison | “[Expert] included citations to two non-existent academic articles and incorrectly cited the authors of a third….admits that he used GPT-4o to assist in drafting…failed to discern that GPT-4o generated fake citations to academic articles…. The irony. [expert] a credentialed expert on the dangers of AI and misinformation, has fallen victim to the siren call of relying too heavily on AI—in a case that revolves around the dangers of AI, no less…..”
January 2025 | United States v Hayes
January 2025 | re Thomas G. Neusom
January 2025 | Olsen v Finansiel Stabilitet | "I have narrowly and somewhat reluctantly come to the conclusion that I should not cause a summons for contempt of court to be issued to the appellants under CPR rule 81.6. I do not think it likely that a judge (whether myself or another judge) could be sure, to the criminal standard of proof, that the appellants knew the case summary was a fake. They may have known but they could not be compelled to answer questions about the identity of the person who supplied it." Mr Justice Kerr
January 2025 | Valu v Minister for Immigration
February 2025 | Saxena v Martínez-Hernández 2025 WL 1194003
February 2025 | Unnamed appeal (TJ Santa Catarina, Boletim-TJSC
February 2025 | Wadsworth v Walmart Inc
February 2025 | Bunce v Visual Tech. Innovations, Inc
March 2025 | Reddan v An Bord Pleanála [2025] IEHC 172 | “…When challenged, he says it was something he discovered during his online research. This sounds like something that derived from an artificial intelligence source. It has all the hallmarks of ChatGPT, or some similar AI tool…” (Paragraph 78) “This ground looks as if it is a cut and paste from some form of pseudo legal online text. The fact that it contains his name suggests to me that it is being generated by a third party or, as I indicated above, by some form of artificial intelligence tool”
March 2025 | Williams v Capital One | "It is not acceptable for parties to submit "filings to the Court containing citations to legal authority that does not exist, whether drafted with the assistance of artificial intelligence or not."
March 2025 | Nguyen v Savage Enterprises
April 2025 | Dehghani v. Castro
April 2025 | Bandla v SRA | “I asked the Appellant why, in the light of this citation of non-existent authorities, the Court should not of its own motion strike out the grounds of appeal in this case, as being an abuse of the process of the Court. His answer was as follows. He claimed that the substance of the points which were being put forward in the grounds of appeal were sound, even if the authority which was being cited for those points did not exist. He was saying, on that basis, that the citation of non-existent (fake) authorities would not be a sufficient basis to concern the Court, at least to the extent of taking that course. I was wholly unpersuaded by that answer. In my judgment, the Court needs to take decisive action to protect the integrity of its processes against any citation of fake authority. There have been multiple examples of fake authorities cited by the Appellant to the Court, in these proceedings. They are non-existent cases. Here, moreover, they have been put forward by someone who was previously a practising solicitor. The citations were included, and maintained, in formal documents before the Court. They were never withdrawn. They were never explained. That, notwithstanding that they were pointed out by the SRA, well ahead of this hearing. This, in my judgment, constitutes a set of circumstances in which I should exercise – and so I will exercise – the power of the Court to strike out the grounds of appeal in this case as an abuse of process.”
April 2025 | ZZaman v Revenue & Customs | 29. However, our conclusion was that Mr Zzaman's statement of case, written with the assistance of AI, did not provide grounds for allowing his appeal. Although some of the case citations in Mr Zzaman's statement were inaccurate, the use of AI did not appear to have led to the citing of fictitious cases (in contrast to what had happened in Felicity Harber v HMRC [2023] UKFTT 1007 (TC)). But our conclusion was that the cases cited did not provide authority for the propositions that were advanced. This highlights the dangers of reliance on AI tools without human checks to confirm that assertions the tool is generating are accurate. Litigants using AI tools for legal research would be well advised to check carefully what it produces and any authorities that are referenced. These tools may not have access to the authorities required to produce an accurate answer, may not fully "understand" what is being asked or may miss relevant materials. When this happens, AI tools may produce an answer that seems plausible, but which is not accurate. These tools may create fake authorities (as seemed to be the case in Harber) or use the names of cases to which it does have access but which are not relevant to the answer being sought (as was the case in this appeal). There is no reliable way to stop this, but the dangers can be reduced by the use of clear prompts, asking the tool to cite specific paragraphs of authorities (so that it is easy to check if the paragraphs support the argument advanced), checking to see the tool has access to live internet data, asking the tool not to provide an answer if it is not sure and asking the tool for information on the shortcomings of the case being advanced. Otherwise there is a significant danger that the use of an AI tool may lead to material being put before the court that serves no one well, since it raises the expectations of litigants and wastes the court's time and that of opposing parties.
April 2025 | Goshen v Accuro (2304373/2024) | "...I cannot find such a case, and I am left wondering whether this case is an invention by the claimant or perhaps an artificial intelligence platform. As I explained in the hearing, I cannot apply authority which I have not seen."
April 2025 | A County Court case referred to at para 55 of the Ayinde v LBB judgment before HHJ Holmes | “That was a case before the County Court … That counsel drew attention to the fact that the application before the judge contained false material: specifically the grounds of appeal and the skeleton argument settled … contained references to a number of cases that do not exist….”
April 2025 | Bevins v Colgate-Palmolive Co
April 2025 | Benjamin v Costco Wholesale Corp
April 2025 | Willis v Bank National Association | “[b]y presenting to the court a pleading, written motion, or other paper – whether by signing, filing, submitting, or later advocating it – an attorney or unrepresented party certifies that to the best of the person's knowledge, information, and belief, formed after an inquiry reasonable under the circumstances … the claims, defenses, and other legal contentions are warranted by existing law...” “...[c]onfirming a case is good law is a basic, routine matter and something to be expected from a practicing attorney...”
April 2025 | Nexgen Pathology Services Ltd v Darceuil Duncan | 69. The Court acknowledges that digital tools, including AI and internet-based platforms, are increasingly common and valuable in legal research; indeed, this Court itself makes use of such tools where appropriate. However, their use must be accompanied by discernment and subjected to rigorous verification. This is because AI-generated content is susceptible to producing what are commonly referred to as “hallucinations”: fabricated, yet plausible-sounding outputs that may result from gaps or limitations in the model’s underlying data. Legal practitioners must not rely on such tools uncritically. Any information obtained through these means must be independently verified before being presented to the Court. 70. The Court emphasizes that citing non-existent cases, even inadvertently, constitutes a serious abuse of process and professionalism. It risks misleading the Court, prejudicing the opposing party, and eroding public confidence in the administration of justice. Counsel are reminded that the duty of candour to the Court requires that they verify the authenticity of every case cited. If any material has been generated with the assistance of AI or other non-traditional sources, full disclosure to the Court is both appropriate and expected." The Hon. Mr. Justice Westmin R.A. James
May 2025Lacey v State Farm General Ins
May 2025Reclamação 78.890TBC – awaiting verified English translation of the judgment.
May 2025Ramirez v Humala
May 2025Versant Funding LLC v Teras Breakbulk Ocean"In the Court's view, there is nothing inherently wrong with an attorney properly and competently utilizing AI or any of its subsets to practice law or litigate cases. [But a] basic prerequisite to the filing of any ... paper in court is for the drafting and filing attorney(s) to carefully check every case citation, fact, and argument to make sure that they are correct and proper. Attorneys cannot delegate that role to AI, computers, robots, or any other form of technology."
May 2025 – Concord MG v Anthropic PBC
"The Court gave Anthropic time to investigate the circumstances surrounding the challenged citation. ...the Court finds this issue is a serious one—if not quite so grave as it at first appeared. Anthropic’s counsel protests that this was “an honest citation mistake” but admits that Claude.ai was used to “properly format” at least three citations and, in doing so, generated a fictitious article name with inaccurate authors (who have never worked together) for the citation at issue... That is a plain and simple AI hallucination. Yet the underlying article exists, was properly linked to and was located by a human being using Google search; so, this is not a case where “attorneys and experts [have] abdicate[d] their independent judgment and critical thinking skills in favor of ready-made, AI-generated answers….” ...A remaining serious concern, however, is Anthropic’s attestation that a “manual citation check” was performed but “did not catch th[e] error.”
May 2025 – Mid Central Operating Engineers HWF v Hoosiervac LLC
“That said, in considering an appropriate sanction the Court takes into account the steps [Lawyer] has taken "to educate himself on the responsible use of AI in legal practice" and adhere to "the highest standards of professional conduct moving forward." See dkt. 102 at 2. The Court also considers the collateral consequences that [Lawyer] has experienced, and may continue to experience, from having improperly relied on non-existent AI-generated legal citations…so, the court has considered those circumstances alongside its interest in deterring careless or reckless attorney conduct.”
May 2025 – Ko v Li
"[14] This occurrence seems similar to cases in which people have had factums drafted by generative artificial intelligence applications (like ChatGPT). Some of these applications have been found to sometimes create fake legal citations that have been dubbed “hallucinations.” It appears that Ms. Lee’s factum may have been created by AI and that before filing the factum and relying on it in court, she might not have checked to make sure the cases were real or supported the propositions of law which she submitted to the court in writing and then again orally." FL Myers J
June 2025 – Alharoun v Qatar National Bank and QNB
"In CL-2024-000435, it appears from the Order of Mrs Justice Dias that correspondence was sent to the court, and witness statements were filed, citing authorities that do not exist and claiming that other authorities contained passages that they do not contain" Rt Hon. Dame Victoria Sharp
June 2025 – R (Ayinde) v Haringey
“It is such a professional shame. The submission was a good one. The medical evidence was strong. The ground was potentially good. Why put a fake case in?”
“I should say it is the responsibility of the legal team, including the solicitors, to see that the statement of facts and grounds are correct.”
“…I consider that it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading.”
Mr Justice Ritchie
June 2025 – UB v SoS for Home Department
“…recognised this seriousness of this issue and has taken commendable steps to ensure it will not be repeated including (i) meeting with the caseworker who drafted the Grounds; (ii) holding a partners’ meeting to discuss adopting an AI policy and assigning the task of finalising an AI policy to a colleague in consultation with an AI professional; (iii) conducting relevant in-house training and issuing interim AI Guidance and (iv) planning for comprehensive staff training by an AI professional….”
June 2025 – Pro Health Solutions Ltd v ProHealth Inc (UKIPO, Appointed Person, BL O/0559/25)
"As identified in Ayinde (including in the Appendix setting out domestic and overseas examples of attempts to rely on fake citations), fabrication of citations can involve making up a case entirely, making up quotes and attributing them to a real case, and also making up a legal proposition and attributing it to a real case even though the case is not relevant to the legal proposition being made (for instance, it deals with a completely different issue or area of law). It is not, however, fabrication to make an honest mistake as to what a court held in a particular case or to be genuinely mistaken as to the effect of a court’s judgment. In any event, it does not matter whether fabrication was arrived at with or without the aid of generative artificial intelligence. I therefore need to consider what if any sanction is appropriate.”
June 2025 – Malone & McEvoy v Laois County Council et al [2025] IEHC 345
“..did not reply. I refrain from inferring how it was generated.
42. It is necessary to be clear: it is not acceptable to depict text in written submissions as a verbatim quotation from an authority where it is not such. A similar action by a professional lawyer would be misconduct – see, for example, the recent and somewhat analogous case of Ayinde. The principle is essentially the same – though I hasten to say that I would not push the analogy too far as to a factual comparison of the present case with that case, and the error in the present case is not of the order of the misconduct in that case. However, appreciable judicial time was wasted on the issue – not least trying to find the source of the quotation. And it does illustrate:
• The vital importance of precision and accuracy in written submissions. That duty lies on lay litigants as much as on lawyers.
• That text in submissions formatted so as to convey that it is a direct and verbatim quotation from an identified source must be exactly that. Of course, it is permissible to edit the text (for example to exclude irrelevant content or by underlining for emphasis) but, if so, that it has been done must be apparent on the face of the document.
• That opposing parties are entitled to written submissions in good time to check them.”
June 2025 – Shahid v Esaam
“We are troubled by the citation of bogus cases in the trial court’s order. As the reviewing court, we make no findings of fact as to how this impropriety occurred, observing only that the order purports to have been prepared by Husband’s attorney, …. We further note that Lynch had cited the two fictitious cases that made it into the trial court’s order in Husband’s response to the petition to reopen, and she cited additional fake cases both in that Response and in the Appellee’s Brief filed in this Court.”
July 2025 – Various Leaseholders of Napier House v Assethold Ltd
“15. The Respondent included two cases within their grounds for appeal which have been cited as…[False Case names] Having performed a search on BAILII, Westlaw and Find Case Law, it has not been possible to find …[False Case name]. It may be that this case is not authentic and AI may have been used to reference this case….”
On another case, the court noted the decision concerned the circumstances in which a parole board should hold an oral hearing: “When reading the full judgment it is difficult to see why the tribunal has been referred to this case…..”
July 2025 – Coomer v Lindell et al (My Pillow)
July 2025 – Queen v Kansas 2025 WL 2845025
“Plaintiff proceeds pro se and, as explained by Judge James, he does not meet the standards for appointment of counsel. Nonetheless, Plaintiff is subject to the same rules as attorneys who appear before this Court. Given the nonresponsive nature of Plaintiff's responses to these motions to dismiss, the Court is concerned that he may be relying on artificial intelligence to assist him with his many filings in this case. Plaintiff is cautioned against using artificial intelligence for drafting his legal documents or citing cases without confirming their accuracy. Plaintiff is further directed to review Fed. R. Civ. P. 11, which applies to both attorneys and unrepresented parties.”
July 2025 – Jakes v Youngblood
“These quotations are merely representative of the fabricated statements in [B’s] briefs. There are additional fabricated quotations that the Court does not enumerate in this order. In addition to including non-existent quotations in his briefs, Blackburn also cited cases for propositions that they do not represent. The Court will not recite every time [B] misconstrued a case in his briefs as it believes the above quotations represent the most serious and alarming issues with the documents. Attorneys are permitted to make creative case comparisons and may even stretch existing case law to support their arguments. Nevertheless, advocacy is confined by Rule 11(b) and Pa. RPC 3.3. Attorneys have a duty of candor to the Court. They must make reasonable inquiries under the circumstances to ensure their legal contentions are warranted by existing law. Attorneys may not fabricate non-existent quotations, from case law or the Court’s opinion, and may not cite cases for legal propositions for which they do not stand (or even discuss).”
July 2025 – HMRC v Gunnarsson [2025] UKUT 247 (TCC)
"113. In this case, HMRC was put to the trouble of having to investigate the existence of the purported decisions relied upon by the Respondent. Fortunately, they did so. Depending on the circumstances, there may be occasions when the opposing party or the tribunal are not able to discover the errors relied upon. There may be others where an adjournment is required to investigate or address the inaccurate information.
114. On these facts, we do not consider the Respondent to be highly culpable because he is not legally trained or qualified, not subject to the same duties as a regulated lawyer or other professional representative and may not have understood that the information and submissions presented were not simply unreliable but fictitious. He was under time pressure given his other competing responsibilities and doing his best as a lay litigant seeking to assist the UT by preparing written submissions."
July 2025 – Father v Mother [2025] EWHC 2135 (Fam)
“(16) The F then made a further application on a C2 asking that HHJ Bailey recuse herself on the basis of being biased against him and her not understanding ASD and the impacts of his diagnosis. This came before the Judge on 10 June 2025. In his written application to the court the F referred to a number of previous authorities, in particular relating to ASD. HHJ Bailey realised that many of these cases were not genuine, and the submission appeared to have been generated by Artificial Intelligence (“AI”). In light of the level of recent concern about litigants and lawyers using AI and referring to cases which are not genuine (as reflected in the Divisional Court decision R (Ayinde) v London Borough of Haringey [2025] EWHC 1383), HHJ Bailey referred the case to me as the Family Presiding Judge for the Midlands.”
“The F relied upon faked cases without apparently making any effort to check their veracity. It is in my view important to note that the F is someone who is well capable of checking references and ensuring documents are accurate if it is in his interests to do so.”
August 2025 – Lipe v Albuquerque Public Schools
“Finally, the Court is compelled to note that Plaintiff’s briefing bears signs that counsel continues to neglect to properly review her filings and may be continuing to inappropriately use generative artificial intelligence (“AI”) to draft her filings. Plaintiff’s counsel has already been sanctioned once for the inappropriate use of AI…To be clear, the use of generative AI is not in and of itself problematic, but an attorney must diligently review AI-generated briefing to ensure its accuracy and compliance with Rule 11…”
“While Plaintiff’s briefing no longer contains fabricated citations, it does contain several “legal contentions” that are simply not “warranted by existing law…For example, in the legal standard section of the brief, Plaintiff supplies the following rule statement: [STATEMENT REDACTED] Rule 37(b) makes no such statement. Nor is the Court aware of, and Plaintiff does not provide, any case law that supports this proposition.”
“Because of the syntactic similarity to Plaintiff’s prior briefing, the Court is concerned that Plaintiff may have used the same generative AI to draft this response brief, but in lieu of verifying the legal contentions, just removed any citations to case law. Without reaching any conclusions as to counsel’s behavior in this instance, the Court once again cautions Plaintiff’s counsel to be more diligent with her briefing.”
“Plaintiff provides several other unsupported or inaccurate rule statements. However, because the Court wishes to urge more diligent conduct, rather than further sanction counsel, the Court will not belabor the point.”
August 2025 – Holloway v Beckles and Beckles
"That leaves the matter of the fake cases. The Tribunal finds that this does amount to unreasonable conduct within rule 13(1)(b). It has decided that the misconduct is serious, being conduct that undermines civil litigation in the Tribunal. Therefore, the Tribunal determines that it should make a costs order. It considers that the costs order should be proportionate to the additional costs caused. It has decided that the appropriate quantum is half the costs of counsel’s fees in attending the hearing of 14 May 2025. These amount to £750 and must be paid to the applicant within 28 days."
August 2025 – Kuzniar v General Dental Council Case No. 6009997/2024
"44. The Claimant explained that the problems arose from her using AI to carry out research. She had previously used AI/ChatGPT to carry out research without problems in her litigation against Roxdent Ltd and so she expected to be able to do so again successfully in the instant case. She did not know about the problems with the citations when she told the Respondent’s solicitors about them, and when she found out about them, she did her best within the short time available to mitigate or reduce the problem. She did not act in bad faith or with any intent to place false information before the Tribunal. I accept this explanation.
45. The Claimant conducted the claim unreasonably as described above by referring to the Respondent a large number of nonsensical and in many cases non-existent citations without taking any or sufficient care to check them first. By not doing so she passed the work of checking them to the Respondent to have to do at short notice. My discretion to award costs is engaged.
46. However, I decline to award costs because AI is a relatively new tool which the public is still getting used to, the Claimant acted honestly (and furthermore has presented her case honestly to me over the last two days), and she tried her best to rectify the situation as soon as she became aware of her mistake."
September 2025 – ANPV & SAPV v SOSHD
“…suggested that the inaccuracies in the grounds were as a result of his drafting style. He accepted that there might have been some “confusion and vagueness” on his part; that he might “need to construct sentences in a more liberal way”; and that his drafting should perhaps “be a little more generous” when it came to making specific allegations about judges overlooking or failing to follow binding authorities. … The problems which I have detailed above are not matters of drafting style. The authorities which were cited in the grounds either did not exist or did not support the grounds of which were advanced. Where the cases did exist, they were often wholly irrelevant to the proposition of law which was given in the grounds.” (paragraphs 63 and 64)
October 2025 – AK v SOSHD UI-2025-002981
"What concerns me in this case is not merely that there were false citations in the grounds of appeal considered by Judge Saffer; it is that those false citations were then removed from the grounds of appeal which were placed in the composite bundle. The former actions are unprofessional, the latter are potentially dishonest because it suggests that there was an attempt to conceal the false citations..."
October 2025 – Peters v Driver and Vehicle Standards Agency
“9. I raise this because: 9.1 An appreciable amount of hearing time was taken up with trying to obtain copies of various reports in order that respondent’s Counsel (and I) could check the accuracy of the AI generated summaries. 9.2 There was a significant risk I could have been misled had this not been done. 9.3 Because of the demonstrated inaccuracies, I was unable to rely on the summaries. 9.4 The delay involved also caused or contributed to my Judgment being reserved.”
“…He is genuinely seeking to assist a claimant who would otherwise be unrepresented. Nonetheless, it is important that some basic checks are done to ensure that the material put before the Tribunal is accurate in order to avoid the above. I refer to R (on the application of Ayinde) v London Borough of Haringey [2025] EWHC 1383 which clearly identifies the risk of not undertaking such checks and the importance of doing so…”
October 2025 – Oneto v Watson, et al., No. 22‑cv‑05206‑AMO (N.D. Cal. Oct. 10, 2025)
"...To be clear, the Court does not prohibit or oppose the use of artificial intelligence in legal advocacy. The Court’s Civil Standing Order, excerpted above, makes clear that AI tools may be utilized so long as counsel acknowledges the use of such tools and certifies they have independently verified the accuracy of AI-generated content."
October 2025 – Hassan v ABC International Bank PLC
“On the use of AI in general, I happily accept that the internet is a resource many of us tend to rely on as providing expertise and knowledge where we lack it. Indeed, the facility for using a search engine has even been relied on in the EAT a reason for not granting an extension of time. I accept that AI is now at the forefront of internet searches. It might also be said that more intelligent and proficient users of the internet, like the Claimant, are more apt to use it in the way that the Claimant has i.e. to help construct arguments. I should not, and do not, approach the Claimant’s use of AI as in any way inherently negative”
October 2025 – Ndaryiyumvire v Birmingham City University
“48. I do have to take account of the fact that, as was said in Ayinde, the use of AI is a large and growing problem and the citing of fictitious or fake authorities is a serious threat to the integrity of the justice system which depends upon courts being able to rely on lawyers putting before the courts, whether orally or in documents, accurate material and accurate statements of the law supported by genuine cases. Lawyers who cite fictitious cases must face serious consequences and in the current environment where this is a significant and growing problem, the guidance in Ayinde indicates that judges should take a fairly tough line.”
October 2025 – Lee v Blackpool B&B et al MAN/00EJ/HMG/2024/0011
"...I can only conclude that the ‘decision’ submitted to the Tribunal is a fabrication – whether or not it is the product of the injudicious use of artificial intelligence tools is unclear.”
October 2025 – Malathi Latha Sriram (Mukti Roy) v Louise Mary Brittain
“Conspicuously absent from this judgment is any reference to case law. There are two reasons for that. The first is that I have felt able to deal with the issues without reference to authority because in my view they can be disposed of on the basis of the evidence and submissions. The second is this. Although in her skeleton arguments [Claimant] cites a great deal of case law, most of it is of little or no assistance. Mr Comiskey has prepared what he calls a “Note on Authorities”. He complains that [Claimant] has previously cited fake authorities and has sought to rely on authorities which provide no support for the propositions that purport to arise from them, and he says she has done it again here.” (Paragraph 37)
“…The case name is clearly invented. Mr Comiskey draws the court’s attention to R (Ayinde) v London Borough of Haringey [2025] EWHC 1383 (Admin) and in particular two passages from the judgement: paragraph 26, in which it was held that placing false information (including fake authorities) before the court with the intention that the court treat it as genuine was a contempt if done knowingly, although not if done negligently; and the discussion in paragraph 88 of another case, Bandla v SRA [2025] 4 WLR 63 in which an appeal was struck out as an abuse of process for the citation and use of fake authorities.” (Paragraph 38)
“…rightly in my view, and I make no criticism of her. For what it is worth, I suspect, that, in common with many unrepresented parties, [Claimant] has resorted to research using the internet and has come up with false leads. The late Muir Hunter was an eminent member of the insolvency bar and the author for many years of an insolvency commentary that still bears his name. It is easy to see how his name could have come up in the course of an internet search and end up wrongly linked to a real case name and reference. The abbreviation BPIR stands for the Bankruptcy and Personal Insolvency Reports. They are not readily available to members of the public. It would have been difficult for [Claimant] to check the citation…”
October 2025 – Victoria Place et al v Assethold Limited
"85. I then typed the same wording into M365 Copilot on an Android device but adding a question mark at the end which gave a similar response, although the phrasing was markedly different, and it referred to the Upper Tribunal decision cited by [landlord’s managing agent] rather than the ‘hallucinated’ Court of Appeal citation. Repeating the same question sometime later would not re-produce reference to the Upper Tribunal decision, showing that AI adapts and an earlier answer may no longer be returned as the algorithm learns, demonstrating the care that needs to be taking in using AI. The idiom ‘shifting sands’ comes to mind.”
October 2025 – Green Building Initiative v Peacock and Green Globe Limited
November 2025 – Choksi v IPS Law LLP
"...contains references to a number of cases that have wrong citations, wrong names or which simply do not exist. A number of the cases cited are wholly irrelevant and do not support the proposition in support of which they are cited...”
November 2025 – 133 Blackstock Road (Hackney) RTM Company Limited v Assethold Limited
“19. The Tribunal is extremely concerned that the Respondent has put material before it that is erroneous. [redacted] has failed to give any explanation as to how this error arose. One explanation might be the use of an AI LLM in the production of the Respondent’s statement of case.”
November 2025 – Appeal in the cause of Jennings v Natwest Group Plc (Sheriff Appeal Court Civil)
“[10] These require caution, the appellant having made submissions using ChatGPT, an artificial-intelligence database (see appellant’s supplementary submission). That may explain the generality of the submissions, which largely comprise free-form legal propositions with only limited link to the facts. It has served to complicate and obscure the true analysis of the issues. At least three of the cases cited appear to be non-existent.”
November 2025 – Oxford Hotel Investments Limited v Great Yarmouth Borough Council
“…purported to quote at a little length from [18] of the judgment to the effect that a microwave satisfied the statutory definition. The problem is that the real [18] of Barker v Shokar says no such thing. Nor does any other part of the judgment in that case. [Director for the Appellant] ended up accepting that this misleading use of authority was the product of AI. It is one which illustrates again, in courts and tribunals, the dangers of using AI for legal research without any checks.”
December 2025 – Wemimo Mercy Taiwo v Homelets of Bath Limited & Ors
“…This case does not exist (albeit the bogus reference can be ‘recreated’ through Google’s AI Overview function). There is a 2016 case in the Bolton County Court between the two named parties, but there was no appeal in 2018 to the Court of Appeal and [redacted] is a false reference…”
December 2025 – S Peggie v Fife Health Board and Dr B Upton
Details TBC
December 2025 – D (A Child) (Recusal)
“Finally, I return to the issue raised by the father’s representatives about the mother’s erroneous citation of authority (see in particular paragraph 54 above). I absolve the mother of any intention to mislead the court. Litigants in person are in a difficult position putting forward legal arguments. It is entirely understandable that they should resort to artificial intelligence for help. Used properly and responsibly, artificial intelligence can be of assistance to litigants and lawyers when preparing cases. But it is not an authoritative or infallible body of legal knowledge. There are a growing number of reports of “hallucinations” infecting legal arguments through the citation of cases for propositions for which they are not authority and, in some instances, the citation of cases that do not exist at all. At worst, this may lead to the other parties and the court being misled. In any event, it means that extra time is taken and costs are incurred in cross-checking and correcting the errors. All parties – represented and unrepresented – owe a duty to the court to ensure that cases cited in legal argument are genuine and provide authority for the proposition advanced.”
January 2026 – Elden v HMRC [2026] UKFTT 41 (TC)
93. In further submissions, the Representative said 'The suggestion that citing a published authority amounts to providing false material is misconceived. A court decision is a matter of public record. Whether a case applies is a matter of legal argument and opinion, not misrepresentation. It is entirely proper for parties to put forward different interpretations for the Tribunal to consider. To characterise this as "false material" is both unfounded and inappropriate.' It is not clear who the representative is quoting as saying false material was used. The wording used by HMRC was 'inaccurate use of AI/inaccurate authorities'.

The older AI Hallucination Cases Tracker appears below

AI hallucination cases can, and do, reach real courtrooms and have real consequences.

What the AI Hallucination Cases Tracker is for

Generative-AI tools sometimes invent case citations, statutes, articles or facts. When those errors slip into legal documents the consequences are serious: submissions may be struck out, fines imposed, lawyers sanctioned and, in extreme cases, committal proceedings launched. This page tracks publicly reported decisions and strives to be a reliable index of AI hallucination cases in live legal practice.

How to use the AI Hallucination Cases Tracker

  • Each row summarises one decision. You’ll see the country, the date, which AI tool was involved, a short note describing the hallucination, what was done about it (or is proposed), and whether the person using AI was a lawyer, a litigant in person (self-represented) or other.
  • Click Link in the last column to open a source. It is hoped that each link points to an authoritative source; if it does not and you are aware of one, please let us know. The best sources are usually a court website, an established legal archive such as BAILII, CanLII or CourtListener, or a PDF published directly by a disciplinary tribunal.
  • The table is searchable and sortable: type in the search box to filter by country, tool or sanction, or click any column header to sort.

Why you still need to read the original document
Although I try to check each entry carefully, this page is only a summary. The authoritative version of every decision is the one you must locate yourself. Proceedings can move fast: sanctions may be appealed, recommendations can become final and costs can be varied, and I may not update in time. Always read the original judgment, order or ruling; do not rely on this table alone in practice, publication or teaching.

Keeping the tracker current
I try to add new AI hallucination cases as soon as they are brought to my attention. The page updates automatically because the table is managed in TablePress; whenever a new row is saved, the latest version appears here without extra work. Look for the “Last updated” line beneath the table to know when the most recent changes were made.

Scope
Not all hallucinations listed here are confirmed; in some cases a hallucination was merely alleged. Alleged hallucinations may be included where the allegation itself was significant to the proceeding.

Why the problem matters

  1. Accuracy is fundamental. Judges rely on counsel to supply real authorities. AI hallucination cases erode trust in written advocacy and evidence.
  2. Professional duties are evolving. Many ethics codes now say lawyers must understand the technology they use. Sanctions in these decisions often reference competence rules. This is discussed elsewhere on this blog.
  3. Public confidence is important. When headlines report AI inventing cases, the public questions the legal system’s rigour. Tracking responses across jurisdictions shows how the profession is confronting the risk.

Caution
This tracker is provided for general information only and is not legal advice. The summaries and links do not replace the official judgment, and if you wish to rely on any case you must obtain the authoritative version yourself, direct from the court, tribunal or official reporter as appropriate. Do not rely solely on the information or links shown here. Before taking any action, obtain independent, qualified legal advice.

Last updated: 16 May 2025

Case Name | Country | Date | AI Tool | Hallucination | Judicial Observation (Quote) | Notes | Who Used AI | Link
Kuzniar v General Dental Council (ET 6009997/2024)UK20 August 25ChatGPT/OtherFalse Citations“The Claimant conducted the claim unreasonably as described above by referring to the Respondent a large number of nonsensical and in many cases non-existent citations without taking any or sufficient care to check them first. By not doing so she passed the work of checking them to the Respondent to have to do at short notice. My discretion to award costs is engaged.

Furthermore, although I did not make any formal enquiry into her financial means, she told me that she has only £2000 in the bank and is struggling to find work as a dentist because of the conditions imposed by the Respondent.

However, I decline to award costs because AI is a relatively new tool which the public is still getting used to, the Claimant acted honestly (and furthermore has presented her case honestly to me over the last two days), and she tried to her best to rectify the situation as soon as she became aware of her mistake.” (paras 45-47)
No sanction or costs order due to new tool, honesty, and best efforts to rectify mistakeLitigant in Personlink
Father v Mother [2025] EWHC 2135 (Fam)UK30 July 25UnspecifiedFalse Citations“(16) The F then made a further application on a C2 asking that HHJ Bailey recuse herself on the basis of being biased against him and her not understanding ASD and the impacts of his diagnosis. This came before the Judge on 10 June 2025. In his written application to the court the F referred to a number of previous authorities, in particular relating to ASD. HHJ Bailey realised that many of these cases were not genuine, and the submission appeared to have been generated by Artificial Intelligence (“AI”). In light of the level of recent concern about litigants and lawyers using AI and referring to cases which are not genuine (as reflected in the Divisional Court decision R (Ayinde) v London Borough of Haringey [2025] EWHC 1383), HHJ Bailey referred the case to me as the Family Presiding Judge for the Midlands.”

“The F relied upon faked cases without apparently making any effort to check their veracity. It is in my view important to note that the F is someone who is well capable of checking references and ensuring documents are accurate if it is in his interests to do so.”

Judge ordered F to pay the costsLitigant in Personlink
HMRC v Gunnarsson [2025] UKUT 247UK23 July 2025UnspecifiedFalse Citations“…In this case, HMRC was put to the trouble of having to investigate the existence of the purported decisions relied upon by the Respondent. Fortunately, they did so. Depending on the circumstances, there may be occasions when the opposing party or the tribunal are not able to discover the errors relied upon. There may be others where an adjournment is required to investigate or address the inaccurate information…”Warning to all Court usersLitigant in Personlink
Ms (Bangladesh) v SoS for Home Department UK1 July 2025ChatGPTUnclear if Hallucination13. We sought clarification regarding this citation and reference and asked for the relevant paragraph of the judgment being relied on. [counsel] was not able to specify this. [counsel] submitted that he understood, having used ChatGBT, that the Court of Appeal in Y (China) [2010] EWCA Civ 116 was presided by Pill LJ, LJ Sullivan LJ and Sir Paul Kennedy. However, the citation [2010] EWCA Civ 116 did not point to the case of Y (China) but to R (on the application of YH) v SSHD. We raised concern about this and referred [counsel] to the recent decision of the President of King’s Bench Division in Ayinde [2025] EWHC 1383 (Admin) on the use of Artificial Intelligence and fictitious cases, and directed him to make separate representations in writing.

14. In his subsequent written representations, [counsel] clarified that Y(China) was a typographical error and he sought to rely on R (on the application of YH) v SSHD [2010] EWCA Civ 116 where, when discussing the meaning of ‘anxious scrutiny’ in asylum claims…”
TBCLawyerlink
UB v SoS for Home DepartmentUK18 June 25UnspecifiedFalse CitationsThe drafter then “inadvertently uploaded the draft version [of the Grounds] rather than the final one”. The Judge accepted the explanation and that solicitors had:

“…recognised this seriousness of this issue and has taken commendable steps to ensure it will not be repeated including (i) meeting with the caseworker who drafted the Grounds; (ii) holding a partners’ meeting to discuss adopting an AI policy and assigning the task of finalising an AI policy to a colleague in consultation with an AI professional; (iii) conducting relevant in-house training and issuing interim AI Guidance and (iv) planning for comprehensive staff training by an AI professional….”
WarningLawyerlink
(BL O/0559/25)UKJuly 2025ChatGPTType 7-8 potentially“As identified in Ayinde (including in the Appendix setting out domestic and overseas examples of attempts to rely on fake citations), fabrication of citations can involve making up a case entirely, making up quotes and attributing them to a real case, and also making up a legal proposition and attributing it to a real case even though the case is not relevant to the legal proposition being made (for instance, it deals with a completely different issue or area of law). It is not, however, fabrication to make an honest mistake as to what a court held in a particular case or to be genuinely mistaken as to the effect of a court’s judgment. In any event, it does not matter whether fabrication was arrived at with or without the aid of generative artificial intelligence. I therefore need to consider what if any sanction is appropriate.”Warning to bothLitigant in Person and Trade Mark Attorneylink
Shahid v EsaamUSJune 25AI not certainFake cases“We are troubled by the citation of bogus cases in the trial court’s order. As the reviewing court, we make no findings of fact as to how this impropriety occurred, observing only that the order purports to have been prepared by Husband’s attorney, …. We further note that Lynch had cited the two fictitious cases that made it into the trial court’s order in Husband’s response to the petition to reopen, and she cited additional fake cases both in that Response and in the Appellee’s Brief filed in this Court.”Judge cited fake cases in court orderLawyer and Judgelink
Jakes v YoungbloodUSJune 25AI use disputed.False Citations“These quotations are merely representative of the fabricated statements in [B’s] briefs. There are additional fabricated quotations that the Court does not enumerate in this order. In addition to including non-existent quotations in his briefs, [B] also cited cases for propositions that they do not represent. The Court will not recite every time [B] misconstrued a case in his briefs as it believes the above quotations represent the most serious and alarming issues with the documents. Attorneys are permitted to make creative case comparisons and may even stretch existing case law to support their arguments. Nevertheless, advocacy is confined by Rule 11(b) and Pa. RPC 3.3. Attorneys have a duty of candor to the Court. They must make reasonable inquiries under the circumstances to ensure their legal contentions are warranted by existing law. Attorneys may not fabricate non-existent quotations, from case law or the Court’s opinion, and may not cite cases for legal propositions for which they do not stand (or even discuss).”Further hearing listed 24 July 2025LawyerLink
Not specified – referred to in Ayinde below.UKApril 25UnspecifiedNon-existent cases.“That was a case before the County Court … That counsel drew attention to the fact that the application before the judge contained false material: specifically the grounds of appeal and the skeleton argument settled … contained references to a number of cases that do not exist….”Judge satisfied by assurances given to court by Barrister and Head of Chambers not to take further action.LawyerLink
Mid Central Operating Engineers HWF v Hoosiervac LLCUSMay 25UnspecifiedFake cases“That said, in considering an appropriate sanction the Court takes into account the steps [Lawyer] has taken “to educate himself on the responsible use of AI in legal practice” and adhere to “the highest standards of professional conduct moving forward.” See dkt. 102 at 2. The Court also considers the collateral consequences that [Lawyer] has experienced, and may continue to experience, from having improperly relied on non-existent AI-generated legal citations…so, the court has considered those circumstances alongside its interest in deterring careless or reckless attorney conduct.”Sanction of $6,000 not accepting recommended $15,000LawyerLink
Versant Funding LLC v Teras Breakbulk OceanUSMay 25Unspecifiedadmitted submission of a wholly fabricated “hallucinated” case citation“In the Court’s view, there is nothing inherently wrong with an attorney properly and competently utilizing AI or any of its subsets to practice law or litigate cases. [But a] basic prerequisite to the filing of any … paper in court is for the drafting and filing attorney(s) to carefully check every case citation, fact, and argument to make sure that they are correct and proper. Attorneys cannot delegate that role to AI, computers, robots, or any other form of technology.”Pay counsel’s fees. Complete CLE on AI. $1,000 and $500 fine.Lawyer
Alharoun v Qatar National Bank and QNBUKMay 25TBCTBC“In CL-2024-000435, it appears from the Order of Mrs Justice Dias that correspondence was sent to the court, and witness statements were filed, citing authorities that do not exist and claiming that other authorities contained passages that they do not contain” Rt Hon. Dame Victoria SharpTBC – Not clear if AI but linked with Ayinde v Haringey below so keep under review. TBCLink
R (Ayinde) v HaringeyUKMay 25AI use disputed.Citation of Fake Cases“It is such a professional shame. The submission was a good one. The medical evidence was strong. The ground was potentially good. Why put a fake case in?”

“I should say it is the responsibility of the legal team, including the solicitors, to see that the statement of facts and grounds are correct.”


“…I consider that it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading.”

Mr Justice Ritchie
Wasted costs £2,000 Counsel and £2,000 Solicitor and referral to regulators.
FURTHER HEARING 23 May
Lawyers (accused not found)Link
Willis v Bank National AssociationUSMay 25N/ACourt discusses pros and cons of AI and allows careful use. “[b]y presenting to the court a pleading, written motion, or other paper – whether by signing, filing, submitting, or later advocating it – an attorney or unrepresented party certifies that to the best of the person’s knowledge, information, and belief, formed after an inquiry reasonable under the circumstances ․ the claims, defenses, and other legal contentions are warranted by existing law…”

“…[c]onfirming a case is good law is a basic, routine matter and something to be expected from a practicing attorney…”
Court issued standing order – rules in relation to AI useN/Alink
Concord MG v Anthropic PBCUSMay 25Claude (Anthropic)Allegation cited article was invented by Claude“The Court gave Anthropic time to investigate the circumstances surrounding the challenged citation. …the Court finds this issue is a serious one—if not quite so grave as it at first appeared. Anthropic’s counsel protests that this was “an honest citation mistake” but admits that Claude.ai was used to “properly format” at least three citations and, in doing so, generated a fictitious article name with inaccurate authors (who have never worked together) for the citation at issue… That is a plain and simple AI hallucination. Yet the underlying article exists, was properly linked to and was located by a human being using Google search; so, this is not a case where “attorneys and experts [have] abdicate[d] their independent judgment and critical thinking skills in favor of ready-made, AI-generated answers….”

“…A remaining serious concern, however, is Anthropic’s attestation that a “manual citation check” was performed but “did not catch th[e] error.”
Written explanation orderedExpert/LawyerLink
Nexgen Pathology Services Ltd v Darceuil DuncanTrinidad & TobagoMay 25Unverified Google/ChatGPT research (admitted “online source now unavailable”)Non‑existent cases“69. The Court acknowledges that digital tools, including AI and internet-based platforms, are increasingly common and valuable in legal research; indeed, this Court itself makes use of such tools where appropriate. However, their use must be accompanied by discernment and subjected to rigorous verification. This is because AI-generated content is susceptible to producing what are commonly referred to as “hallucinations”: fabricated, yet plausible-sounding outputs that may result from gaps or limitations in the model’s underlying data. Legal practitioners must not rely on such tools uncritically. Any information obtained through these means must be independently verified before being presented to the Court.

70. The Court emphasizes that citing non-existent cases, even inadvertently, constitutes a serious abuse of process and professionalism. It risks misleading the Court, prejudicing the opposing party, and eroding public confidence in the administration of justice. Counsel are reminded that the duty of candour to the Court requires that they verify the authenticity of every case cited. If any material has been generated with the assistance of AI or other non-traditional sources, full disclosure to the Court is both appropriate and expected.”

The Hon. Mr. Justice Westmin R.A. James
Condemned the “irresponsible use of generative AI,” struck the authorities, referred both counsel to the Law Association’s Disciplinary Committee, and reminded practitioners of their duty to verify AI‑derived materialLawyerLink
Ko v LiCanadaMay 25ChatGPT (suspected)multiple non-existent or mis-linked authorities[14] This occurrence seems similar to cases in which people have had factums drafted by generative artificial intelligence applications (like ChatGPT). Some of these applications have been found to sometimes create fake legal citations that have been dubbed “hallucinations.” It appears that Ms. Lee’s factum may have been created by AI and that before filing the factum and relying on it in court, she might not have checked to make sure the cases were real or supported the propositions of law which she submitted to the court in writing and then again orally.

F.L. Myers J

Show cause why counsel should not be cited for contempt.LawyerLink
Reclamação 78.890BrazilMay 25TBC – awaiting verified English translation of the judgment.TBC – awaiting verified English translation of the judgment.TBC – awaiting verified English translation of the judgment.Summary withheld until a reliable translation is available. Readers should consult the original Portuguese decision linked in the final column.TBCLink
Lacey v State Farm General InsUSMay 25Google Gemini + Westlaw Precision (CoCounsel)Approximately nine of the 27 legal citations in the ten-page brief were incorrect in some way. At least two of the authorities cited do not exist at all. Additionally, several quotations attributed to the cited judicial opinions were phony and did not accurately represent those materials.Supplemental briefs struck, no further discovery relief, lawyers to pay compensation of $31,100LawyerLink
Ramirez v HumalaUSMay 25UncertainReply letter cited four non-existent cases.$1,000 monetary sanction and service of order on clientLawyerLink
Bandla v SRAUKMay 25Google Searches, not Gen-AIfake or mis-described case authorities in appeal grounds and skeleton.Citation caused grounds of appeal to be struck out as abuse of process. Litigant in Person (former solicitor) Link
Zzaman v Revenue & CustomsUKApr 25UnspecifiedFalse Cases29. However, our conclusion was that Mr Zzaman’s statement of case, written with the assistance of AI, did not provide grounds for allowing his appeal. Although some of the case citations in Mr Zzaman’s statement were inaccurate, the use of AI did not appear to have led to the citing of fictitious cases (in contrast to what had happened in Felicity Harber v HMRC [2023] UKFTT 1007 (TC)). But our conclusion was that the cases cited did not provide authority for the propositions that were advanced. This highlights the dangers of reliance on AI tools without human checks to confirm that assertions the tool is generating are accurate. Litigants using AI tools for legal research would be well advised to check carefully what it produces and any authorities that are referenced. These tools may not have access to the authorities required to produce an accurate answer, may not fully “understand” what is being asked or may miss relevant materials. When this happens, AI tools may produce an answer that seems plausible, but which is not accurate. These tools may create fake authorities (as seemed to be the case in Harber) or use the names of cases to which it does have access but which are not relevant to the answer being sought (as was the case in this appeal). There is no reliable way to stop this, but the dangers can be reduced by the use of clear prompts, asking the tool to cite specific paragraphs of authorities (so that it is easy to check if the paragraphs support the argument advanced), checking to see the tool has access to live internet data, asking the tool not to provide an answer if it is not sure and asking the tool for information on the shortcomings of the case being advanced.
Otherwise there is a significant danger that the use of an AI tool may lead to material being put before the court that serves no one well, since it raises the expectations of litigants and wastes the court’s time and that of opposing parties.Guidance of how to avoid issues. Litigant in Personlink
Saxena v Martínez-Hernández 2025 WL 1194003USApr 25UnspecifiedAI hallucinations that could not be located.Warning that Gen AI citations may trigger sanctionsLitigant in Personlink
Bevins v Colgate-Palmolive CoUSApr 25UnspecifiedBoth citations appear to be artificial intelligence (“AI”) “hallucinations” made up of parts of actual cases [read full footnote 10]Order served on the Bar; appearance stricken; client to be informed and new counsel instructed.LawyerLink
Coomer v Lindell et al (My Pillow)USApr 25Unspecifiedthe Court identified nearly thirty defective citationsOrder to Show Cause directing counsel and firm to explain why they, their clients and in-house lawyers should not be sanctioned and referred to disciplinary authorities, and to certify personal service of the order. LawyerLink
Dehghani v. CastroUSApr 25ChatGPT (likely the handiwork of a ChatGPT-style AI program’s hallucination)Six non-existent authorities $1,500 fine, 1-hour CLE AI/Ethics and report to BarLawyerLink
Benjamin v Costco Wholesale CorpUSApr 25ChatOn (mobile generative-AI app)Reply in support of a motion to remand cited four non-existent cases created by ChatOn$1,000 fine, ordered counsel to serve the order on her client and file proof of serviceLawyerLink
Williams v Capital OneUSMar 25CoCounselNon-existent cases – Judge named the false citations in judgment“It is not acceptable for parties to submit “filings to the Court containing citations to legal authority that does not exist, whether drafted with the assistance of artificial intelligence or not.”Dismissed and warning.Lawyerlink
Nguyen v Savage EnterprisesUSMar 25Uncertain (AI… may have crept in)citing nonexistent authority in briefshow causeLawyerLink
Mid Central Operating EngineersUSFeb 25UnspecifiedThree fake case citations.$15,000 in sanctions (US $5,000 per brief), referral for further professional discipline, and an order that counsel notify.LawyerLink
Unnamed appeal (TJ Santa Catarina, Boletim-TJSC)BrazilFeb 25TBC – awaiting verified English translation of the judgment.TBC – awaiting verified English translation of the judgment.Summary withheld until a reliable translation is available. Readers should consult the original Portuguese decision linked in the final column.TBCLink
Wadsworth v Walmart IncUSFeb 25MX2.lawEight non-existent authoritiesMain counsel’s pro hac vice admission revoked; barred from case; fined US $3,000, plus US $1,000 fines for certain other counsel.LawyerLink
Bunce v Visual Tech. Innovations, IncUSFeb 25ChatGPTNon-existent and mis-characterised casesShow cause, US $2,500 penalty, completion of a one-hour CLE on AI & ethics, and warned further breaches would draw harsher penaltiesLawyerLink
Valu v Minister for ImmigrationAusJan 25UnspecifiedFabricated citationslawyer referred to regulator and caution given about AILawyerlink
Olsen v Finansiel StabilitetUKJan 25UnspecifiedThe appellants submitted a summary of a non-existent case, which was included in their authorities bundle.“I have narrowly and somewhat reluctantly come to the conclusion that I should not cause a summons for contempt of court to be issued to the appellants under CPR rule 81.6. I do not think it likely that a judge (whether myself or another judge) could be sure, to the criminal standard of proof, that the appellants knew the case summary was a fake. They may have known but they could not be compelled to answer questions about the identity of the person who supplied it.”

Mr Justice Kerr
While considering contempt proceedings, the judge ultimately decided against it, noting the appellants’ lack of legal representation and cooperation.Litigant in Personlink
Mavundla v MECSouth AfricaJan 25ChatGPT/Meta (suspected)Cited multiple non-existent or mis-quoted authorities.Referred to Legal Practice Council for investigation and further action. Attorneys to pay costs of certain appearances.LawyerLink
Kohls and Franson v EllisonUSJan 25GPT-4oTwo non-existent academic articles and mis-attributed a third.Expert witness evidence excluded. Expert WitnessLink
United States v HayesUSJan 25UnspecifiedDiscussion about citation hallucination and citation errorShow cause. LawyerLink
Gauthier v Goodyear Tire & RubberUSNov 24Claude (Anthropic)Summary-judgment response quoted two imaginary cases and several fabricated quotations generated with Claude; after a show-cause order, counsel admitted the error$2,000 penalty, required completion of a 1-hour CLE on AI/ethics, and ordered service of the order on the clientLawyerLink
Handa v MallickAusJul 24LEAPGenerated list of authoritiesPractitioner afforded an opportunity to make submissions as to why conduct in tendering the list of authorities should not be referredLawyerlink
Grant v City of Long Beach 96 F.4th 1255USMar 24UnspecifiedFiled an opening brief replete with misrepresentations and fabricated case law. The brief included only a handful of accurate citations, almost all of which were of little use to this Court because they were not accompanied by coherent explanations of how they supported appellants’ claims.Struck brief and dismissed appeal.Lawyerlink
Zhang v ChenCanadaFeb 24ChatGPTCitation of two non-existent cases.Masuhara J. held:

“…Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to a miscarriage of justice…”
Lawyer pay costs personally and review filesLawyerLink
Park v KimUSJan 24ChatGPTSingle non-existent state-court decision cited in the appellant’s reply briefLawyer referred to grievance panel and serve order on client.LawyerLink
Vanguard Construction & Development Co. v 400 Times Square Associates, LLC (2025).US24ChatGPT (alleged)the brief was accused of being AI-generated, but the court found the insinuation unsubstantiatedCourt admonished counsel for making an evidence-free allegation; no sanctions were imposedLawyers (accused not found)Link
Crypto Open Patent Alliance v Dr. Craig Steven WrightUK24ChatGPT (likely, not certain)Series of authorities that do not contain the passages attributed to them. False references unlikely to be deliberate. Principles not in doubt, so the court did not engage further.Litigant in PersonLink
Sala Primera del Tribunal Constitucional – Nota Informativa 90/2024SpainTBCTBC – awaiting verified English translation of the judgment.TBC – awaiting verified English translation of the judgment.Summary withheld until a reliable translation is available. Readers should consult the original Spanish decision linked in the final column.TBCLink
Al-Hamim v Star Hearthstone 2024 COA 128USDec 24UnspecifiedFake authoritiesDeclined sanctions but warned of future penalties for AI misuseLitigant in Personlink
Mortazavi v HamiltonUSSep 24UncertainMotion to remand contained a citation to a non-existent case and other errors produced with AI; declaration (no AI disclosure)Order to Show CauseLawyerLink
DayalAusAug 24Unnamed AI in practice-management suiteNon-existent casesReferred to professional bodyLawyerlink
US v CohenUSMar 24Google BardNon-existent casesCourt declined to impose sanctions for the citation in that motion to nonexistent cases.Lawyer
Kruse v KarlenUSFeb 24Unspecified generative AI via a hired online “consultant” purporting to be an attorney to prepare the Brief. Indicated that the fee paid amounted to less than one percent of the cost of retaining an attorney.Apologised for submitting fictitious cases; did not know; denied intention to mislead.Court branded the appeal frivolous, dismissed it and awarded damages.Litigant in PersonLink
Moffatt v Air CanadaCanadaFeb 24Website customer-service chatbotChatbot suggested passenger could apply for bereavement fares retroactively. Passenger later learned airline did not permit retroactive applications.Negligent misrepresentation. CA$650.88 damages, CA$36.14 interest, and CA$125 tribunal fees.ChatbotLink
Smith v FarwellUSFeb 24UnspecifiedFabricated US Supreme Court cases$2,000 sanction on counsel and warning to bar.Lawyerlink
In re Thomas G. NeusomUSJan 24UnspecifiedInaccurate citations and fabricated authoritiesSuspension and corrective steps.LawyerLink
Will of SamuelUSJan 24Unspecified Multiple fictional citationsFurther hearing set.LawyerLink
Harber v HMRCUKDec 23ChatGPTNine fabricated FTT decisions.Tribunal criticised the wasted time and money caused.Litigant in PersonLink
Thomas v Pangburn 2023 WL 9425765USOct 23Unspecifiedmisleading citationsCourt highlighted dangersLitigant in Person link
People v CrabillUSNov 23ChatGPTCited non-existent cases and mis-described cases.one year and one day suspension, with ninety days to be served and the remainder to be stayed upon successful completion of a two year period of probation, with conditions.LawyerLink
Ex parte LeeUSJul 23UnspecifiedFalse case citationcourt noted brief “may have been prepared by AI” but imposed no sanctions; stressed verification dutyLawyerlink
Parker v Forsyth N.O.South AfricaJun 23ChatGPTCases cited – names and citations are fictitious, the facts are fictitious, and the decisions are fictitious.“The Plaintiff’s attorneys used this artificial intelligence medium to conduct legal research and accepted the results that it generated without satisfying themselves as to its accuracy. As it turned out, the cases listed above do not exist. The names and citations are fictitious, the facts are fictitious, and the decisions are fictitious. The Plaintiff’s counsel was constrained to concede as much.” [87]

A. Chaitram, Regional Magistrate

Judge held the lawyers had shown “undue faith in AI” and, although not wilful, were “over-zealous and careless.”

“The embarrassment associated with this incident is probably sufficient punishment for the Plaintiff’s attorneys.”
LawyerLink
Scott v Federal National Mortgage Ass’n (Me. SupeUSJun 23Unspecifiedfictitious authoritiesComplaint dismissed and Rule 11 sanctions orderedLitigant in Personlink

Generative AI is evolving rapidly, and so are the legal and ethical standards surrounding its use. By following this tracker you can see how courts across the globe respond to fabricated citations, with responses ranging from mild warnings to hefty sanctions. Please keep revisiting: new decisions arrive often, reinforcing the importance of verifying sources and practising diligent, technology-aware advocacy in every jurisdiction.

AI Hallucination Cases Tracker FAQ

Why are they called “Hallucinations”?

This term is commonly used, but amongst lawyers there is some discussion about whether it is the correct expression. For example, in JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976 (Federal Court of Australia, 19 August 2025):

“…Although the term used in relation to erroneously generated references by AI is “hallucinations”, this is a term which seeks to legitimise the use of AI. More properly, such erroneously generated references are simply fabricated, fictional, false, fake and as such could be misleading…”

For the AI hallucination cases tracker the term “hallucinations” is maintained due to its common usage and searchability. This was the term also recently adopted in the updated AI Guidance for Judicial Office Holders:

“Hallucination: AI hallucinations are incorrect or misleading results that AI models generate. These errors can be caused by a variety of factors, including insufficient training data, the model’s statistical nature, incorrect assumptions made by the model, or biases in the data used to train the model.”

My report and link to the full guidance can be read here.

Why don’t you quote the full hallucination and the hallucinated principle, even when judges do?

I am concerned that well-intentioned judges often quote AI hallucinations, including the erroneous legal principles they contain, in full within official judgments to show readers the extent of the problem. In doing so, however, judges may inadvertently exacerbate the issue, because those inaccuracies become indirectly embedded in the established legal canon.

I do record in my own research the full fabricated/false citations for analysis. If you would like to discuss access to this research please contact my clerks.

Why do you often choose to remove lawyers’ names from quotations in your commentary?

I often remove the names of specific lawyers in quoted material to keep the focus on the legal reasoning, judicial comments, or case outcome rather than on individuals. Full details can be read by clicking the links to the judgment.