AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations)

AI hallucination cases can, and do, reach real courtrooms and have real consequences. However, there is some dispute about the correct terminology for this phenomenon:

“…Although the term used in relation to erroneously generated references by AI is “hallucinations”, this is a term which seeks to legitimise the use of AI. More properly, such erroneously generated references are simply fabricated, fictional, false, fake and as such could be misleading…”
JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976 (Federal Court of Australia, 19 August 2025)