Government AI Hallucination Tracker


The Government AI Hallucination Tracker examines instances where false, fabricated, or legally unreliable material (often called “hallucinations”) appears within official government policy documents, consultation papers, reports, and legislative materials. This includes both AI-generated hallucinations and non-AI citation errors, such as invented case law, misquoted judgments, inaccurate statutory references, or misleading factual claims.


Deep Fakes in Civil Litigation: Why CPR 32.19 Is Essential for Protecting Evidence Integrity


“…AI tools are now being used to produce fake material, including text, images and video. Courts and tribunals have always had to handle forgeries, and allegations of forgery, involving varying levels of sophistication. Judges should be aware of this new possibility and potential challenges posed by deepfake technology”
AI Guidance for Judicial Office Holders


Updates on False Legal Citations Cases in the E&W High Court: Could Good Judicial Intentions Lead to Unintended Consequences?

“… I read their brief, was persuaded (or at least intrigued) by the authorities that they cited, and looked up the decisions to learn more about them – only to find that they didn’t exist. That’s scary. It almost led to the scarier outcome (from my perspective) of including those bogus materials in a judicial order. Strong deterrence is needed to make sure that attorneys don’t succumb to this easy shortcut”

Judge Wilner - Lacey v State Farm


AI Equality, Bias and AI Discrimination Case Tracker


The AI Equality, Bias and AI Discrimination Case Tracker focuses on court and tribunal cases involving equality, bias, and discrimination linked to the use of AI. It highlights both the challenges and opportunities that AI brings to equality law, capturing examples where AI systems have contributed to unfair outcomes as well as where they have been used to promote fairness, accessibility and inclusion.


Judicial AI Use Tracker (How are Judges Using AI?)


The Judicial AI Use Tracker monitors how judges are using AI and records any official guidance, commentary, or policy developments on judicial use.

This tracker captures both the benefits and challenges of AI in judicial contexts, ranging from tools that enhance efficiency, consistency, and access to justice, to cases and discussions highlighting risks. It aims to provide a balanced picture of how AI is shaping judicial reasoning, case management, and evidence assessment.


AI Hallucination Cases Tracker (AI and non-AI fabricated/false citations)


AI hallucination cases can, and do, reach real courtrooms and have real consequences. There is, however, some dispute about the correct terminology for this phenomenon:

“…Although the term used in relation to erroneously generated references by AI is "hallucinations", this is a term which seeks to legitimise the use of AI. More properly, such erroneously generated references are simply fabricated, fictional, false, fake and as such could be misleading...”
JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976 (Federal Court of Australia, 19 August 2025)
