
Fabricated Citations: 11th Incident in UK Revisited in UT (IAC): Hamid Judgment on AI Hallucinations in MS (Bangladesh) v SSHD


"The Divisional Court has provided guidance in the case of R (Ayinde) v London Borough of Haringey, Al-Haroun v Qatar National Bank QPSC [2025] EWHC 1383 (Admin) that the consequence of using AI large language models in a way which results in false authorities being cited is likely to be referral to a professional regulator, such as the BSB or SRA, as it is a lawyer’s professional responsibility to ensure that checks on the accuracy of citation of authority or quotations are carried out using reputable sources of legal information. Where there is evidence of the deliberate placing of false material before the Court, police investigation or contempt proceedings may also be appropriate."

14th UK Case (Employment) Kuzniar v GDC: Who Should Bear the Legal Costs Where a Litigant Cites False Citations/AI Hallucinations?


This case illustrates the practical challenges that arise when litigants rely on AI-generated legal content without adequate verification. The Tribunal carefully balanced the Respondent’s wasted costs against the Claimant’s honest but misguided use of AI, ultimately declining to make a costs award. The judgment reflects some tribunals’ willingness to treat such mistakes with leniency, while still recognising the additional burdens they place on opposing parties.

An Introduction to Natural & Artificial Intelligence in Law Blog – AI Law, Rights, and Technology


Welcome to Natural and Artificial Intelligence in Law, a professional resource at the intersection of AI law, human rights, equality, housing and civil justice. Curated by barrister Matthew Lee and now enjoying an international readership, the blog offers expert commentary, practical guidance, and live trackers of AI law developments. Read all about the project here.

AI Hallucinations Derail Hearing in Melbourne Murder Case: Lawyers, Including King’s Counsel, Held Responsible

"AI hallucinations are not a minor inconvenience. They are a professional risk that can catch out even the most experienced practitioners and judges. Sometimes this comes down to simple carelessness, but sometimes it is because of how convincing and subtle these errors can be, especially when we are working under the extreme pressures so common in this profession. That is why, as a profession, I believe we need to keep talking about them openly. We are all vulnerable. Even those who have told me, “I completely avoid AI,” cannot truly do so. It may appear in your instructions, in witness evidence, or in your opponent’s work, and you have a duty to recognise and address it."