Deep Fakes in Civil Litigation: Why CPR 32.19 Is Essential for Protecting Evidence Integrity

“…AI tools are now being used to produce fake material, including text, images and video. Courts and tribunals have always had to handle forgeries, and allegations of forgery, involving varying levels of sophistication. Judges should be aware of this new possibility and potential challenges posed by deepfake technology”
AI Guidance for Judicial Office Holders

Ad/Marketing Communication

This legal article/report forms part of my ongoing legal commentary on the use of artificial intelligence within the justice system. It supports my work in teaching, lecturing, and writing about AI and the law and is published to promote my practice. Not legal advice. Not Direct/Public Access. All instructions via clerks at Doughty Street Chambers. This legal article concerns AI Law.

Deep Fakes in Civil Litigation

Introduction

In civil procedure, if a party discloses a document and you do not serve a notice to prove under CPR 32.19, you are taken to admit the authenticity of that document. This rule is often overlooked, but in an age of “deep fakes”, and before the Civil Procedure Rules (CPR) are amended, it is one of the most important to keep in mind. In this article I consider deep fakes in civil litigation and one of the most important procedural safeguards available against them.

What Are Deep Fakes?

Deep fakes are artificially generated or manipulated images, audio, or videos that use advanced machine learning techniques, often involving neural networks, to create highly realistic fabrications. They can show events that never happened, make people appear to say things they never said, or alter the content of genuine documents or media.

Warnings have been given in numerous guidance documents. The AI Guidance for Judicial Office Holders currently states:

7) Be aware that court/tribunal users may have used AI tools…

“…AI tools are now being used to produce fake material, including text, images and video. Courts and tribunals have always had to handle forgeries, and allegations of forgery, involving varying levels of sophistication. Judges should be aware of this new possibility and potential challenges posed by deepfake technology”

The Online Disinformation and AI Threat Guidance for electoral candidates and officials also outlines risks from generative AI:

  • Fake Text: Easily generated content that can mislead or confuse. Example: Bots causing voter confusion in London mayor elections (BBC News).
  • Fake Images: Manipulated images affecting voter trust and turnout.
  • Fake Videos (Deepfakes): Videos falsely representing individuals in misleading scenarios. Example: Martin Lewis deepfake scam (BBC News).
  • Fake Audio: Artificial audio creating deceptive narratives. Example: Fake Biden robocalls misguiding voters (BBC News).

Deep fakes in civil litigation could appear where a party submits what looks like legitimate evidence (say, a purported email exchange or a scanned copy of a contract) and relies on its authenticity. If that material has in fact been artificially created or manipulated, it could undermine the entire evidential basis of a case.

Why CPR 32.19 Is Critical for Challenging Deep Fakes in Civil Litigation

The admission of authenticity by default (through not serving a notice to prove under CPR 32.19) could have serious implications if a deep fake is mistakenly taken as genuine evidence. Once authenticity is admitted, it is considerably harder to challenge the document’s veracity later. This is why promptly reviewing disclosed documents and deciding whether to serve a notice to prove is crucial in safeguarding against the threat of manipulated or falsified evidence.

Key Provisions of CPR 32.19

The rule itself is straightforward, but its implications for deep fakes are significant. As of today, the Rule reads:

Notice to admit or produce documents – 32.19

(1) A party shall be deemed to admit the authenticity of a document disclosed to him under Part 31 (disclosure and inspection of documents) unless he serves notice that he wishes the document to be proved at trial.

(2) A notice to prove a document must be served –

(a) by the latest date for serving witness statements; or

(b) within 7 days of disclosure of the document, whichever is later.

In short, if you don’t challenge the document in accordance with this rule, you are “deemed to admit the authenticity” of the document. There may be circumstances where this rule does not form a complete barrier to certain challenges, but that is beyond the scope of this article.

In Redstone Mortgages Ltd v B Legal Ltd [2014] EWHC 3390 (Ch), Mr Justice Norris explained what happens when a document’s authenticity is challenged:

“57.  Requiring a party to “prove” a document means that the party relying upon the document must lead apparently credible evidence of sufficient weight that the document is what it purports to be. The question then is whether (in the light of that evidence and in the absence of any evidence to the contrary effect being adduced by the party challenging the document) the party bearing the burden of proof in the action has established its case on the balance of probabilities. Redstone cannot (by a refusal to admit the authenticity of a document) transfer the overall burden of proof onto B Legal, any more than it could do so simply by refusing to admit a fact.

58.  The question is therefore whether any evidence as to the provenance of the document has been produced, and if it has then whether (although not countered by any evidence to the contrary) such evidence is on its face so unsatisfactory as to be incapable of belief. It is vital that the process of challenge is fair. Criticism of the evidence about the authenticity of the document cannot amount to a covert and unpleaded case of forgery. If a case of forgery is to be put then the challenge should be set out fairly and squarely on the pleadings (and appropriate directions can be given)…”

The decision distinguishes between a simple challenge to authenticity and a direct allegation of forgery. This distinction is also relevant to deep fakes, because not every challenge to authenticity equates to claiming that another party has committed deliberate fraud or forgery. For further consideration of this point, see the discussion by Mr Justice Mellor in Crypto Open Patent Alliance v Wright [2023] EWHC 2642 (Ch), where from paragraph 46 he analyses “Authenticity Challenges vs Forgery Allegations”.

Comment

In my opinion, the real consequences of deep fakes in civil litigation haven’t fully surfaced yet, but that moment is approaching quickly. The existence of deep fakes in civil litigation threatens the reliability of evidence, as even careful observers can be deceived by expertly fabricated documents or recordings. Serving a notice to prove under CPR 32.19 can play a critical role in deterring attempts to rely on falsified documents and in prompting courts to scrutinise the origins, metadata, or supporting evidence for a document or recording.

Some final important points on deep fakes in civil litigation:

  1. Lawyers must be vigilant and be ready to challenge potential deep fakes in civil litigation. If this is missed, or a notice to prove is not served in accordance with the CPR, lawyers may face allegations of negligence.
  2. It is unclear how people will prove the veracity of documents in this new landscape. Once a notice to prove is served, the proponent of the document must lead credible evidence of its authenticity, and this may now require expert evidence. Parties may need to engage forensic experts, such as digital analysts, who can investigate metadata or detect signs of manipulation in videos, images, or text, although this will depend on the context. The costs of such an exercise are likely to be significant.
  3. I think it is time for CPR 32 to be completely revisited in light of deep fakes in civil litigation. The court simply does not have rules tailored specifically to address the complexities of modern technology. In particular, questions around who bears the burden of proof, the role of forensic experts, and the protocol for handling suspicion of digital manipulation need clearer guidance. I will revisit this topic on its own shortly.
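On the second point above, much will turn on fixing exactly what was disclosed and when. As a purely illustrative sketch (in Python, and no substitute for expert forensic analysis), one practical, low-cost first step is to record a cryptographic hash of each disclosed file on receipt, so that any later substitution or alteration of the file can be detected by re-hashing and comparing digests:

```python
import hashlib


def file_fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a document's raw bytes.

    Recording such a digest at the point of disclosure fixes the exact
    content received; if the file is later substituted or manipulated,
    re-hashing it will produce a different digest.
    """
    return hashlib.sha256(data).hexdigest()


# Two byte-for-byte identical documents share a digest; altering
# even a single byte produces a completely different one.
as_disclosed = b"Contract text, version as disclosed."
altered_copy = b"Contract text, version as disclosed!"

assert file_fingerprint(as_disclosed) == file_fingerprint(as_disclosed)
assert file_fingerprint(as_disclosed) != file_fingerprint(altered_copy)
```

A hash of this kind shows only that the bytes have not changed since the digest was taken; it says nothing about whether the document was genuine in the first place, which remains a matter for forensic examination and the evidence described above.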

Finally, remember that challenging authenticity is not the same as alleging forgery. If you wish to make a direct accusation of forgery, which you may well consider if you suspect a deliberate deep fake, that must be clearly and carefully pleaded in line with professional obligations. This allows both sides to engage in a fair process to present evidence and argument.

What are your thoughts on this issue? Do current civil procedure rules adequately protect against the emerging threat of deep fakes in civil litigation, or is a fundamental update urgently needed? The discussion continues on LinkedIn and my Substack, and don’t forget to subscribe to my newsletter here for more insights and updates on emerging AI legal challenges.