The Use of AI in Preparing Court Documents: Why the Civil Justice Council Consultation Matters

The Civil Justice Council has launched an eight-week consultation examining whether new rules are needed to regulate the use of artificial intelligence in preparing court documents. Chaired by Lord Justice Birss, the Working Group is considering whether safeguards or formal declarations should apply when legal representatives use AI to draft pleadings, witness statements and expert reports. The consultation recognises both the efficiency benefits of AI and the risks of hallucinated case citations, fabricated authorities and evidential integrity concerns. Particular focus is placed on witness statements and expert evidence, where authenticity is central to the administration of justice. The consultation closes on 14 April 2026. This article explains what is being proposed, why it matters for litigants in person and legal professionals, and how responsible AI use can strengthen — rather than undermine — credibility in court proceedings.

Category: AI & Law / Procedural Updates  |  Audience: Litigants in Person & Legal Professionals (England & Wales)

Key takeaways for litigants in person

  • The Civil Justice Council (CJC) is consulting on whether rules should govern the use of AI in preparing court documents.
  • The consultation closes on 14 April 2026.
  • Proposals include possible declarations where AI has been used to generate substantive content.
  • Administrative uses (spell-check, transcription, formatting) are unlikely to require disclosure.
  • Witness statements and expert reports are likely to face stricter safeguards.

What Is This Consultation About?

The Civil Justice Council (CJC) has published an Interim Report and opened an eight-week consultation examining whether procedural rules are needed to regulate the use of artificial intelligence in preparing court documents.

The Working Group is chaired by Lord Justice Birss and includes members of the judiciary, the Bar Council, the Law Society and academic representatives.

The core question is simple but significant:

Should formal rules govern how legal representatives use AI when preparing pleadings, witness statements, skeleton arguments and expert reports?

The consultation paper explains that AI has enormous potential benefits — but also significant risks, particularly around hallucinated case citations, fabricated material and evidential integrity.

Why This Matters

AI is already being used across the legal sector for:

  • Legal research
  • Drafting pleadings
  • Preparing skeleton arguments
  • Summarising disclosure
  • Drafting witness statements
  • Generating expert reports

The consultation recognises that while AI improves efficiency and access to justice, it also introduces risks including:

  • Hallucinated case citations
  • Invented legal authorities
  • Embedded bias in generated content
  • Deepfake or manipulated evidence
  • Hidden metadata (“white text”) manipulation

The administration of justice depends on reliability. If courts cannot trust documents filed before them, confidence in the system erodes.

What the Working Group Proposes

The consultation distinguishes between:

  • Administrative uses (spell-check, formatting, transcription, accessibility tools)
  • Substantive generative uses (AI drafting legal argument, evidence, or expert analysis)

The Working Group’s emerging position suggests:

  • No additional rule required for statements of case or skeleton arguments, provided a legal professional takes responsibility.
  • Stricter controls for witness statements, particularly trial statements.
  • Possible declarations confirming AI has not generated witness evidence.
  • Amendments to expert report statements of truth to require disclosure of AI use.

Witness Statements: The Most Sensitive Area

The report strongly indicates that generative AI should not be used to create or alter substantive witness evidence.

The concern is straightforward:

  • Witness statements must be in the witness’s own words.
  • AI “improving” phrasing may alter tone, emphasis or meaning.
  • Courts rely heavily on authenticity.

The Working Group proposes a declaration that AI has not been used to generate, embellish or rephrase evidence in trial witness statements.

That is significant. It signals that evidential integrity is where regulation will likely concentrate.

Expert Reports: Transparency Rather Than Prohibition

Unlike witness statements, expert reports may legitimately use AI tools for:

  • Data analysis
  • Document extraction
  • Technical modelling

However, the consultation proposes that experts should disclose and explain any AI use beyond administrative functions.

The aim is transparency — not prohibition.

What About Litigants in Person?

Notably, this consultation does not focus on regulating litigants in person.

The paper recognises that many unrepresented parties may rely on AI as their only accessible form of legal assistance.

That presents a policy tension:

  • AI can improve access to justice.
  • But AI can generate inaccuracies.
  • Litigants may lack the expertise to verify output.

Any regulation must therefore balance fairness with accessibility.

Should There Be Mandatory AI Declarations?

International approaches vary. Some US courts require certification of AI use. Others do not.

The Working Group is cautious. It recognises that:

  • AI is rapidly integrating into legal software.
  • It may soon be impossible to distinguish discrete “AI use” from ordinary software-assisted drafting.
  • Over-regulation may increase delay and satellite litigation.

The likely direction appears to be:

  • No blanket declaration for routine drafting.
  • Targeted safeguards for evidence.
  • Clear professional responsibility.

Why This Consultation Is Forward-Looking

AI is not going away. The question is not whether it will be used — but how responsibly.

The consultation reflects a mature approach:

  • Encourage innovation.
  • Protect evidential integrity.
  • Preserve public confidence.
  • Avoid stifling access to justice.

That balance is critical.

How to Respond to the Consultation

The consultation closes on 14 April 2026.

Responses can be submitted by completing the consultation cover sheet and sending it to:

CJC.AI.consultation@judiciary.uk

Questions about the process can be directed to:

CJC@judiciary.uk

Responses may be submitted in Word or PDF format.

What This Means Practically

If you are preparing court documents using AI:

  • Verify all case citations manually.
  • Check statutory references independently.
  • Do not use AI to generate witness evidence.
  • Retain responsibility for every word filed.

AI is a tool. It is not a shield.

A Realistic Perspective

Used responsibly, AI enhances efficiency. Used carelessly, it damages credibility.

The Civil Justice Council is not proposing a ban. It is seeking proportionate governance.

That distinction matters.


Book a 15-minute consultation (phone)

If you are navigating litigation and considering using AI tools, or if you are concerned about AI-generated material in your case, you can book a 15-minute telephone consultation.

Technology should strengthen your case — not undermine it.


Regulatory & Editorial Notice

This article provides general commentary only and does not constitute legal advice. JSH Law provides litigation support services to litigants in person and does not conduct reserved legal activities. References to consultation materials are for informational purposes only.

You can download the PDF here: Interim-Report-and-Consultation-Use-of-AI-for-Preparing-Court-Documents-2.pdf
