When Court Data Disappears: Why Transparency in Family Courts Matters More Than Ever

In February 2026, the Ministry of Justice ordered the removal of a major archive of court listing data, citing data protection concerns and alleged misuse involving AI. On the surface, it looked like a dispute about compliance. In reality, it raises a far more serious question: what happens when the justice system becomes less visible? For families navigating private law disputes, safeguarding allegations and prolonged delay, transparency is not a political slogan — it is the difference between understanding how the system works and feeling powerless within it.

Key points (read this first)

  • “Open justice” is not a vibe. It is a constitutional principle: the public must be able to see justice being done — in practice, not just in theory.
  • The Courtsdesk database mattered because it made magistrates’ court activity discoverable at scale — across regions, trends and time — in a way ordinary listings often do not.
  • The MoJ/HMCTS position has centred on data protection and alleged unauthorised sharing with an AI third party (including potentially sensitive identifiers). That is a serious issue — but it doesn’t automatically justify a “delete the archive” outcome.
  • There is now a live policy tension: privacy compliance vs public scrutiny. The correct answer is not to pick one. It is to design lawful access with safeguards.
  • AI changes the stakes. It can expose systemic court failures (delays, inconsistency, outcomes), but it can also amplify privacy harm if governance is weak.
  • What to watch next: licensing frameworks, official listing portals, retention/archiving rules, and whether any independent oversight is built into the “new” regime.

If you only have 60 seconds: the question isn’t “should court data exist?” — it’s “who controls access, under what rules, with what accountability?”

When Court Data Disappears: Courtsdesk, the MoJ Deletion Order, and What “Open Justice” Means in the AI Age

By Jessica Susan Hill | Legal Consultant & McKenzie Friend | JSH Law Ltd

In February 2026, a story surfaced that should make every lawyer, journalist and court-user sit up: the Ministry of Justice (via HMCTS) instructed a private platform, Courtsdesk, to delete what was widely described as the UK’s largest archive of court reporting data. The dispute was framed as a data protection breach involving AI. Critics called it a major blow to open justice.

This isn’t a niche media row. It’s a governance problem with a constitutional wrapper. Because once court information becomes searchable at scale, it becomes auditable. And once the system becomes auditable, it becomes accountable.

1) What happened — and why the link you saw may have “stopped working”

If you clicked a share link to a paywalled newspaper, you’ll often get a broken experience (or a login wall). But the underlying issue is very real: in early-to-mid February 2026, multiple sources reported that the MoJ/HMCTS instructed Courtsdesk to remove court listing/archival data from its platform. The matter was then debated in Parliament, with ministers stating that action was taken because of data protection concerns and alleged unauthorised sharing with an AI company.

In the House of Commons debate on 10 February 2026, the government position was put bluntly: HMCTS stopped sharing data and instructed the company to remove data from its digital platform because the government considered personal data had been put at risk and/or shared in breach of agreement. (Hansard: “Court Reporting Data”). Read the Commons debate (Hansard).

The House of Lords revisited similar themes on 11 February 2026, referencing alleged sharing of “private, personal and legally sensitive information” with a third-party AI company, including potentially addresses and dates of birth of defendants and victims. Read the Lords debate (Hansard).

Meanwhile, journalist bodies and open justice advocates argued that the deletion demand would reduce practical visibility of magistrates’ courts — the engine room of criminal justice — and undermine reporting capacity nationwide. NUJ response (11 Feb 2026).

Subsequent coverage indicated that the government later paused the deletion/purge approach and explored alternative licensing or arrangements, following significant public pressure and campaigning (including within national media). One example: The Times: MoJ halts purge of court archive (published Feb 2026). (Paywalled, but relevant for context and sequence.)

2) What is Courtsdesk — and why journalists cared

Courtsdesk is typically described as a platform that made it easier for journalists to discover and track magistrates’ court hearings — and to keep a searchable archive of what had been listed. The word “archive” matters. Without it, reporting becomes a daily scramble: you can see “today’s” list (sometimes), but you cannot easily analyse what happened across a month, a year, or a decade, and you cannot robustly check what patterns repeat across courts.

That changes the reporting model. Instead of “we got a tip and attended a hearing”, journalists can ask structured questions like:

  • Which courts are repeatedly listing the same offence type and outcome?
  • Are there geographical disparities in sentencing outcomes (controlling for offence and prior record)?
  • Is a particular safeguarding issue rising (domestic abuse, coercive control, breaches, stalking)?
  • Are certain hearings routinely not listed, listed late, or listed inaccurately?
  • Are “open” hearings being effectively closed by practical invisibility?

In short: a discoverable, searchable dataset turns open justice into something measurable. That is precisely why both open justice advocates and public interest reporters reacted so strongly.

For a short overview of the controversy as reported at the time: Legal Cheek (11 Feb 2026). For a more analytical legal-media perspective: Wiggin LLP commentary (16 Feb 2026).

3) The MoJ/HMCTS case: “data protection” and alleged sharing with AI

The government’s public position, as reflected in parliamentary statements, has been that data protection responsibilities were engaged. The allegation was not merely that the data existed, but that data was used or shared in a way that was not authorised by the relevant agreement — and that the information at issue could include sensitive personal identifiers.

In the Commons debate, MPs referenced the passing of information to an AI company, including addresses and dates of birth. You can read the relevant passages directly in Hansard: Court Reporting Data (Commons, 10 Feb 2026). The Lords debate similarly framed the core concern as sharing private/personal legally sensitive information with a third-party AI company: Court Reporting Data (Lords, 11 Feb 2026).

Let’s be clear: if victim or defendant identifiers were exposed or processed without a lawful basis, proper security, or appropriate contractual control, that is not a minor technicality. UK GDPR compliance is not optional — particularly where data could create direct risk (victim location, stalking risk, retaliation, intimidation, vigilante harm).

But there is a second question — and this is where policy and constitutional principles collide: even if a breach occurred, does the proportionate remedy have to be “delete the archive”? Or is the correct remedy:

  • Stop the unauthorised processing,
  • Investigate,
  • Implement governance, redaction, licensing and audit controls,
  • And preserve the public-interest value of the dataset?

In other regulated sectors, “burn the library” is rarely considered an intelligent response to a governance failure. You fix governance. You don’t erase institutional memory.

4) What “open justice” actually requires (and what it doesn’t)

“Open justice” is often described as a constitutional principle in common law: justice must be administered in public, with reporting permitted, because scrutiny is a safeguard against arbitrariness and abuse. It supports legitimacy and public confidence.

But open justice is not absolute. Courts can restrict reporting, anonymise parties, hold parts of hearings in private, or impose reporting restrictions where necessary and proportionate — especially to protect children, victims, national security, or the integrity of proceedings.

Here’s the practical point: open justice collapses when information is technically “available” but realistically undiscoverable. If court lists are incomplete, delayed, inaccurate, scattered, or accessible only through relationships and workarounds, then public scrutiny becomes selective and fragile.

A searchable archive changes the baseline. It doesn’t guarantee perfect scrutiny, but it makes scrutiny possible at scale.

The NUJ response captures the concern in direct terms: the state must take data protection seriously, but journalists are worried about the effect on their ability to do their job. NUJ: deletion order response.

5) The real issue: discoverability, not secrecy

Most people misunderstand how court reporting works. They think journalists can simply “look up” what is happening in court.

In practice, magistrates’ courts are high-volume. Hearings move. Lists change. Data may be published late, inconsistently, or in formats that are difficult to search. Court staff are under pressure. Press offices (where they exist) are stretched. The result is that what is formally “public” can become practically opaque.

So when people say “this undermines open justice,” they may not mean “the government is hiding a single case.” They mean: remove the infrastructure of discoverability and you reduce systemic scrutiny.

The wider concern is that once the system is not audited at scale, dysfunctional patterns persist:

  • Overlisting and adjournment churn;
  • Chronic delay;
  • Inconsistent listing practices;
  • Variable use of reporting restrictions;
  • Localised cultures that drift without challenge.

This is where AI becomes relevant — not as hype, but as a tool. AI is exceptionally good at extracting patterns from messy, fragmented data. And patterns are exactly what the justice system needs to be forced to confront.

6) AI: the uncomfortable accelerator of accountability

Here is the uncomfortable truth: AI makes “open justice” more powerful, because it can transform raw listings and outcomes into insight:

  • Where are outcomes diverging without explanation?
  • Which courts are systematically underperforming on timeliness?
  • Which offence types are rising or falling?
  • Do bail decisions correlate with geography in ways that look unjustified?
  • Are certain safeguarding concerns being deprioritised?

For the public, this can mean better scrutiny and informed reform. For institutions, it can feel like a loss of narrative control.

But AI also increases privacy risk. Aggregation is a form of power: data that is safe in one context can become dangerous in another when combined, enriched, or made searchable. That is why governance matters.

The question is not “AI or no AI.” It is: who is allowed to process court data with AI, under what licence, with what redaction, with what audit trail, and with what sanctions for misuse?

7) Data protection and open justice can coexist — if you design for both

If there was an unauthorised transfer of personal data to a third-party AI provider, that needs to be addressed. Strongly. But the correct fix is not necessarily deletion. The correct fix is a governance framework that takes seriously both:

  1. Lawful processing and security (UK GDPR; DPA 2018; contractual controls; access logs; DPIAs); and
  2. Open justice functions (discoverability; auditability; press access; public interest research).

A mature framework would include:

(A) Role-based access

Not everyone needs the same level of detail. A press-accredited journalist may need more than the general public. An academic researcher may need a structured dataset but not identifiers. The safe model is tiered access with clear rules.

(B) Default minimisation and redaction

Listings can be published in a way that is still meaningful but reduces harm: names may be necessary for open justice in many cases, but addresses and dates of birth generally aren’t. A “privacy by design” listing format is possible.
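A minimal sketch of what a "privacy by design" listing could look like in code. The record shape and field names (`RawListing`, `defendant_name`, and so on) are illustrative assumptions, not any real HMCTS schema:

```python
from dataclasses import dataclass

# Hypothetical raw listing record; field names are illustrative only.
@dataclass
class RawListing:
    defendant_name: str
    date_of_birth: str   # "YYYY-MM-DD"
    address: str
    court: str
    offence: str
    hearing_date: str

def minimise(raw: RawListing) -> dict:
    """Produce a publishable listing: the name and case detail survive,
    while direct identifiers (address, full DoB) are dropped or truncated."""
    return {
        "defendant_name": raw.defendant_name,  # often needed for open justice
        "birth_year": raw.date_of_birth[:4],   # disambiguation without a full DoB
        "court": raw.court,
        "offence": raw.offence,
        "hearing_date": raw.hearing_date,
        # address deliberately omitted: publication creates location/safety risk
    }
```

The point of the sketch is that minimisation is a design decision made once, at the publication boundary, rather than something each downstream user is trusted to do.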

(C) Contractual control over processors

If AI tools are used, the relationship between controller and processor must be contractually controlled, audited, and limited. “Testing” is still processing. “Internal development” is still processing.

(D) Audit logs and sanctions

If a platform is given access to sensitive data, there must be a reliable audit trail and enforceable consequences for misuse.

This is the kind of approach the state should model. It’s what we demand of the private sector. The justice system should not be a governance laggard.

8) “Just use official channels” is not a sufficient answer

One argument raised in public discussion is that journalists can still access listings through official HMCTS channels, so the deletion of a private archive is not fatal.

Here’s the hard reality: official availability does not necessarily equal practical usability. The difference between:

  • a fragmented set of daily lists, and
  • a searchable, longitudinal archive

is the difference between “seeing a hearing” and “auditing a system”.

It’s the audit function that scares people — and it’s the audit function that reform needs.

For contemporaneous legal-sector analysis and a timeline-style overview, see: Wiggin LLP commentary.

9) The proportionality question: why “delete it” feels extreme

When government acts, it must act proportionately — especially when its actions collide with constitutional principles.

If the problem was a specific breach, a proportionate response normally looks like:

  • Stop the unlawful processing immediately;
  • Preserve evidence;
  • Investigate scope and impact;
  • Notify where legally required;
  • Fix governance;
  • Implement redaction and access controls;
  • Resume service under a compliant licence.

Deleting a historic archive can be justified in certain cases — for example, if the archive itself is irredeemably unsafe and cannot be lawfully held. But that is a high threshold. And if that threshold is met, the next question is: why was the data shared in that form in the first place, and why was it not already governed appropriately?

Open justice is a public asset. When you destroy an archive that underpins scrutiny, you don’t merely “solve” a compliance problem — you erase a public accountability mechanism.

10) What this means for litigants, victims and the public

This is not only about journalists. It touches:

Victims and vulnerable witnesses

Privacy matters. Safety matters. If addresses/DoBs are handled recklessly, it can cause real-world harm. A governance regime must centre safeguarding and risk. The state is right to be strict about that.

Defendants

Defendants have rights too. Public identification can be lawful and appropriate in open court, but bulk data aggregation can create long-tail harm (employment, housing, vigilantism), particularly where cases end in acquittal or discontinuance. This is why minimisation and careful retention rules matter.

The public

The public interest in open justice is not abstract. It includes the ability to scrutinise how domestic abuse is treated, how repeat offenders are sentenced, how grooming cases are prosecuted, and whether systemic failures are being ignored.

The debate is often framed as “privacy vs transparency.” A better framing is: “privacy and transparency with engineering-grade governance.”

11) A practical blueprint for a lawful court data ecosystem

If we want open justice that survives the AI era, we need to stop improvising and start designing. Here is a blueprint that would satisfy most of the legitimate concerns on all sides:

  1. Define a canonical “public listing dataset” with minimised fields (no addresses; no full DoB; protect victims by default where appropriate).
  2. Publish in a consistent, machine-readable format so that “discoverability” is not dependent on private scraping or informal relationships.
  3. Implement a press and research licence with tiered access, clear contractual controls, audit logs, and enforcement.
  4. Create a secure research environment (think “data safe haven”) where higher-sensitivity data can be used for public-interest research under supervision.
  5. Mandate DPIAs for any new processing at scale, including any AI model training or automated analytics.
  6. Independent oversight: an external advisory panel including press, victims’ advocates, privacy experts and court users.

If you work in legal ops, you’ll recognise this: it is the same control architecture we use for health data, financial data, and regulated client data. The justice system deserves no less.
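The tiered-access idea in the blueprint (point 3) can also be sketched in a few lines. The tier names and the assignment of fields to tiers below are assumptions for illustration, not a proposed policy:

```python
from enum import Enum

class Tier(Enum):
    PUBLIC = 1
    ACCREDITED_PRESS = 2
    SECURE_RESEARCH = 3

# Illustrative field-to-tier mapping; the real split would be set by policy.
FIELD_TIERS = {
    "court": Tier.PUBLIC,
    "offence": Tier.PUBLIC,
    "hearing_date": Tier.PUBLIC,
    "defendant_name": Tier.ACCREDITED_PRESS,
    "outcome": Tier.ACCREDITED_PRESS,
    "birth_year": Tier.SECURE_RESEARCH,
}

def view_for(record: dict, tier: Tier) -> dict:
    """Return only the fields this access tier is licensed to see.
    Unknown fields are never released by default."""
    return {k: v for k, v in record.items()
            if k in FIELD_TIERS and FIELD_TIERS[k].value <= tier.value}
```

Note the default-deny behaviour: a field that has not been explicitly classified is simply never published, which is the inverse of the "share everything, then apologise" pattern.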

12) What you can do if you care about this

  • Read the parliamentary record and compare the stated rationale with the real-world impact: Commons Hansard (10 Feb 2026) and Lords Hansard (11 Feb 2026).
  • Track journalist-body positions (NUJ is a good start): NUJ statement.
  • Ask the right question of policymakers: “What is the new lawful access model — and who is responsible for ensuring discoverability in practice?”
  • Watch for licensing/market engagement notices and consultation opportunities. (Legal commentary sites often summarise these quickly.)
  • If you are a court user or practitioner, keep records. Transparency is partly built from bottom-up documentation — hearing notices, listings, orders, reasons, and procedural history.

Because here is the punchline: if the system cannot be seen, it cannot be improved. And if it cannot be improved, it cannot be trusted.


Regulatory & Editorial Notice (JSH Law Ltd)

This article is published for general information and public-interest commentary only. It does not constitute legal advice and should not be relied upon as such. JSH Law Ltd is not a firm of solicitors and does not provide regulated legal services. If you require legal advice, you should consult a suitably qualified and regulated legal professional.

Where this article refers to third-party reporting, parliamentary materials, organisations, or public cases, it does so for journalistic, educational, and research purposes. External links are provided for reader convenience; JSH Law Ltd is not responsible for the content of external sites.

© JSH Law Ltd | Company No. 16870438 | Manchester (UK) & Kansas (USA)

Access to Justice Will Not Improve Until Litigants in Person Are Treated as First-Class Legal Tech Users


Why courts, regulators, and legal-tech designers must stop building only for lawyers

“Access to justice” is one of the most repeated phrases in modern legal reform — and one of the least honestly examined in day-to-day court reality.

Across England and Wales, litigants in person (LiPs) now make up a significant proportion of users in family proceedings, civil disputes, tribunals and administrative processes. Yet much of the system — and much of legal tech — still assumes that a lawyer is the default user, and the unrepresented party is the exception.

They are not.

LiPs are a structural feature of the justice landscape. Until courts, regulators, and legal-tech providers explicitly recognise LiPs as first-class stakeholders, “access to justice” will remain aspirational rather than operational.

Key takeaways

  • Litigants in person are not marginal — they are central to how courts now function.
  • Legal tech designed only for lawyers often creates disadvantage for LiPs.
  • Courts can reduce chaos by setting clearer procedural standards and roadmaps.
  • Regulators can unlock innovation by clarifying the line between navigation support and legal advice.
  • Human-centred tools can improve compliance, fairness and efficiency without replacing lawyers.

1. The post-LASPO reality: LiPs are the system, not a problem within it

In a post-LASPO environment, it is common for one or both parties to be unrepresented. That reality increases pressure on judges, listing, court staff, and the opposing party (who may be represented). It also increases the risk of:

  • missed deadlines and procedural missteps
  • overlong or irrelevant bundles
  • adjournments and delay
  • hearings spent explaining process rather than determining issues
  • avoidable unfairness

These are not personal failings. They are predictable outcomes when systems are built around assumptions that no longer match real users.

2. Why most legal-tech tools fail litigants in person

Many tools that work well for professionals become actively unhelpful when applied to LiPs without redesign. Legal platforms typically assume users can:

  • interpret procedural stages and sequencing
  • identify which evidence is relevant (and why)
  • understand directions, service rules, and deadlines
  • use legal terminology accurately
  • separate emotion from issues and evidence

LiPs often cannot do those things consistently — not because they lack intelligence, but because court procedure is never taught to them, and the learning curve is steep under stress.

What this looks like in practice

When LiPs are unsupported, courts see repeat patterns: missed deadlines, misfiled documents, sprawling narratives, under-evidenced allegations, and confusion about what the court is deciding at each stage. These patterns are not random — they are design signals.

3. What courts must do: procedural clarity (not paternalism)

Courts are not powerless. A high-LiP environment requires courts to treat process design as part of justice delivery.

At minimum, courts should publish LiP-aware standards that clearly define:

  • core document types (e.g., chronology, statement, position statement, schedule of allegations/concerns where relevant)
  • what is needed at each stage (first hearing, directions, fact-finding, final hearing)
  • proportionality expectations for evidence and bundles
  • how to comply with directions and what happens if parties do not

Judges often explain process in court. The problem is inconsistency, stress, and the lack of a repeatable structure. Written roadmaps and standardised expectations reduce friction for everyone.

4. The regulator’s role: legitimising navigation tools without fear

One of the biggest barriers to LiP-focused legal tech is regulatory uncertainty. Developers and support services are often risk-averse because they fear crossing into “legal advice”.

Regulators can unlock responsible innovation by drawing a clearer line between:

  • procedural navigation (what the process is, what documents are, how to organise information, how to comply with directions), and
  • legal advice (what someone should do legally, the merits of their case, or how the court is likely to decide).

Navigation support vs legal advice (simple framework)

Usually safe procedural support:

  • Process: explaining stages (e.g., directions → fact-finding → final hearing)
  • Compliance: helping track deadlines and service requirements
  • Organisation: structuring a chronology, index, exhibits, bundle sections
  • Plain English: translating court orders into clear tasks

Usually crosses into legal advice:

  • Merits: advising whether someone should apply/oppose
  • Strategy: recommending what to plead or concede
  • Outcomes: predicting likely judicial findings/results
  • Representation: acting as if solicitor-client duties exist

5. What “LiP-first” legal tech actually looks like

LiP-centred legal tech does not have to be “AI giving legal advice”. The biggest gains come from tools that help people:

  • understand where they are in the process
  • know what is expected next
  • organise information coherently
  • comply with directions and deadlines
  • present evidence in proportionate, readable form

Simple flow diagram: How LiP-first tools reduce friction

  1. Courts publish clear standards: document types, stage-by-stage roadmaps, proportionality, bundle structure.
  2. Regulators clarify boundaries: navigation/compliance tools are legitimised, and the "legal advice" line is explicit.
  3. Legal tech designs to the standard: guided workflows (timelines, bundles, checklists, deadlines, plain-English orders).
  4. LiPs comply more easily: better documents, fewer adjournments, clearer issues, fairer hearings.

This is not about replacing lawyers. It’s about reducing avoidable failure points and making procedure intelligible.
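As a concrete, hypothetical example of navigation support rather than advice, a LiP-first tool might turn the directions in a court order into a sorted deadline list. The directions below are invented for illustration:

```python
from datetime import date, timedelta

# Hypothetical directions extracted from a court order; purely illustrative.
DIRECTIONS = [
    ("File witness statement", date(2026, 3, 2)),
    ("Serve bundle index", date(2026, 3, 16)),
    ("Agree final bundle", date(2026, 3, 30)),
]

def upcoming(directions, today, horizon_days=14):
    """Return tasks due within the horizon, soonest first.
    This is 'what is expected next' navigation, not a view on the merits."""
    cutoff = today + timedelta(days=horizon_days)
    due = [(task, d) for task, d in directions if today <= d <= cutoff]
    return sorted(due, key=lambda item: item[1])
```

Nothing in this sketch touches merits, strategy, or likely outcomes; it only makes compliance with existing directions visible, which is exactly the side of the line the framework above marks as safe.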

6. Why co-design matters: building with, not for, litigants

The most credible way to improve tools for LiPs is co-design: courts, regulators, practitioners, support services, and litigants all informing the build. Without LiPs at the table, products will keep optimising for the wrong user — and courts will keep absorbing the cost.

7. The cost of doing nothing

When systems ignore their dominant user group, the impact is predictable:

  • longer hearings and heavier judicial case management
  • more procedural unfairness and inconsistent outcomes
  • greater emotional and financial harm (especially in family cases)
  • higher public cost through delay and repeat applications

LiP-first design is not only a fairness issue — it is a system efficiency issue.

8. A realistic path forward

Access to justice improves when:

  1. Courts set clear procedural standards and publish roadmaps designed for LiP reality.
  2. Regulators legitimise navigation and compliance tools, and make boundaries explicit.
  3. Legal-tech teams design for human understanding, not just professional efficiency.
  4. LiPs are treated as stakeholders in system design, not problems to be managed.

Call to action

If you are a litigant in person struggling with process — or you work in legal tech, policy, or court-facing innovation — this is a space where practical collaboration matters.

JSH Law works at the intersection of family justice, legal process, and responsible AI-assisted navigation, with a focus on making systems intelligible for real people (not just professionals).

  • Need help structuring a chronology, bundle, or evidence set?
  • Building LiP-centred tools and want practitioner input?
  • Want a repeatable workflow that improves compliance and reduces stress?

Get in touch via the contact page

Regulatory & Editorial Notice (JSH Law)
This article is published for general information and public legal education. It is not legal advice and should not be relied upon as such. Laws, procedural rules, guidance and practice may change. Where this article refers to third-party materials, organisations, or public-interest issues, those references are informational and do not imply endorsement. If you need advice on your specific circumstances, you should obtain independent legal advice from a regulated professional or appropriate support service.