When Court Data Disappears: Why Transparency in Family Courts Matters More Than Ever
In February 2026, the Ministry of Justice ordered the removal of a major archive of court listing data, citing data protection concerns and alleged misuse involving AI. On the surface, it looked like a dispute about compliance. In reality, it raises a far more serious question: what happens when the justice system becomes less visible? For families navigating private law disputes, safeguarding allegations and prolonged delay, transparency is not a political slogan — it is the difference between understanding how the system works and feeling powerless within it.
Key points (read this first)
- “Open justice” is not a vibe. It is a constitutional principle: the public must be able to see justice being done — in practice, not just in theory.
- The Courtsdesk database mattered because it made magistrates’ court activity discoverable at scale — across regions, trends and time — in a way ordinary listings often do not.
- The MoJ/HMCTS position has centred on data protection and alleged unauthorised sharing with an AI third party (including potentially sensitive identifiers). That is a serious issue — but it doesn’t automatically justify a “delete the archive” outcome.
- There is now a live policy tension: privacy compliance vs public scrutiny. The correct answer is not to pick one. It is to design lawful access with safeguards.
- AI changes the stakes. It can expose systemic court failures (delays, inconsistency, outcomes), but it can also amplify privacy harm if governance is weak.
- What to watch next: licensing frameworks, official listing portals, retention/archiving rules, and whether any independent oversight is built into the “new” regime.
If you only have 60 seconds: the question isn’t “should court data exist?” — it’s “who controls access, under what rules, with what accountability?”
When Court Data Disappears: Courtsdesk, the MoJ Deletion Order, and What “Open Justice” Means in the AI Age
By Jessica Susan Hill | Legal Consultant & McKenzie Friend | JSH Law Ltd
In February 2026, a story surfaced that should make every lawyer, journalist and court-user sit up: the Ministry of Justice (via HMCTS) instructed a private platform, Courtsdesk, to delete what was widely described as the UK’s largest archive of court reporting data. The dispute was framed as a data protection breach involving AI. Critics called it a major blow to open justice.
This isn’t a niche media row. It’s a governance problem with a constitutional wrapper. Because once court information becomes searchable at scale, it becomes auditable. And once the system becomes auditable, it becomes accountable.
1) What happened — and why the link you saw may have “stopped working”
If you clicked a share link to a paywalled newspaper article, you may have hit a login wall rather than the story itself. But the underlying issue is very real: in early-to-mid February 2026, multiple sources reported that the MoJ/HMCTS instructed Courtsdesk to remove court listing/archival data from its platform. The matter was then debated in Parliament, with ministers stating that action was taken because of data protection concerns and alleged unauthorised sharing with an AI company.
In the House of Commons debate on 10 February 2026, the government position was put bluntly: HMCTS stopped sharing data and instructed the company to remove data from its digital platform because the government considered personal data had been put at risk and/or shared in breach of agreement. (Hansard: “Court Reporting Data”). Read the Commons debate (Hansard).
The House of Lords revisited similar themes on 11 February 2026, referencing alleged sharing of “private, personal and legally sensitive information” with a third-party AI company, including potentially addresses and dates of birth of defendants and victims. Read the Lords debate (Hansard).
Meanwhile, journalist bodies and open justice advocates argued that the deletion demand would reduce practical visibility of magistrates’ courts — the engine room of criminal justice — and undermine reporting capacity nationwide. NUJ response (11 Feb 2026).
Subsequent coverage indicated that the government later paused the deletion/purge approach and explored alternative licensing or arrangements, following significant public pressure and campaigning (including within national media). One example: The Times: MoJ halts purge of court archive (published Feb 2026). (Paywalled, but relevant for context and sequence.)
2) What is Courtsdesk — and why journalists cared
Courtsdesk is typically described as a platform that made it easier for journalists to discover and track magistrates’ court hearings — and to keep a searchable archive of what had been listed. The word “archive” matters. Without it, reporting becomes a daily scramble: you can see “today’s” list (sometimes), but you cannot easily analyse what happened across a month, a year, or a decade, and you cannot robustly check what patterns repeat across courts.
That changes the reporting model. Instead of “we got a tip and attended a hearing”, journalists can ask structured questions like:
- Which courts are repeatedly listing the same offence type and outcome?
- Are there geographical disparities in sentencing outcomes (controlling for offence and prior record)?
- Is a particular safeguarding issue rising (domestic abuse, coercive control, breaches, stalking)?
- Are certain hearings routinely not listed, listed late, or listed inaccurately?
- Are “open” hearings being effectively closed by practical invisibility?
In short: a discoverable, searchable dataset turns open justice into something measurable. That is precisely why both open justice advocates and public interest reporters reacted so strongly.
For a short overview of the controversy as reported at the time: Legal Cheek (11 Feb 2026). For a more analytical legal-media perspective: Wiggin LLP commentary (16 Feb 2026).
3) The MoJ/HMCTS case: “data protection” and alleged sharing with AI
The government’s public position, as reflected in parliamentary statements, has been that data protection responsibilities were engaged. The allegation was not merely that the data existed, but that data was used or shared in a way that was not authorised by the relevant agreement — and that the information at issue could include sensitive personal identifiers.
In the Commons debate, MPs referenced the passing of information to an AI company, including addresses and dates of birth. You can read the relevant passages directly in Hansard: Court Reporting Data (Commons, 10 Feb 2026). The Lords debate similarly framed the core concern as sharing private/personal legally sensitive information with a third-party AI company: Court Reporting Data (Lords, 11 Feb 2026).
Let’s be clear: if victim or defendant identifiers were exposed or processed without a lawful basis, proper security, or appropriate contractual control, that is not a minor technicality. UK GDPR compliance is not optional — particularly where data could create direct risk (victim location, stalking risk, retaliation, intimidation, vigilante harm).
But there is a second question — and this is where policy and constitutional principles collide: even if a breach occurred, does the proportionate remedy have to be “delete the archive”? Or is the correct remedy:
- Stop the unauthorised processing,
- Investigate,
- Implement governance, redaction, licensing and audit controls,
- And preserve the public-interest value of the dataset?
In other regulated sectors, “burn the library” is rarely considered an intelligent response to a governance failure. You fix governance. You don’t erase institutional memory.
4) What “open justice” actually requires (and what it doesn’t)
“Open justice” is often described as a constitutional principle in common law: justice must be administered in public, with reporting permitted, because scrutiny is a safeguard against arbitrariness and abuse. It supports legitimacy and public confidence.
But open justice is not absolute. Courts can restrict reporting, anonymise parties, hold parts of hearings in private, or impose reporting restrictions where necessary and proportionate — especially to protect children, victims, national security, or the integrity of proceedings.
Here’s the practical point: open justice collapses when information is technically “available” but realistically undiscoverable. If court lists are incomplete, delayed, inaccurate, scattered, or accessible only through relationships and workarounds, then public scrutiny becomes selective and fragile.
A searchable archive changes the baseline. It doesn’t guarantee perfect scrutiny, but it makes scrutiny possible at scale.
The NUJ response captures the concern in direct terms: the state must take data protection seriously, but journalists are worried about the effect on their ability to do their job. NUJ: deletion order response.
5) The real issue: discoverability, not secrecy
Most people misunderstand how court reporting works. They think journalists can simply “look up” what is happening in court.
In practice, magistrates’ courts are high-volume. Hearings move. Lists change. Data may be published late, inconsistently, or in formats that are difficult to search. Court staff are under pressure. Press offices (where they exist) are stretched. The result is that what is formally “public” can become practically opaque.
So when people say “this undermines open justice,” they may not mean “the government is hiding a single case.” They mean: remove the infrastructure of discoverability and you reduce systemic scrutiny.
The wider concern is that when the system is not audited at scale, dysfunctional patterns persist:
- Overlisting and adjournment churn;
- Chronic delay;
- Inconsistent listing practices;
- Variable use of reporting restrictions;
- Localised cultures that drift without challenge.
This is where AI becomes relevant — not as hype, but as a tool. AI is exceptionally good at extracting patterns from messy, fragmented data. And patterns are exactly what the justice system needs to be forced to confront.
6) AI: the uncomfortable accelerator of accountability
Here is the uncomfortable truth: AI makes “open justice” more powerful, because it can transform raw listings and outcomes into insight:
- Where are outcomes diverging without explanation?
- Which courts are systematically underperforming on timeliness?
- Which offence types are rising or falling?
- Do bail decisions correlate with geography in ways that look unjustified?
- Are certain safeguarding concerns being deprioritised?
For the public, this can mean better scrutiny and informed reform. For institutions, it can feel like a loss of narrative control.
But AI also increases privacy risk. Aggregation is a form of power: data that is safe in one context can become dangerous in another when combined, enriched, or made searchable. That is why governance matters.
The question is not “AI or no AI.” It is: who is allowed to process court data with AI, under what licence, with what redaction, with what audit trail, and with what sanctions for misuse?
7) Data protection and open justice can coexist — if you design for both
If there was an unauthorised transfer of personal data to a third-party AI provider, that needs to be addressed. Strongly. But the correct fix is not necessarily deletion. The correct fix is a governance framework that takes seriously both:
- Lawful processing and security (UK GDPR; DPA 2018; contractual controls; access logs; DPIAs); and
- Open justice functions (discoverability; auditability; press access; public interest research).
A mature framework would include:
(A) Role-based access
Not everyone needs the same level of detail. A press-accredited journalist may need more than the general public. An academic researcher may need a structured dataset but not identifiers. The safe model is tiered access with clear rules.
(B) Default minimisation and redaction
Listings can be published in a way that is still meaningful but reduces harm: names may be necessary for open justice in many cases, but addresses and dates of birth generally aren’t. A “privacy by design” listing format is possible.
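As an illustration only, "privacy by design" can be as simple as stripping high-risk identifiers from a listing record before publication. The field names below are hypothetical assumptions for the sketch, not an actual HMCTS schema:

```python
# Illustrative sketch: minimising a court listing record before publication.
# Field names are hypothetical; a real listing schema would differ.

# Fields treated as high-risk and removed from the public view by default.
# (victim_name is listed for completeness even if absent from a given record.)
HIGH_RISK_FIELDS = {"address", "date_of_birth", "victim_name"}

def minimise_listing(record: dict) -> dict:
    """Return a copy of a listing record with high-risk identifiers removed."""
    return {k: v for k, v in record.items() if k not in HIGH_RISK_FIELDS}

raw = {
    "court": "Westminster Magistrates' Court",
    "defendant_name": "A. Example",
    "date_of_birth": "1990-01-01",
    "address": "1 Example Street",
    "offence": "Theft",
    "hearing_date": "2026-02-10",
}

public = minimise_listing(raw)
# 'public' keeps court, defendant_name, offence and hearing_date,
# but not address or date_of_birth.
```

The point of the sketch is that minimisation is a publication-pipeline decision, not a reason to withhold the listing altogether.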
(C) Contractual control over processors
If AI tools are used, the relationship between controller and processor must be contractually controlled, audited, and limited. “Testing” is still processing. “Internal development” is still processing.
(D) Audit logs and sanctions
If a platform is given access to sensitive data, there must be a reliable audit trail and enforceable consequences for misuse.
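A reliable audit trail does not require exotic technology. One common pattern is an append-only log where each entry is chained to the hash of the previous one, so retrospective tampering changes every later hash. The sketch below is a minimal illustration of that idea, not a production design:

```python
import hashlib
import json

def append_entry(log: list, entry: dict) -> list:
    """Append an access-log entry, chaining it to the previous entry's hash
    so that editing an earlier entry invalidates every later hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(entry, sort_keys=True)
    entry = dict(entry, prev=prev_hash,
                 hash=hashlib.sha256((prev_hash + payload).encode()).hexdigest())
    log.append(entry)
    return log

log = []
append_entry(log, {"user": "press-001", "action": "search", "when": "2026-02-10T09:00Z"})
append_entry(log, {"user": "press-001", "action": "export", "when": "2026-02-10T09:05Z"})
# Each entry's 'prev' field matches the hash of the entry before it.
```

Pair a log like this with enforceable contractual sanctions and you have the "reliable audit trail and enforceable consequences" pairing in practice.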
This is the kind of approach the state should model. It’s what we demand of the private sector. The justice system should not be a governance laggard.
8) “Just use official channels” is not a sufficient answer
One argument raised in public discussion is that journalists can still access listings through official HMCTS channels, so the deletion of a private archive is not fatal.
Here’s the hard reality: official availability does not necessarily equal practical usability. The difference between:
- a fragmented set of daily lists, and
- a searchable, longitudinal archive
is the difference between “seeing a hearing” and “auditing a system”.
It’s the audit function that scares people — and it’s the audit function that reform needs.
For contemporaneous legal-sector analysis and a timeline-style overview, see: Wiggin LLP commentary.
9) The proportionality question: why “delete it” feels extreme
When government acts, it must act proportionately — especially when its actions collide with constitutional principles.
If the problem was a specific breach, a proportionate response normally looks like:
- Stop the unlawful processing immediately;
- Preserve evidence;
- Investigate scope and impact;
- Notify where legally required;
- Fix governance;
- Implement redaction and access controls;
- Resume service under a compliant licence.
Deleting a historic archive can be justified in certain cases — for example, if the archive itself is irredeemably unsafe and cannot be lawfully held. But that is a high threshold. And if that threshold is met, the next question is: why was the data shared in that form in the first place, and why was it not already governed appropriately?
Open justice is a public asset. When you destroy an archive that underpins scrutiny, you don’t merely “solve” a compliance problem — you erase a public accountability mechanism.
10) What this means for litigants, victims and the public
This is not only about journalists. It touches:
Victims and vulnerable witnesses
Privacy matters. Safety matters. If addresses/DoBs are handled recklessly, it can cause real-world harm. A governance regime must centre safeguarding and risk. The state is right to be strict about that.
Defendants
Defendants have rights too. Public identification can be lawful and appropriate in open court, but bulk data aggregation can create long-tail harm (employment, housing, vigilantism), particularly where cases end in acquittal or discontinuance. This is why minimisation and careful retention rules matter.
The public
The public interest in open justice is not abstract. It includes the ability to scrutinise how domestic abuse is treated, how repeat offenders are sentenced, how grooming cases are prosecuted, and whether systemic failures are being ignored.
The debate is often framed as “privacy vs transparency.” A better framing is: “privacy and transparency with engineering-grade governance.”
11) A practical blueprint for a lawful court data ecosystem
If we want open justice that survives the AI era, we need to stop improvising and start designing. Here is a blueprint that would satisfy most of the legitimate concerns on all sides:
- Define a canonical “public listing dataset” with minimised fields (no addresses; no full DoB; protect victims by default where appropriate).
- Publish in a consistent, machine-readable format so that “discoverability” is not dependent on private scraping or informal relationships.
- Implement a press and research licence with tiered access, clear contractual controls, audit logs, and enforcement.
- Create a secure research environment (think “data safe haven”) where higher-sensitivity data can be used for public-interest research under supervision.
- Mandate DPIAs for any new processing at scale, including any AI model training or automated analytics.
- Independent oversight: an external advisory panel including press, victims’ advocates, privacy experts and court users.
If you work in legal ops, you’ll recognise this: it is the same control architecture we use for health data, financial data, and regulated client data. The justice system deserves no less.
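The tiered-access idea in the blueprint can be expressed as role-based views over a single canonical record. The roles and field sets below are illustrative assumptions, not a proposed official scheme:

```python
# Illustrative role-based field views over a canonical listing record.
# Role names and field sets are hypothetical, for demonstration only.
FIELD_TIERS = {
    "public":     {"court", "offence", "hearing_date"},
    "press":      {"court", "offence", "hearing_date", "defendant_name"},
    "researcher": {"court", "offence", "hearing_date", "case_id"},  # structured, no identifiers
}

def view_for(role: str, record: dict) -> dict:
    """Project a canonical record down to the fields a role may see.
    Unknown roles fall back to the public tier."""
    allowed = FIELD_TIERS.get(role, FIELD_TIERS["public"])
    return {k: v for k, v in record.items() if k in allowed}

canonical = {
    "case_id": "C-2026-0001",
    "court": "Westminster Magistrates' Court",
    "defendant_name": "A. Example",
    "offence": "Theft",
    "hearing_date": "2026-02-10",
    "address": "1 Example Street",  # never leaves the secure environment
}
# view_for("public", canonical) contains no name, no address, no case identifier;
# view_for("press", canonical) adds the defendant's name but still no address.
```

The design choice is that no tier, not even press, receives the raw canonical record: higher-sensitivity fields stay inside the secure research environment described above.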
12) What you can do if you care about this
- Read the parliamentary record and compare the stated rationale with the real-world impact: Commons Hansard (10 Feb 2026) and Lords Hansard (11 Feb 2026).
- Track journalist-body positions (NUJ is a good start): NUJ statement.
- Ask the right question of policymakers: “What is the new lawful access model — and who is responsible for ensuring discoverability in practice?”
- Watch for licensing/market engagement notices and consultation opportunities. (Legal commentary sites often summarise these quickly.)
- If you are a court user or practitioner, keep records. Transparency is partly built from bottom-up documentation — hearing notices, listings, orders, reasons, and procedural history.
Because here is the punchline: if the system cannot be seen, it cannot be improved. And if it cannot be improved, it cannot be trusted.
Sources and further reading
- UK Parliament (Hansard) — Commons debate, 10 Feb 2026: Court Reporting Data
- UK Parliament (Hansard) — Lords debate, 11 Feb 2026: Court Reporting Data
- National Union of Journalists (NUJ), 11 Feb 2026: NUJ responds to deletion order
- Wiggin LLP commentary, 16 Feb 2026: Open Justice: MoJ closes court reporting archive
- Legal Cheek, 11 Feb 2026: MoJ orders deletion of court reporting database
- The Times (paywalled), Feb 2026: MoJ halts purge of court archive