Seven key insights: Lord Justice Birss considers AI in civil justice
Matthew Lee sets out what expert witnesses and housing lawyers must know from the views on AI expressed by the senior judge who is to become Chancellor of the High Court later this year.
“...I will explain it by reference to the judicial guidance we have promulgated on AI. This emphasises that individual judges take full personal responsibility for what goes out in their name. So, just like a Google search on the internet, AI also can be handy for reminding you of something you knew but had forgotten, but also just like googling, AI is a poor and unsafe tool for finding the answer to a question you do not yourself know and cannot recognise as correct...”
Lord Justice Birss
Introduction
This has been a big week for AI in civil justice. Yesterday, I addressed some key IP cases, but two more notable developments have emerged, both involving Lord Justice Birss, a figure regular readers will know I frequently highlight for his enthusiasm and expertise in AI and technology. The first development concerns judicial appointments, and the second covers intriguing advancements in the use of technology and AI in litigation, particularly regarding expert evidence.
Judicial appointments
The judiciary has announced significant new appointments to the Court of Appeal, confirming the elevation of six experienced High Court judges. Mrs Justice Cockerill, Mr Justice Dove, Mr Justice Foxton, Mrs Justice May, Mr Justice Miles, and Mrs Justice Yip have all been appointed as Lord and Lady Justices of Appeal.
In addition, and particularly relevant to the themes regularly explored on this blog, Lord Justice Birss was appointed Chancellor of the High Court, effective from 1 November 2025. He succeeds Sir Julian Flaux, who will retire from the role. The announcement further explains his role in AI in civil justice:
“He will bring a wealth of judicial experience to the position, including the valuable contributions that he has made since becoming Deputy Head of Civil Justice and Lead Judge for Artificial Intelligence.”
Lord Justice Birss’s previous insights on AI
Regular readers of my blog will recall Lord Justice Birss’s previous insights on AI in civil justice and other areas of law, which can be read here:
- AI Judicial Assistants and AI Judges? Observations by Lord Justice Birss
- Digital Justice: Can AI Transform the County Courts of England and Wales?
Lord Justice Birss’s earlier observations on AI Judges and Judicial Assistants have prompted considerable discussion among those interested in AI, and I often cite the passage below as an interesting essay topic.
“Looking further into the future, one could imagine that AI may very well be able to assimilate much larger quantities of data than a normal human judge. One could then be faced with the situation in which an AI system might be a better decision maker than a human being in those circumstances. I should make clear that I do not believe we’re anywhere near that yet, but from what I read in the literature of the capabilities of AI, I would not like to bet against the idea that this capability will arrive in a not-too-distant future.”
Lord Justice Birss’s more recent observations
On 20 June 2025, Lord Justice Birss delivered a closing keynote titled “Expert Witnesses – Vital Participants in Civil Justice” at the Expert Witness Institute (EWI) Annual Conference. Drawing on his substantial experience, first as a barrister in the Patents Court, then as a judge dealing with personal injury and clinical negligence, he emphasised the crucial role expert witnesses play in helping courts navigate complex matters.
He highlighted the importance of Civil Procedure Rules (CPR) 35.1 and 35.3, rules designed to keep expert evidence essential and objective. He also addressed common challenges around managing costs, complexity, and effective scrutiny of expert evidence. These are crucial issues for anyone dealing with expert evidence, and I would suggest that those practising in areas heavily focused on expert evidence read the keynote in full.
However, it’s his points on AI and technology that stood out to me. There were at least seven important observations.
1. What happens when judges/experts use AI in Civil Justice or otherwise?
Judges bear personal responsibility for content issued in their names:
“…I will explain it by reference to the judicial guidance we have promulgated on AI. This emphasises that individual judges take full personal responsibility for what goes out in their name. So, just like a Google search on the internet, AI also can be handy for reminding you of something you knew but had forgotten, but also just like googling, AI is a poor and unsafe tool for finding the answer to a question you do not yourself know and cannot recognise as correct…”
2. No excuse to blame AI
As we have seen in the recent analysis of hallucination cases, blaming AI for a mistake is not going to be a valid excuse:
“…We all make mistakes but blaming AI for a mistake in your expert’s report is no excuse…”
3. AI already established in disclosure. Can humans realistically grapple with the data sets?
The task of going through documents in disclosure is becoming increasingly difficult. We may be approaching the point where it is no longer realistic to expect humans to carry out this task without assistance:
“An area of legal practice in which AI is already well established – and judicially endorsed – is in the disclosure process for very large data sets. So called Technology Assisted Review (TAR) has been around for a decade at least. Using TAR, experienced lawyers train the machine learning model by example, by identifying a small set of relevant documents, then the machine trawls through the whole data set with the aim of sorting out the wheat from the chaff. I mention this to make two points.
One, it is an example to show that used properly, AI has significant potential. The other is to highlight the point that the volumes of material which we have to grapple with in civil cases is only going in one direction. What with email, messaging systems like WhatsApp, AI driven transcripts of Teams meetings and the like, we are close to the point that it is not realistic to expect human beings to grapple with these large data sets unaided. We may even have passed it.”
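For readers unfamiliar with how TAR actually works under the bonnet, the workflow Birss LJ describes (lawyers label a small seed set, the machine learns what “relevant” looks like, then ranks the whole collection) can be sketched in a few lines. This is a deliberately minimal, dependency-free illustration: the documents are invented, and real TAR platforms use far more sophisticated models than the simple term-frequency centroid used here.

```python
# Minimal sketch of a TAR-style relevance ranking: score each document in
# the collection against the "centroid" of a lawyer-reviewed seed set of
# relevant documents. Illustrative only; all documents are invented.
from collections import Counter
from math import sqrt

def vectorise(text):
    """Bag-of-words term-frequency vector for a document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_by_relevance(seed_relevant, collection):
    """Rank the collection by similarity to the relevant seed documents."""
    centroid = Counter()
    for doc in seed_relevant:
        centroid.update(vectorise(doc))
    scored = [(doc, cosine(vectorise(doc), centroid)) for doc in collection]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Seed set: documents a lawyer has reviewed and marked relevant.
seed = [
    "email discussing the disputed licence agreement",
    "message about licence royalty payments",
]
# The wider collection the machine trawls, sorting wheat from chaff.
collection = [
    "draft licence agreement with royalty schedule",
    "invitation to the office summer party",
]

for doc, score in rank_by_relevance(seed, collection):
    print(f"{score:.2f}  {doc}")
```

In practice this loop is iterative: the highest-ranked unreviewed documents are fed back to the lawyers for labelling, and the model is retrained, which is why the small seed set can steer review of a very large data set.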
4. Does AI use by experts need disclosure in Civil Justice or otherwise?
I recently outlined some international cases on AI in civil justice and expert evidence, giving my thoughts in conclusion, so I was particularly interested to read the following.
“So when one is asked to give an opinion as an expert based on the material in a case, it may be necessary, not to say inevitable to use modern technology to identify the particular documents to grapple with. Does that use of AI need to be disclosed? Well if that really is all that the technology is being used for – as a tool to fish out potentially relevant material from disclosure already given by the other party – then the answer is no. No more than one would laboriously explain that a word search was carried out to find something. But of course, if a point was advanced which made the method of searching itself relevant – such as a statement that these are the only documents in the collection which are important – then the technique used would matter and would need to be disclosed. That is, I hope, a statement of the obvious.”
5. Guidance on AI in court documents
On the need for clear guidelines around using AI in court documents:
“…But just as we have judicial guidance, so there might be need for similar guidance for experts. And to come back to the CJC for a moment, we also have a separate strand of work now underway looking at whether there need to be rules relating to the use of AI in the preparation of court documents – like pleadings, witness statements and expert’s reports…”
6. Online Procedure Rule Committee (OPRC)
Lord Justice Birss announced the establishment of the OPRC, aiming to improve justice accessibility through digital systems, starting notably in property and possession cases.
“Now finally I want to mention two recent developments which you might wish to know about and contribute to. First, we now have the Online Procedure Rule Committee. It has a broad remit, potentially across the whole of the civil, family and tribunals justice systems, to improve access to justice in that sphere by harnessing the potential of digital systems.
The first SI has been passed which gives the OPRC a specific remit in property and possession – and work is underway with HMCTS on the new digital system for these cases – but work is also going on to look at how to transform the space before cases come to court, by having a set of overarching principles and promoting interoperability of the systems in that sphere by the use of data standards. I would expect expert witnesses to have a role in that area – not just in the context of expert determinations but perhaps in other ways too. We know for example that pre-action medical reports play a key role in personal injury cases.”
7. Pilot for digital access to court documents
A further important development for AI and civil litigation:
“And there is to be a new pilot – hopefully starting in certain Business and Property Courts in the Rolls Building, such as the Commercial Court – in which a new approach to transparency of certain documents is taken. You are aware that many documents such as experts’ reports can be made available to the public once deployed in court – albeit the process in the CPR is relatively cumbersome. The idea of this pilot – subject to ministerial approval as all rule changes are – is to use digital technology to make some of these documents more readily available – thereby improving the transparency of the proceedings, which is a fundamental of our justice system. It will be very interesting to see how that develops. So there is a lot going on and, as I said at the start, you – expert witnesses – are indeed vital participants in the process of justice. I am looking forward to working with you in future on some of these projects.”
Comment
Several points raised here warrant further exploration, but two particularly stand out and deserve dedicated blog posts.
Firstly, the developments involving the Online Procedure Rule Committee (OPRC) in property and possession claims caught my attention. Given my professional experience in these areas, I’m especially interested to understand exactly how these initiatives will enhance access to justice for litigants. I’m also keen to explore precisely how this might integrate AI in civil justice (if at all).
Secondly, the ongoing consideration of rules governing the use of AI in preparing court documents, such as pleadings, witness statements, and expert reports, is intriguing. Those who work with me know that I am not a fan of the written witness statement (in most contexts) and I’m worried AI will only make matters worse, but I’ll explore that further in a dedicated post, hopefully later this week.
For now, it’s worth reflecting on whether judicial opinion is shifting from a relatively permissive stance on AI to a more cautious approach. This point is especially relevant in light of earlier comments by the Master of the Rolls, which readers might recall from a previous post. At that time, the Master of the Rolls compared the approach in England and Wales with that in New South Wales, Australia, observing:
“It will be interesting to see how that more restrictive approach in New South Wales works out as compared to our approach. I would comment, though, that AI is already being used in many jurisdictions for some of the purposes that the NSW guidance says it should not be. I doubt we will be able to turn back the tide. Our guidance is within the grain of current usage, making clear that lawyers are 100% responsible for all their output, AI-generated or not.”
Given recent developments around AI hallucinations and the explicit warnings issued to legal professionals and regulatory bodies, it would be valuable to know whether the Judiciary is revisiting its permissive approach. Should we, as a jurisdiction, be considering a shift towards the more restrictive Australian (NSW) model? Personally, I feel such a shift would be unfortunate. However, I acknowledge the validity of concerns raised internationally regarding AI hallucinations. It may be that urgent steps are indeed required, but I do think the profession is taking careful note of the judicial warnings and people are certainly more alive to the risks.
I look forward to discussing these points in more detail. Do reach out if you have any interesting observations on these topics.
Matthew Lee is a barrister at Doughty Street Chambers.
This article first appeared on Matthew’s Natural and Artificial Intelligence in Law blog. It is part of his broader legal commentary, available through his Substack newsletter. Please subscribe here.