
Artificial intelligence

Why lawyers still matter in the age of AI

Slater and Gordon’s family legal team explore the case of R (Ayinde) v The London Borough of Haringey and the implications of AI misuse in the case’s legal proceedings.


27 May 2025

Gradually, an increasing number of clients are choosing to handle aspects of their case themselves, whether for practical or financial reasons. Some look to minimise costs, while others want more control over the legal process.

With AI tools offering quick answers and “remote legal assistance”, it is now easier than ever to feel confident navigating legal issues alone, without the need for a lawyer.

However, the recent High Court case of R (Ayinde) v The London Borough of Haringey shows how things can quickly unravel when legal processes are not followed properly.

This case reminds us that while technology is a useful tool, it cannot replace the guidance, judgment and responsibility that come with qualified legal representation. On reflection, the case holds valuable lessons for both clients and practitioners on the limitations of AI in legal proceedings.

What was the case in R (Ayinde) v The London Borough of Haringey?

The case of Ayinde began as a judicial review claim but ended with a High Court judge raising serious concerns about reliance on AI-generated responses in legal proceedings and referring the matter to professional regulators.

During the proceedings, submissions prepared by a barrister, instructed by Haringey Law Centre, cited five legal authorities. But the Defendant flagged something erroneous: none of the five cases appeared to exist. When challenged, the solicitor on record described the citations as “cosmetic errors” that did not require an explanation. The judge called this response “remarkable”.

Mr Justice Ritchie found that, by failing to properly check the authorities, the legal team had knowingly misled the court by including what turned out to be fake case law. He also described it as “appalling professional misbehaviour”.

The Claimant’s legal team suggested that the barrister had mistakenly dragged a file into the bundle from a box of printed cases, or that these were “minor citation errors”. Despite these eyebrow-raising justifications, it was accepted that at least one of the cases was completely fake.

The judge noted that it would be negligent to rely on AI-generated text without verifying its accuracy. What matters here is the failure to check the material, and the team’s response when the issue came to light.

As the judge put it:

“The submission was a good one. The medical evidence was strong. The ground was potentially good. Why put a fake case in?”

Despite the Claimant eventually succeeding in the judicial review, the court reduced their costs award by £7,000 because of the lawyers’ inexcusable conduct. This was an avoidable mistake, with serious professional consequences.

The importance of a qualified solicitor

The case of Ayinde shows that, whatever the merits of a case, the failure to verify AI-generated information damages the credibility of the submissions and risks harming the outcome. Here, any advantage gained was ultimately undermined by the reduction in the costs awarded.

By neglecting to verify AI-generated content, legal professionals not only risk damaging their professional standing but also compromise the integrity of the legal process.

AI can be useful in assisting with legal matters; however, the case emphasises the crucial role that human oversight, through a qualified solicitor, plays in handling complicated legal matters. When AI is used improperly, it can cause monetary losses for both legal professionals and clients.

What are the risks of relying on AI to navigate legal proceedings?

This leads us to the question of whether AI-generated responses can be relied on by clients representing themselves. AI output may seem authoritative and reliable, but without thorough verification it can introduce critical inaccuracies.

For clients attempting to manage cases themselves, the risk is compounded. Failing to verify and understand AI-generated information can lead to flawed arguments, case mismanagement and costs consequences, and can be prejudicial to a party’s case.

In Ayinde, the financial penalty was £7,000, illustrating how reliance on unverified AI can negate any initial cost savings.

This example shows the importance of human legal expertise and knowledge to ensure the accuracy, relevance, and integrity of legal submissions, protecting clients from the repercussions of improper use of AI.

What have we learnt about using AI in legal proceedings?

Whilst AI tools can enhance efficiency, the misuse of AI risks severe consequences for both professionals and clients. Ayinde highlights that human oversight remains critical in navigating legal complexities, reaffirming that technology is a supplement, not a substitute, for skilled legal expertise.

It is important for clients to understand that cost saving should not come at the expense of reliability or of using a qualified legal professional.

Talk to us today

If you need legal guidance, our solicitors at Slater and Gordon are on hand to support you throughout the legal process. As an award-winning national firm, we are proud to have a team of fantastic lawyers with a depth of expertise, enabling you to find a solution efficiently and reliably. For more information, contact our family law team.
