The Role of AI in Litigation Law Firms
AI in litigation law firms is rapidly transforming how legal services are delivered. However, the 2025 case of Mazur & Stuart v Charles Russell Speechlys LLP highlights critical limits on AI use, especially in relation to the conduct of litigation and legal authorisation.
As artificial intelligence continues to reshape the legal profession, law firms must clearly define where AI assistance ends and formal legal responsibility begins. This distinction is essential to ensure compliance, accountability, and professional integrity.
1. The Case in Context
In this case, the High Court clarified a fundamental principle under the Legal Services Act 2007 (LSA):
being an employee of an SRA-authorised firm—or working under the supervision of an authorised solicitor—does not, by itself, entitle a person to conduct litigation.
The facts were straightforward. A debt claim was issued on behalf of the firm Charles Russell Speechlys LLP, but significant steps in the litigation were taken by a non-authorised employee without a practising certificate. The lower court initially accepted the arrangement and ordered costs against the defendants. However, on appeal, Sheldon J reversed this view, holding that:
- Section 21(3) of the LSA defines who is regulated, not who is authorised to conduct reserved activities.
- Only authorised or exempt persons (under sections 18 and 19, read with Schedule 3) may conduct litigation.
- Supervision is not a substitute for authorisation.
- The costs order (£10,653) was quashed, reaffirming the strict boundaries of fixed-costs rules in the Civil Procedure Rules (CPR Part 45).
2. Why This Matters for AI-Driven Law Firms
For law firms embracing automation, this case carries a clear warning:
AI systems are not “authorised persons” under the LSA.
That means any AI process that files, serves, or signs pleadings without an authorised solicitor’s approval may expose the firm to liability under sections 14 and 16 of the LSA, which criminalise unauthorised practice and employer complicity.
Simply having “AI supervised by a solicitor” is not enough. The authorised solicitor must personally approve, sign, and take accountability for every formal litigation step. AI may assist in drafting, research, or workflow management—but cannot replace human legal responsibility.
3. Embedding Governance into AI Workflows
To remain compliant while benefiting from AI, firms should embed strong governance mechanisms into their operations. A simple but effective governance checklist might include:
A. Policy Design
Publish a clear internal policy that:
- states that AI tools and non-authorised staff may assist with, but may not conduct, litigation;
- defines what counts as support (e.g., research, drafting, bundling) and what counts as conduct (e.g., issuing proceedings, signing statements of truth).
B. Role Assignment (RACI Model)
Every litigation task should have an authorised solicitor marked as both Responsible and Accountable.
AI systems and support staff should hold only supporting roles (Consulted or Informed, in standard RACI terms).
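The role-assignment rule can be enforced programmatically. The sketch below is a minimal, hypothetical illustration (none of these names refer to a real firm system): a task's Responsible and Accountable roles are validated against authorisation status before any work is routed.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Actor:
    name: str
    is_authorised_solicitor: bool  # holds a practising certificate / SRA authorisation

@dataclass
class LitigationTask:
    description: str
    responsible: Actor            # R: carries out and signs off the step
    accountable: Actor            # A: answerable for the outcome
    support: list = field(default_factory=list)  # AI tools / paralegals assisting only

    def validate(self) -> None:
        """Reject any task whose R or A role is not an authorised solicitor."""
        for role, actor in (("Responsible", self.responsible),
                            ("Accountable", self.accountable)):
            if not actor.is_authorised_solicitor:
                raise ValueError(
                    f"{role} for '{self.description}' must be an "
                    f"authorised solicitor, not {actor.name}")

solicitor = Actor("A. Solicitor", True)
ai_tool = Actor("drafting-LLM", False)

task = LitigationTask("Issue claim form", solicitor, solicitor, [ai_tool])
task.validate()  # passes: an authorised solicitor holds R and A

bad = LitigationTask("Issue claim form", ai_tool, solicitor)
# bad.validate() raises ValueError: an AI tool cannot be Responsible
```

The point of failing loudly at assignment time, rather than at filing time, is that the Mazur problem arose from steps already taken by a non-authorised person; a hard validation gate prevents the workflow from starting on the wrong footing.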
C. Process Controls
- Restrict e-filing and document submission rights to authorised solicitors.
- Require digital sign-off prompts confirming authorisation before any formal submission.
- Prevent AI from directly triggering submission or signature endpoints.
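One way to implement these controls is to place a mandatory sign-off gate between any drafting system and the e-filing endpoint. The following is a simplified sketch under assumed interfaces (the class and method names are illustrative, not a real e-filing API):

```python
import datetime

class UnauthorisedFilingError(RuntimeError):
    pass

class FilingGate:
    """Hypothetical control layer: the submission endpoint is reachable
    only through this gate, which demands a solicitor's sign-off first."""

    def __init__(self):
        self._signoffs = {}  # document_id -> (solicitor, timestamp)

    def record_signoff(self, document_id: str, solicitor: str,
                       has_practising_certificate: bool) -> None:
        """Only an authorised solicitor may approve a document for filing."""
        if not has_practising_certificate:
            raise UnauthorisedFilingError(
                f"{solicitor} is not authorised to approve filings")
        self._signoffs[document_id] = (
            solicitor, datetime.datetime.now(datetime.timezone.utc))

    def submit(self, document_id: str) -> str:
        """Refuse submission unless a sign-off is on record."""
        if document_id not in self._signoffs:
            raise UnauthorisedFilingError(
                f"No authorised sign-off recorded for {document_id}")
        solicitor, _ = self._signoffs[document_id]
        # Only at this point would the real e-filing call be made.
        return f"{document_id} filed under authority of {solicitor}"
```

In use, an AI pipeline would hold a reference only to the gate, never to the underlying submission endpoint, so "prevent AI from directly triggering submission" becomes a structural property of the system rather than a matter of policy alone.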
D. Audit and Evidence
Maintain detailed logs of:
- drafts, approvals, and edits;
- who signed off and when;
- submission IDs for traceability.
Such evidence may be reviewed by COLP/COFA if questions of compliance arise.
E. Training and Communication
Train all team members on the “green” and “red” boundaries:
- Green (Permitted): AI drafting, summarising disclosure, preparing bundles, scheduling, or research.
- Red (Prohibited): Filing or serving claims, signing statements of truth, submitting formal applications, or making concessions.
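The green/red boundary can also be expressed as a default-deny allowlist inside any AI orchestration layer. This is a hedged sketch with illustrative action names, not a definitive taxonomy:

```python
# Hypothetical classification of AI-initiated actions against the
# firm's "green/red" policy; the action names are illustrative.
GREEN_ACTIONS = {
    "draft_document", "summarise_disclosure",
    "prepare_bundle", "schedule_task", "legal_research",
}
RED_ACTIONS = {
    "file_claim", "serve_claim", "sign_statement_of_truth",
    "submit_application", "make_concession",
}

def check_ai_action(action: str) -> str:
    """Return 'permitted' for green actions; raise for red or unknown ones.

    Unknown actions are treated as red: default-deny is the safer
    posture when the boundary is a question of legal authorisation."""
    if action in GREEN_ACTIONS:
        return "permitted"
    raise PermissionError(
        f"'{action}' requires an authorised solicitor and cannot be "
        f"performed by an AI system")
```

Treating anything not explicitly green as red mirrors the training message above: staff and systems should never infer permission from silence in the policy.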
All client-facing AI chat interfaces should clearly state:
“AI Assistant – not a solicitor. Legal advice is provided only by authorised professionals.”
4. The Broader Implications
This decision underscores the human-in-the-loop principle of responsible AI adoption.
It demands that firms build trust through accountability, not through delegation to algorithms.
When integrated properly, AI can increase efficiency, improve accuracy, and free solicitors from repetitive tasks. But the legal authority to act in court or before tribunals will always rest with qualified human practitioners—not automated systems.
Conclusion
The Mazur & Stuart ruling reaffirms that technology cannot replace legal authorisation.
AI can streamline, support, and enhance workflows—but it cannot conduct litigation, file documents, or make legal decisions without a human lawyer’s signature and oversight.
Law firms that design governance around these principles—balancing innovation with compliance—will be best positioned to harness AI’s power responsibly and lawfully.
References
- Legal Services Act 2007, ss.12–19, 21; Schedule 2 & 3.
- R v AUH [2023] EWCA Crim 6; Baxter v Doble [2023] EWHC 486 (KB).
- Civil Procedure Rules, Part 45, Sections VII–IX.
- Mazur & Stuart v Charles Russell Speechlys LLP (Sheldon J, EWHC (KB), 16 September 2025).
© 2026 Nexter AI Group. All Rights Reserved.
