Local Government Lawyer

SharpeEdge

Beatrice Wood explores the impact of automated decision-making on the UK government and local authority sector.

Last week, as per its duty under s3(1) of the Law Commissions Act 1965, the Law Commission (the “Commission”) published its 14th Programme of Law Reform.

This Programme outlined ten new projects it will be focusing on over the next few years. These will sit alongside the seventeen other existing law reform projects that the Law Commission is currently taking forward as recommendations for Parliament to consider.

One project of particular interest to public sector authorities is ‘Public sector automated decision-making’ (“ADM”) – but what does ADM entail exactly? The Commission states that it encompasses decisions made “by or for public bodies using algorithmic processes and artificial intelligence”.

Across the country, ADM is already being used in areas such as policing, social services, child welfare, fraud detection and healthcare. Clearly, its use will only become more prevalent as technological advances continue, and the Commission raises several concerns with the current legislative approach (or arguable lack thereof) to ADM.

Specifically, it highlights that “there is no specific legal framework governing the use of ADM by the state to make decisions affecting the public: public law developed to ensure the accountability of human officials and not automated systems. […] At the same time, judicial review is not well-suited to scrutinising decisions made using ADM.”

Given these concerns, coupled with issues raised in industry commentary and specific case studies, it is no surprise that ADM has been tabled as a focus project for the Commission.

Commentary on reviewable automated decision-making in the public sector

The use of ADM in the public sector has been the subject of extensive commentary for many years. For example, in 2019 Philip Alston, the then UN Special Rapporteur on extreme poverty and human rights, reported on the use of ADM in welfare systems across a number of States.

Expressing concerns over the more demanding and intrusive forms of conditionality that ADM may lend itself to, he found that there had been “a complete reversal of the traditional notion that the State should be accountable to the individual”.[1]

This report particularly raised questions as to whether it is appropriate for areas such as welfare to be left to the design choices of technology developers, highlighting the sometimes rigid tension between optimisation and efficiency on the one hand, and the strict application of the rule of law and the principles of “good government” on the other.

A case for reform: South Wales Police and Automated Facial Recognition

Alongside the concerns raised by legal commentators over the use of ADM in the public sector, the lawfulness of specific instances of ADM use has also been challenged in the courts.

R (Bridges) v CC South Wales[2] : lawfulness of use of live automated facial recognition technology (“AFR”) by South Wales Police

In this case, decided on appeal in 2020, the Court of Appeal found that the use of AFR by South Wales Police was unlawful on several grounds, which led to declaratory relief being granted:

  • Firstly, the use of AFR by South Wales Police engaged Article 8(1) of the ECHR, which protects the right to respect for private life, but was not justified under Article 8(2) due to the lack of a sufficient legal framework. This was largely due to the discretion left to police officers regarding who could be placed on a watchlist and where AFR could be deployed.
  • Further, the Court noted that the Data Protection Impact Assessment prepared by South Wales Police failed to properly assess the risks to the rights and freedoms of data subjects, as required by section 64(3)(b) and (c) of the Data Protection Act 2018.
  • Finally, the judgment states that South Wales Police’s approach to complying with the Public Sector Equality Duty set out in section 149 of the Equality Act 2010 was inadequate, as it did not sufficiently consider the potential for AFR technology to produce discriminatory impacts based on race and gender.

Key implications

The judgment emphasised the need for a clear and sufficient legal framework (with robust safeguards and clear criteria) to govern the use of AFR technology, to ensure that any interference with privacy rights is lawful, necessary, and proportionate.

It also underscored the necessity for public authorities to conduct thorough data protection impact assessments that adequately address risks to privacy and ensure compliance with data protection principles.

The Court also stressed the ongoing duty of public authorities to consider the potential for indirect discrimination and to take reasonable steps to mitigate such risks, particularly in the context of novel technologies.

Do administrative law principles align with a move towards reform in automated decision-making?

Two key administrative law principles align with the argument that ADM could benefit from legal reform:

Principle 1: Consideration of all relevant factors and exclusion of irrelevant factors

The public law case of Anisminic v Foreign Compensation Commission[3] established the principle that decision-makers are required to consider all relevant factors and exclude irrelevant information when making decisions.

However, in algorithmic systems, irrelevant factors used in the commissioning and building stages could pass through to the ultimate decision-making stage.

The use of ‘proxies’ (i.e. substitute variables intended to avoid the direct use of sensitive information) by automated programmes in these earlier stages may therefore be problematic, and potentially unlawful from an administrative law perspective.
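By way of illustration only (a hypothetical sketch, not drawn from any real public sector system), the proxy problem can be shown in a few lines of Python: a scoring rule that never reads a protected attribute directly can still produce different outcomes for different groups where an input such as postcode is correlated with group membership.

```python
# Hypothetical toy data: the scoring rule below never sees the protected
# attribute, only a postcode-based feature that happens to correlate with it.
applicants = [
    # (postcode_area, protected_group, income)
    ("A", "group_1", 30000),
    ("A", "group_1", 32000),
    ("B", "group_2", 31000),
    ("B", "group_2", 29000),
]

# Historic approval rates by postcode area -- the proxy feature.
area_rate = {"A": 0.9, "B": 0.4}


def score(postcode_area: str, income: int) -> float:
    """Toy score using only the proxy and income, never the protected attribute."""
    return 0.5 * area_rate[postcode_area] + 0.5 * min(income / 40000, 1.0)


def approval_rate(group: str) -> float:
    """Share of a group whose score clears a 0.6 approval threshold."""
    group_scores = [score(p, i) for p, g, i in applicants if g == group]
    approved = [s for s in group_scores if s >= 0.6]
    return len(approved) / len(group_scores)


# Despite the protected attribute never being an input, approval rates
# diverge between the groups because postcode stands in for membership.
print(approval_rate("group_1"), approval_rate("group_2"))  # prints: 1.0 0.0
```

In this contrived example, every member of one group is approved and every member of the other is refused, even though the rule is facially neutral. It is this kind of indirect reliance on an irrelevant (or legally impermissible) factor at the design stage that administrative law review may struggle to surface.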

Principle 2: Reviewability of the decision-making process (rather than the decision itself)

Within judicial review, the court’s role is not to step into the shoes of the decision-maker, but to assess the decision-making process that was followed. To do this, courts break the process down into its constituent parts and take a holistic view as to whether that process (including its development, context and outcomes) is in line with public law principles.

In a similar vein, separating an ADM process into its earlier development and later decision-making stages before assessing the rationality of each would be more likely to build public confidence in the reviewability of ADM systems. Further, a more transparent approach to the wider socio-technical processes within which the technology is developed is a key way to increase accountability in this area.

What form might the Commission’s legal reform take, and what developments should public authorities watch out for?

Though the Commission is in the very early stages of research on this topic, there are several areas it may investigate against the backdrop set out below. The following are potential areas to watch out for:

  • Transparency: are your ADM systems subject to sufficient testing and auditing measures? If it became a legal requirement[4], would you be able to disclose information about all stages of an ADM process used, from the system’s procurement through to its training and deployment stages?
  • A move from individual to systemic reviewability: given the re-use of specific algorithms across multiple machine-learning outputs, it is likely that an error or bias highlighted in one output will appear in others at a more systemic level. The Commission may highlight this as an area of concern, and perhaps recommend loosened rules in relation to public law “standing” tests, to allow individuals to seek remedies for systemic ADM mistakes. Public authorities may wish to assess whether, when using ADM systems, they are able to sufficiently identify and address any errors and biases. Do you have the oversight to assess your algorithmic systems as a whole?
  • Limitations on the kinds of data that can be inputted into ADM, and purposes for which ADM can be utilised: as highlighted by both the specific commentary and case study assessed above, there may be data sets and purposes for which ADM-use is more controversial. As such, there may be a suggestion from the Commission to tighten where ADM can be lawfully used within the public sector, and public authorities should watch out for the introduction of increased regulation in this area.

For now, the Government has encouraged public bodies to make use of AI tools.

However, in doing so, they will need to consider and implement the principles for safe and responsible use of these systems, as set out in Government Digital Service: Artificial Intelligence Playbook for the UK Government.

We are on hand to advise and assist public authorities in adapting to this evolving and uncertain area of law. It will be interesting to see how the Commission’s work on this develops over the next few years, and we will keep public authorities updated throughout.

Beatrice Wood is a Junior Associate at Sharpe Pritchard LLP.


For further insight and resources on local government legal issues from Sharpe Pritchard, please visit the SharpeEdge page.

This article is for general awareness only and does not constitute legal or professional advice. The law may have changed since this page was first published. If you would like further advice and assistance in relation to any issue raised in this article, please contact us by telephone or email enquiries@sharpepritchard.co.uk.

[1] 2019 Report of Philip Alston, UN Special Rapporteur on extreme poverty and human rights

[2] [2019] EWHC 2341 (Admin); [2020] EWCA Civ 1058 (CA)

[3] [1969] AC 147

[4] Of course, competing IPR and confidentiality considerations will be a relevant counter-consideration for lawmakers here.
