Local authorities across the UK face mounting legal and operational risk in their procurement of artificial intelligence (AI) systems, owing to the absence of clear, comprehensive, and consistent guidance on how to act in the public interest. A new report from the Ada Lovelace Institute has brought these vulnerabilities into sharp focus, prompting questions about accountability, fairness, and the adequacy of existing legislative and regulatory frameworks.

The Ada Lovelace Institute, funded by the Nuffield Foundation, reviewed 16 pieces of guidance and legislation relevant to AI procurement published between 2010 and 2024. Its analysis identified significant deficiencies in the existing framework: concepts such as fairness lack workable definitions; public benefit remains undefined in practical terms; and transparency obligations are insufficiently developed to be meaningfully enforceable or understood by those subject to AI-driven decisions.

Rather than providing clarity, the volume of Government guidance has compounded the problem. Overlapping and at times contradictory frameworks mean that procurement officers, often operating within severely constrained budgets, are left to navigate a fragmented legal and policy landscape without adequate support. The Institute found that this exposes local authorities to "significant challenges," making the risk of non-compliance or poor procurement decisions a real and present concern.

To address these deficiencies, the report puts forward a series of recommendations with direct relevance to procurement governance and legal accountability:

  • Standardised guidance with clear definitions, measurable success criteria, and explicit allocation of legal responsibilities across procuring bodies;
  • Mandatory or structured use of accountability tools, such as the Algorithmic Transparency Recording Standard, to ensure public-facing AI systems can be scrutinised and challenged; and
  • The development of impact assessment frameworks, piloted in practice, to embed due diligence into the procurement process and give affected communities a meaningful right to participate in decisions that concern them.

Imogen Parker, Associate Director at the Ada Lovelace Institute, emphasised that procurement is not merely an administrative function but a critical mechanism for ensuring legality, safety, and fairness in the delivery of public services. She noted that procurers must be in a position to stand behind the products they purchase, with confidence that both they and the public are adequately protected. Drawing on a high-profile example, Parker pointed to the Post Office Horizon scandal as evidence of the very real financial and ethical cost of failing to embed robust procurement standards, describing it as a lesson that should inform how public sector bodies approach AI adoption today.

The Local Government Association (LGA) welcomed the report's conclusions. In its response, the LGA highlighted that councils routinely encounter barriers and additional costs when seeking to innovate, and that the procurement of AI is no exception. Of particular concern to the LGA is the structure of the AI supply market: without targeted intervention, existing procurement weaknesses risk entrenching the dominance of a small number of large technology companies, undermining competition and limiting the options available to local government. The LGA called for stronger assurance frameworks and a clearer understanding of how procurement law interacts with the distinctive characteristics of the AI market.

The report's findings underscore a gap that central government must urgently address. For legal practitioners advising public sector clients, the message is clear: in the absence of a coherent statutory or regulatory framework, local authorities are exposed to significant risk every time they procure an AI system. Until that framework is in place, robust contractual protections, rigorous supplier due diligence, and careful governance documentation will be essential tools in managing that exposure.