AI procurement in local government: legal risks and the case for clearer frameworks
08 April 2026
By Yashpreet Panesar and Amardeep Gill
Welcome to the latest edition of Trowers Tech News.
This month, we examine major developments in data protection reform, AI policy, cybersecurity regulation and the UK’s strategic push to retain home‑grown technology. From new provisions under the Data (Use and Access) Act to the Government’s evolving stance on AI and copyright, fresh EU cyber rules and significant public investment in quantum computing and AI, the regulatory and commercial landscape for technology businesses continues to shift rapidly.
We begin with the latest provisions brought into force under the Data (Use and Access) Act, including the introduction of a recognised legitimate interest as a new lawful processing basis, enhanced safeguards for children accessing online services, and a simplified approach to international data transfers. Clarifications on DSAR handling, expanded electronic marketing exemptions for charities, and statutory changes affecting research further reflect the Government’s drive to recalibrate the UK’s data protection framework.
Next, we explore the Government’s latest position on AI and copyright. While legislative reform is on hold, transparency, licensing and enforcement have been identified as priority areas as policymakers navigate the competing interests of rights holders and AI developers. With questions emerging around computer‑generated works and digital replicas, businesses using or developing AI systems should expect prolonged uncertainty and rising compliance expectations.
We also reflect on the Chancellor’s £2.5bn commitment to quantum computing and artificial intelligence, aimed at preventing UK tech businesses from relocating overseas. The investment highlights growing concern around talent and capital flight and signals a broader strategy of closer regulatory alignment with the EU. For technology companies and their advisers, this raises important questions around cross‑border investment, corporate structuring and the future regulatory environment for emerging technologies.
Finally, we assess the European Commission’s draft guidance on the Cyber Resilience Act, which provides much‑needed clarity on how the new cybersecurity regime will apply to software, hardware and open‑source ecosystems. The guidance addresses scope, lifecycle requirements, substantial modifications, component‑level risk assessments and new vulnerability reporting duties - marking a pivotal step towards full implementation.
Top tech trends: evolving digital regulation


Victoria Robertson, Partner and Chris Doherty, Associate
On 5 February 2026, a significant number of provisions of the Data (Use and Access) Act 2025 (DUAA) came into force pursuant to the Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026.
Perhaps one of the most significant changes is the introduction of "recognised legitimate interests" as a new lawful basis for processing personal data under section 70 of the DUAA. Qualifying interests include processing necessary for national security purposes, the investigation of crime, responding to requests from public bodies, and the safeguarding of vulnerable individuals. Importantly, controllers relying on this basis are not required to carry out a balancing test weighing the benefits of processing against the potential impact on individuals' rights. This is a notable departure from the existing "legitimate interests" basis under UK GDPR, which requires controllers to conduct such a balancing exercise, and it is likely to be welcomed by organisations whose processing activities fall within the qualifying categories.
The DUAA also introduces enhanced safeguard requirements for online services likely to be accessed by children. Such services must now account for children's higher protection matters, recognising both the vulnerabilities of children and the specific protections their personal data warrants. This obligation reinforces that child safety online is a compliance priority that businesses operating consumer-facing digital services must take seriously.
On international data transfers, section 85 of the DUAA reforms the process by which the Secretary of State may designate a third country or international organisation as providing adequate protection for personal data. Under the new framework, the Secretary of State must assess whether the standard of data protection in the recipient country is "not materially lower" than that afforded under UK law - a more flexible formulation than the existing adequacy standard, which is derived from the UK GDPR. Where such a designation is made by regulation, transfers of personal data to that country or organisation may proceed without the need for additional safeguards such as International Data Transfer Agreements (IDTAs).
Further changes include welcome clarifications to data subject access requests (DSARs). Controllers will now be able to formally demonstrate that seeking clarification from a requestor is reasonably required in order to respond meaningfully to a DSAR. Where clarification is requested, the applicable time limit for responding is paused until the requested information is received. Separately, the soft opt-in exemption for electronic marketing is now extended to charities, allowing them to make regular contact with existing supporters without requiring fresh consent for each communication. Additionally, the DUAA introduces a new statutory definition of "research," together with confirmation that broad consent can be given for data processing in certain research contexts. These changes should provide welcome certainty for research organisations and charities alike.
Looking ahead, the ICO has published good practice guidance on data protection complaints handling. Under the guidance, organisations must provide a formal complaints mechanism, acknowledge receipt within a reasonable period - typically 30 days - and resolve complaints without undue delay. In light of the new provisions and the ICO's updated enforcement focus, businesses should now prioritise reviewing and updating their data protection policies, procedures and records of processing activities to ensure they remain compliant.


Alice Stripe, Senior Associate and Vaughan Somerville, Associate
The UK Government has released its long-awaited report and impact assessment on copyright and AI, as required under the Data (Use and Access) Act 2025. After receiving over 11,500 responses from rights holders, AI developers, publishers and legal professionals, the reports represent the most substantial policy statement on this issue to date.
In short, the Government's previously preferred approach of a general text and data mining exception with a rights holder opt-out has been abandoned following significant opposition. Instead, the Government is taking a step back, gathering further evidence and pursuing further stakeholder engagement before any legislative reform. In place of legislative change, the Government is prioritising three areas: transparency, licensing and enforcement.
AI developers should plan for continued legal uncertainty and be aware of the EU AI Act requirements and US market developments, which are already shaping global practice. With transparency and licensing taking centre stage, all businesses using AI, whether for document summarisation or retrieval-augmented generation, should audit their copyright exposure at both the input and output stages.
The report also signals emerging issues beyond training data, including the potential removal of copyright protection for computer-generated works without a human author, and the possibility of a new personality right to address non-consensual digital replicas of voices and likenesses. While no immediate legislation is proposed on these points either, they indicate the broader direction of policy development.
Where to next? Only time will tell, as the Government has committed to a period of evidence-gathering, stakeholder engagement, and close monitoring of international developments (EU AI Act implementation and ongoing litigation) before considering any legislative reform.

Chancellor Rachel Reeves has pledged £2.5 billion of government investment in quantum computing and artificial intelligence, with an explicit goal of stopping British technology from "drifting abroad". The announcement reflects growing concern at the highest levels of government about a well-established pattern: many tech firms that start in the UK end up moving their businesses overseas, often to the US. The suggested reasons for this talent and capital flight are varied, including poor investment from the UK government and pension funds, the perceived weakness of the London Stock Exchange, and better tax breaks available elsewhere. For businesses and their advisers, this is not merely a political story - it has direct commercial and legal implications for how UK tech companies are structured, financed, and ultimately governed.
Quantum computing is expected to be far more powerful than conventional computing for certain tasks, as quantum machines can represent and process vastly more information in parallel, and it is seen by many as the next big breakthrough in technology and a key driver of economic growth. Ashley Montanaro, co-founder and CEO of British quantum firm Phasecraft, acknowledged there had been high-profile examples of UK-based firms being acquired by larger overseas companies or their founders relocating to the US, while welcoming the Chancellor's focus on retaining the UK's position as a leading destination for quantum computing.
The investment forms part of a broader growth strategy, which Reeves has paired with plans for closer ties with the EU and greater powers for the regions. She has indicated that the UK should align more closely with EU rules where it is in the national interest. Alignment in food and farm standards is already planned, but her comments potentially signal a wider push across areas such as chemicals and manufacturing.
Whether this injection of public capital will be sufficient to reverse the trend remains to be seen. What is certain is that for law firms advising in the tech sector, questions of cross-border investment, corporate restructuring, and the regulatory landscape for emerging technologies will only grow in significance in the months ahead.


Anna Horsthuis, Senior Associate and Alina Kazmi, Trainee Solicitor
On 3 March 2026, the European Commission published its first draft guidance on the EU Cyber Resilience Act (CRA). This is a pivotal moment for manufacturers, developers and distributors of hardware and software products across the EU market, providing the clearest signal yet of how mandatory cybersecurity requirements will apply in practice.
The guidance resolves several long-standing grey areas on software scope. Software made available by download or remote access is in scope, while demo and tutorial code is not. A genuine data connection is required; using electricity alone is insufficient.
On open source, responsibility turns on who controls a project through governance rather than who holds commit rights. Free software can trigger full manufacturer obligations where it is monetised.
Updates that introduce new threat vectors may constitute a substantial modification, resetting CRA obligations entirely and making the modifier the manufacturer. The five-year minimum support period is not a universal default and must reflect realistic product lifecycles, with each software version requiring its own declared period.
Product classification depends on core functionality as a whole rather than individual components, and third party and open source components must be actively risk assessed. Remote data processing falls within scope only where the product functionally depends on it and the manufacturer controls the software. Vulnerability reporting obligations include a 24 hour early warning and 72 hour full notification requirement.
For businesses in regulated sectors such as automotive or medical devices, the guidance also touches on interactions with other EU legislation and sector-specific scope exclusions. The consultation closed on 31 March 2026. Businesses should review the draft against existing CRA programmes, reassess scoping and product classifications, refine support period strategies and establish clear criteria for identifying substantial modifications.
Organisations must take appropriate security measures to protect personal data from unauthorised access.
The committee raises numerous criticisms of the current version of the Bill.
New platform Moltbook allows AI bots to post and chat like humans on Reddit but raises security and authenticity questions.
Takeaways from roundtables highlight the transformative potential and systemic risks of AI adoption across the financial sector.
61 global authorities sound the alarm over AI image generation tools, warning of significant risks to individuals' privacy.
Alexa+ becomes a chattier assistant that follows realistic conversational threads and provides proactive responses.
A major data glitch at Companies House allowed some users to alter other businesses' information.
Google's AI-generated medical advice lacks prominent safety warnings, with disclaimers appearing only in smaller font after users click for more information.