ICO police cloud guidance released under FOI

The Information Commissioner’s Office (ICO) has provided Police Scotland with advice on how to make its cloud deployments comply with police-specific data protection laws, but notes that the guidance “does not constitute approval for the roll-out or assurance of compliance”.

Released by the Scottish Police Authority (SPA) under freedom of information (FOI), the advice sent to Police Scotland – which comes over a year after Computer Weekly revealed its Digital Evidence Sharing Capability (DESC) pilot was rolled out with major data protection concerns – provides further detail on the ICO’s stance that UK police can legally use hyperscale public cloud infrastructure.

While the regulator previously confirmed to Computer Weekly in January 2024 that it believed UK police can legally use cloud services that send sensitive law enforcement data overseas with “appropriate protections” in place, it declined to specify what these protections are.

The advice released under FOI now clarifies that the ICO believes compliance can be achieved through the use of interrelated international agreements, namely the UK’s International Data Transfer Agreements (IDTA) or the Addendum to the European Union’s Standard Contractual Clauses (SCCs).

The ICO advice – signed by deputy commissioner Emily Keaney – further explains the kinds of data protection due diligence it believes are required of police forces to ensure data flows are properly mapped and authorised, and clarifies the pathways through which the US government can access the policing data via the Cloud Act, which allows US authorities to access data from communication providers operating in its jurisdiction under certain circumstances.

However, data protection experts have questioned the viability of these routes, claiming it is not clear how the ICO has concluded that these controls – which are rooted in the UK General Data Protection Regulation (GDPR) – can also be applied to the strict law enforcement-specific rules laid out in Part Three of the Data Protection Act (DPA) 2018, or whether these mechanisms can in fact prevent US government access.

Despite forces looking to the ICO for guidance on the matter, the regulator was also clear that it is up to the data controllers themselves (i.e. the policing bodies involved in DESC) to determine for themselves whether these protections would in fact make the data storage and processing taking place legal. “The ICO actually said that if you rely upon the advice and it turns out to be wrong, or you are found to have breached the Act, they can and shall still prosecute,” said independent security consultant Owen Sayers, to whom the guidance was disclosed under FOI. “So, it’s about as useful as a sunroof in a submarine.”

Legal responsibilities

Commenting on the ICO advice, legal and policy officer at Open Rights Group Mariano delli Santi said that while policing bodies have legal responsibilities as controllers to conduct all of their own due diligence – and should be expected to do so – the regulator also has a duty to supervise how public authorities are using these systems. “It doesn’t really seem like the ICO is scrutinising international data transfer issues in this area,” he said, adding that the ICO must take an active interest in pushing policing bodies to apply the law. “How are they supervising? What audits have they carried out of public authorities relying on these systems?”

Based on the same set of FOI disclosures, Computer Weekly previously reported details of discussions between Microsoft and the SPA, in which the tech giant admitted it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure.

Specifically, it showed that data hosted in Microsoft infrastructure is routinely transferred and processed overseas; that the data processing agreement in place for DESC did not cover UK-specific data protection requirements; and that while the company has the ability to make technical changes to ensure data protection compliance, it is only prepared to make these changes for DESC partners and not other policing bodies because “no-one else had asked”.

The documents also contain acknowledgements from Microsoft that international data transfers are inherent to its public cloud architecture, and that limiting transfers based on individual approvals by a police force – as required under Part Three of the DPA – “cannot be operationalised”.

Computer Weekly contacted the ICO about every aspect of the FOI disclosures – including whether Microsoft’s admissions about data sovereignty would change its advice – but it declined to answer any specific questions on the basis that it is prevented from doing so by the “pre-election period of sensitivity”.

However, a spokesperson for the ICO said: “This is a complex issue with several factors to consider, so we have taken the necessary time to review and provide our stakeholders with relevant guidance. We consider that law enforcement agencies may use cloud services that process data outside the UK where appropriate protections are in place.

“Data protection legislation is a risk-based framework which requires all organisations to be accountable for the personal information they process,” they said. “We expect all organisations, including law enforcement agencies, to appropriately assess and manage any risks associated with their own processing of personal information. We have carefully considered compliance in this area and continue to provide advice to law enforcement agencies across the UK on using technologies in a way that complies with data protection law.”

Ongoing police cloud concerns

Since Computer Weekly revealed in December 2020 that dozens of UK police forces were processing over a million people’s data unlawfully in Microsoft 365, data protection experts and police tech regulators have questioned various aspects of how hyperscale public cloud infrastructure has been deployed by UK policing, arguing that forces are currently unable to comply with the strict law enforcement-specific rules laid out in the DPA.

At the start of April 2023, Computer Weekly then revealed the Scottish government’s DESC service – contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure – was being piloted by Police Scotland despite a police watchdog raising concerns about how the use of Azure “would not be legal”.

Specifically, the police watchdog said there were a number of other unresolved high risks to data subjects, such as US government access via the Cloud Act, which effectively gives the US government access to any data, stored anywhere, by US corporations in the cloud; Microsoft’s use of generic, rather than specific, contracts; and Axon’s inability to comply with contractual clauses around data sovereignty.  

Computer Weekly also revealed that Microsoft, Axon and the ICO were all aware of these issues before processing in DESC began. The risks identified extend to every public cloud system used for a law enforcement purpose in the UK, as they are governed by the same data protection rules.

In January 2024, in response to questions from Computer Weekly about whether it also uses US-based hyperscale public cloud services for its own law enforcement processing functions, the ICO sent over a bundle of data protection impact assessments (DPIAs) – 495 pages of documents detailing a number of systems in use by the ICO.

According to these documents, the ICO is explicit that it uses a range of services that sit on Microsoft Azure cloud infrastructure for law enforcement processing purposes. However, it declined to provide any comment on its legal basis for conducting such processing, or on the extent to which its own use of these cloud services has prevented it from reaching a formal position on whether the use of these services conflicts with UK data protection rules.

The ICO advice

The regulator’s view that the use of hyperscale public cloud services by UK law enforcement bodies can be legal if “appropriate protections” are in place is outlined in emails sent to the SPA on 2 April 2024.

In the correspondence, the data regulator details two main pathways that it feels would enable DESC to comply with Part Three’s stringent transfer requirements.

“First, where UK GDPR adequacy regulations apply, in most cases, you will be able to rely on Section 75(1)(b) that you have assessed all the circumstances and decided that appropriate safeguards exist to protect the data; or second, by relying on a Section 75(1)(a) ‘legal instrument containing appropriate safeguards for protection of personal data’ which binds the recipient of the data,” said the ICO’s deputy commissioner for regulatory policy.

“We consider that the IDTA or the Addendum to the EU SCCs (the ‘Addendum’) are capable of meeting this requirement. However, you are responsible for carrying out due diligence to ensure that in the specific circumstances of your transfer, and in particular the often sensitive nature of Part 3 data, the IDTA or Addendum does provide the right level of protection.”

While the IDTA is a legal contract published by the ICO to safeguard personal data being sent outside of the UK to certain third countries, the SCCs are contracts produced by the European Commission to protect data flows from the EU.

In force since March 2022, the IDTA can be used by UK organisations as a standalone document, or the “UK Addendum” to the EU SCCs can be used to make “restricted transfers” compliant with UK data protection law. However, Sayers said this mechanism only helps with UK GDPR compliance, and does not extend to Part Three law enforcement processing.

“It’s surprising that the ICO has referred to UK GDPR adequacy in their guidance, and not Law Enforcement [LED] adequacy,” he said. “Whilst many countries enjoy GDPR adequacy from the UK and Europe, very few have LED adequacy, and it’s the latter that would be required for Policing purposes. It’s not clear how the regulator has made such a simple mistake.”

International transfers

The ICO added that, whether or not an international transfer is being made to the cloud service provider as a processor, the nature of cloud services means it is “very likely” there will be further international transfers by the cloud service provider to its sub-processors – transfers that the policing bodies, as controllers, are responsible for mapping out.

“Your responsibility (under Section 59) is to ensure that the cloud service provider only engages overseas sub-processors with your authorisation and is giving you sufficient guarantees that it has in place appropriate technical and organisational measures that are sufficient to secure that the processing will (a) meet the requirements of [Part 3] and (b) ensure the protection of the rights of the data subject,” it said.

“As part of your due diligence, for those sub-processors which are not in a country with the benefit of a UK GDPR adequacy regulation, you will need to be satisfied that the cloud service provider’s contracts with its sub processors contain a Section 75 appropriate safeguard. In the same way that you can make restricted transfers under Part 3, a cloud service provider will be able to rely on the IDTA or Addendum, provided they carry out a TRA [Transfer Risk Assessment].”

Computer Weekly contacted the ICO, Police Scotland and Microsoft for confirmation on whether any transfer risk assessments had been carried out, but did not receive a response to this point.

Further information

The advice also provides further information on how the due diligence responsibilities of policing bodies can be applied when entering into a contract with cloud service providers.

It says, for example, that police forces should take into account whether an IDTA or an Addendum is contained in the contractual commitments; whether the TRA carried out confirms it provides an adequate level of protection; and whether the processor is obliged to update the controller about changes to its list of sub-processors.

“We are aware that clarifying amendments to Part 3 DPA have been tabled under the Data Protection and Digital Information Bill, intended to provide greater legal certainty in relation to international data transfers for controllers and processors transferring personal data for law enforcement purposes,” it added.

However, according to Nicky Stewart, a former ICT chief at the UK government’s Cabinet Office, if law enforcement data controllers such as Police Scotland are relying on SCCs to provide equivalent protection to keeping all of the data in the UK, “we might as well just send all of the data to the US”.

Noting numerous legal challenges against using SCCs as a transfer mechanism for European data to the US (due to legislation such as the Cloud Act that allows the US government to access company data), she added that the guidance “seems very weak”.

Computer Weekly asked the ICO about its reliance on UK GDPR mechanisms and other claims made about the guidance, but received no specific responses to these points.

The Cloud Act

A follow-up email from the ICO’s regional manager for Scotland also provides more clarity and detail on how the US government could potentially extract UK law enforcement data from Microsoft or Axon.

They said the first pathway is for a US public authority to serve a qualifying lawful US order on an organisation which falls within US jurisdiction: “Such orders require the organisation to provide information in its possession, custody, or control regardless of where in the world that information is stored.

“Information processed by a UK company may be accessed via this pathway by an order served directly on the UK company (if US jurisdiction can be established) or indirectly by an order served on the US parent company (if it can be established that the US parent company has the necessary possession, custody, or control of the requested data).”

They added that the second pathway is for a US authority to serve an order on a UK communication service provider under the UK-US Data Access Agreement: “This Agreement incorporates additional safeguards, in particular preventing access to data relating to individuals located in the UK and the use of obtained data in death penalty cases.”

They noted that while the ICO does not consider that policing bodies covered by Part Three must stop using cloud services because of concerns over the Cloud Act and data protection compliance, the Act does not alter organisations’ data protection obligations.

“Whichever pathway is used, UK data protection law provides safeguards for individuals and each request must be considered individually on its merits,” they said. “For both pathways, in practice, recipients of requests may find they need to open a dialogue with the US public authority making the request (or with the US Department of Justice’s Office of International Affairs for orders made under the UK-US DAA), for example, in order to clarify or verify the legality of the request and ensure their response complies with UK data protection law.”

Generic advice

Commenting on the Cloud Act elements of the ICO advice, Delli Santi further described it as “generic”, and noted the efforts of Dutch public sector bodies to proactively identify, map and mitigate various risks associated with the use of Microsoft Teams, OneDrive, SharePoint and Azure Active Directory.

A DPIA on the use of these services commissioned by the Dutch Ministry of Justice said that although Microsoft mitigated a number of risks identified by the assessment, the fact that the data can be ordered through the Cloud Act means “there is a high risk for the processing of sensitive and special categories of data … as long as the organisation cannot control its own encryption keys.

“Even if the likelihood of occurrence is extremely low, the impact on data subjects in case of disclosure of their sensitive and special categories of personal data to US law enforcement or security services can be extremely high,” it said. “This is due to the lack of notification and the lack of an effective means of redress for EU citizens. This risk even occurs when these data are exclusively processed and stored in the EU.”

For Delli Santi, given everything that is public knowledge about how these systems work, it raises the question of “why don’t they [the ICO] just straight-up conduct an audit? To me, it seems like there’s a lot of smoke, so maybe you want to check if there’s something burning.”

While the SPA DPIA for DESC explicitly noted that the encryption keys are held by Axon, rather than Police Scotland, the ICO advice does not mention anything about the need for organisations to control their own keys; nor does it mention the fact that encryption is not considered to be a relevant or effective safeguard under Part Three (as Part Three does not allow for “supplementary measures” that would enable data to be sent to jurisdictions with demonstrably lower data protection standards, such as the US).

Computer Weekly asked the ICO whether it has conducted any audits, as well as the ICO’s view on encryption, but received no response on these points.

For the avoidance of doubt, figure it out

While the ICO advice already explicitly stated that police forces must do their own due diligence on whether the IDTA or the Addendum would make their transfers via hyperscale public cloud architecture compliant, the follow-up email outlining details of the Cloud Act takes it further by stating that its advice should not be taken as ICO approval or assurance of the deployment.

“For the avoidance of doubt, the advice we have provided is under our general duty to provide advice and support, and does not constitute approval for the roll-out or assurance of compliance under data protection law,” it said. “The advice does not compromise our ability to use our regulatory powers in the future should any infringements come to light.”

Computer Weekly asked the ICO about the source of its advice, and whether the ICO sought its own legal advice to inform its guidance for DESC, but received no response on these points.

Computer Weekly also asked whether it is realistic – given the poor state of due diligence throughout the criminal justice sector in relation to cloud deployments – to expect police forces to accurately assess the risks and ensure all Part Three requirements are being met, but received no response on this point.

Commenting on the guidance, Stewart said that outlining the appropriate protections while putting all the legal risk back on Police Scotland “doesn’t seem to be particularly helpful”.

In terms of climbing out of the situation, she said that while there is no easy fix, there are options: either backtracking out of Microsoft deployments and migrating all of the data over to Part Three-compliant cloud suppliers, or having Microsoft deploy solutions that are “effectively wholly sovereign”, and which are able to buffer US government access and “follow the sun” arrangements.

However, she added that this will clearly drive up cost: “Either way, it’s going to be more expensive, and I suspect fundamentally what this is boiling down to is the cost to Microsoft to make concessions, or to the police forces.”

Sayers broadly agreed, but noted that making the necessary changes to Microsoft’s terms of service and technical platform would not be trivial. “I raised this with Microsoft in emails in Q1 2019, and laid out all of the steps they would need to take to comply with the DPA,” he said.

“They elected not to make those changes, but instead to rely on Police Forces doing their diligence to confirm the suitability or otherwise of their services. It’s taken some time for someone to ask them the right questions, but clearly now the SPA have done so, Microsoft have been open that their service doesn’t meet the requirements today.”

Scottish biometrics commissioner Brian Plastow – who issued Police Scotland with a formal information notice over DESC in April 2023 and previously shared concerns about unauthorised access to Scottish law enforcement data in an open letter published in October 2023 – said the ongoing uncertainty around police cloud deployments would benefit from a formal investigation by the ICO.

“I would welcome an investigation by the ICO into whether the specific law enforcement processing arrangements for DESC by Police Scotland and DESC partners in Scotland, which includes biometric data, is fully compliant with UK data protection law,” he said.

“Principle 10 of the Scottish Biometrics Commissioner’s Code of Practice approved by the Scottish Parliament in November 2020 also requires Police Scotland to ensure that biometric data is protected from unauthorised access and unauthorised disclosure in accordance with UK GDPR and the Data Protection Act 2018,” said Plastow.

“Therefore, compliance with the ICO requirements is a key compliance feature of the Scottish Code of Practice. However, only the ICO has the statutory authority to determine compliance (or not) with UK data protection law, and it would appear that the ongoing level of uncertainty around DESC is such that it would benefit from specific investigation by the ICO.”

Given the ICO’s own use of Azure for law enforcement processing, Computer Weekly asked whether this had an impact on its decision-making, but received no response on this point.

Sayers said that, as a regulator, the ICO should never have offshored Part Three data from the UK. “Yet their own DPIAs show they knew they were doing so even before this Microsoft information was received,” he said. “They’ve repeated the same mistake as lots of other UK public sector bodies by assuming that because Microsoft have some UK datacentres, this means the data actually stays in, and is supported from, those locations. That’s not how Public Cloud actually works.”

Sayers added that the ICO must answer questions about what steps it has taken to address this processing itself, as well as how it came to the conclusion that a hyperscale cloud could meet its needs, given it is constrained by Section 73(4) of the DPA from sending this type of data outside of the UK to an IT service provider.

However, while the ICO noted that the policing bodies involved, as data controllers, are responsible for ensuring DESC compliance prior to its roll-out, the regulator previously let the pilot go ahead with live personal data in full view of the risks.

Although this has been public knowledge since Computer Weekly initially reported on DESC in April 2023, the new correspondence disclosed to Sayers provides further detail on why the ICO and Police Scotland did not undertake a formal consultation process, despite both parties being aware of the data protection concerns. This will be covered in an upcoming Computer Weekly story.