05/03/2024

Information Commissioner's Office (ICO) Update

1. ICO Enforcement Action

Greater Manchester Police issued enforcement notice over failure to clear FOI backlog

The ICO has issued an enforcement notice to Greater Manchester Police (GMP) over its failure to clear a backlog of 850 overdue FOI requests, the oldest of which was submitted almost two and a half years ago. GMP has repeatedly failed to comply with FOI requests and was issued with a practice recommendation in February 2023. It failed to take appropriate action, so an enforcement notice has now been issued. The ICO’s commentary on this is available here.

Organisations should review their FOI processes and assess whether they currently have a backlog of FOI requests. The ICO is clearly looking closely at public authorities that are not complying with their statutory responsibilities, and is prepared to take formal enforcement action against those found to be in breach. Also consider whether training may be required for staff dealing with FOI requests. Please get in touch with our Information Law & Privacy team if you would like assistance with this.

Bevan Brittan recently held a webinar covering the topic of FOI backlogs; a recording of the webinar can be accessed here: Freedom of Information – Backlogs and Compliance | Bevan Brittan LLP

The presenters highlighted a number of key practical steps in managing FOI workflows:

  • Ensure stakeholders at all levels of the organisation (senior management, team leads and team members) are fully engaged;
  • Communicate to the wider organisation why it needs to cooperate with requests, and the consequences of mishandling them (in particular, releasing too much information or sensitive information which could compromise the organisation’s work);
  • Have a full suite of precedents and policies, to allow your team members to handle requests as efficiently as possible;
  • Train your team to triage requests efficiently from the outset, and to spot those which can be refused immediately (such as requests for sensitive third party service user data) or which will be complex (e.g. supplier commercial rates);
  • Consider developing an internal database or external publication scheme to allow your team to refer to previous responses or public data sources.

South Tees Hospitals NHS Foundation Trust reprimanded for “serious, harmful” data breach

South Tees Hospitals NHS Foundation Trust received a reprimand from the ICO on 25 January 2024 in relation to a data breach. The breach involved a Trust employee sending a child’s appointment letter, containing extremely sensitive information, to the address of the family of the child’s mother instead of to the child’s father. This caused significant distress to the father, the child and the wider family.

The ICO found that the Trust could provide no evidence of any formal process being in place to (i) ensure that patient data was regularly updated for accuracy, or (ii) appropriately prepare staff for their role in dealing with particularly sensitive correspondence. The Trust is now required to implement new standard operating procedures and provide further staff training to ensure data is protected and to reduce the risk of future disclosures being made in error.

As flagged by the ICO, this breach shows how even seemingly minor errors can have very serious consequences. For other organisations handling similarly sensitive data, the case highlights the need to ensure that appropriate staff training is in place to minimise the risks arising from administrative errors.

Cookie warning

In November, the ICO contacted 53 of the UK’s top 100 websites, threatening enforcement action if the websites’ cookie compliance was not improved. The ICO took this action on the back of a large number of complaints highlighting a lack of compliance with the cookie tracking requirements of the Privacy and Electronic Communications Regulations (PECR). Of those 53 websites, 42 have either committed to compliance within one further month or have already taken relevant steps. The ICO is now looking to roll out this compliance push more widely.

While the ICO is clearly taking a risk-guided approach and tackling the largest websites first, this is a clear warning to all organisations that cookies are a current enforcement priority for the ICO, alongside its existing enforcement strategy around direct marketing and PECR. In particular, the ICO is considering AI solutions to identify non-compliant cookie banners. We would strongly suggest all organisations carry out basic audits of their cookie policies and usage before the ICO fully mobilises on this issue; an illustrative first step is sketched below.
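By way of illustration only, the short Python sketch below (using the widely available requests library) lists the cookies a website sets on first load, before any interaction with a consent banner. The URL shown is a placeholder, and this approach only captures cookies set via HTTP headers; cookies set by JavaScript (which include most analytics cookies) require a browser-based audit tool or a specialist cookie scanner.

    # Illustrative first-pass cookie check: which cookies does a site set on
    # first load, before any consent is given? Note: this only captures cookies
    # set via HTTP Set-Cookie headers; cookies set by JavaScript (e.g. most
    # analytics tags) need a browser-based tool to detect.
    import requests

    # Placeholder URL - replace with the site being audited.
    response = requests.get("https://www.example.com", timeout=10)

    print("Cookies set on first load, before any consent is given:")
    for cookie in response.cookies:
        print(f"  name={cookie.name}  domain={cookie.domain}  expires={cookie.expires}")

A full audit should also reconcile what is actually set against what the published cookie policy says, and confirm that non-essential cookies are only set after consent.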

Use of facial recognition technology to monitor staff attendance deemed to be unlawful

The ICO has ordered Serco Leisure and seven associated community leisure trusts (which were acting as joint controllers with Serco Leisure) to stop using facial recognition technology and fingerprint scanning to monitor the attendance of leisure centre employees.

Serco Leisure had introduced the technologies to process 2,000 employees’ biometric data at 38 leisure facilities in order to monitor staff attendance. Employees were not offered a clear alternative to the processing, and the ICO determined that the imbalance of power between Serco Leisure and its employees meant it was unlikely that employees would feel able to object to the processing of their data.

Serco Leisure failed to show why the processing was necessary or proportionate when alternative, less intrusive methods of employee monitoring were available. In its media briefing, the ICO highlighted that a security breach involving biometric data carries a much higher risk of substantial harm because such data is wholly unique to the individual concerned.

The key takeaway here is that organisations should ensure they fully consider the risks before introducing biometric technology to monitor staff. Read the ICO’s guidance on the use of biometric data here.

2. ICO Consultations

Employment records & recruitment and selection

The ICO is producing an online resource relating to employment practices and data protection. In connection with this, it has issued draft guidance on keeping employment records and on recruitment and selection for public consultation. The draft guidance seeks to provide practical advice on how to comply with data protection law when keeping records about your workers and carrying out recruitment exercises, and to promote good practice. The consultation closes on 5 March 2024. Further information about this consultation is available here.

3. ICO Reports

Horizon scanning report

Earlier this month, the ICO released a ‘Tech Horizons’ report, highlighting what it considers to be some of the most significant privacy-related tech developments in the next 2-5 years. In the ICO’s opinion, these include:

  • Genomics (such as DNA testing or genome mapping)
  • Immersive virtual worlds (such as the Metaverse)
  • Neurotechnologies (technology which directly records and processes neurodata)
  • Quantum computing (in effect, a major leap in processing power which will pose significant challenges for encryption)
  • Commercial drone use (for example by Amazon in deliveries, or by venues for crowd management)
  • Personalised AI (such as the AI ‘boyfriends’ or ‘girlfriends’ already offered on OpenAI’s GPT Store)
  • Next-generation search (which will be connected to AI and deep personalised data sets)
  • Central bank digital currencies

Generative AI (‘GenAI’) is a notable omission from the list – however, the ICO is already carrying out a separate consultation into GenAI and large language models, such as ChatGPT (more information on which is available here).

Some of the developments in the report already raise clear privacy concerns – for example, the DNA testing company 23andMe recently suffered a significant data breach, which compromised account details and ancestry data for millions of users. Others, such as neurotechnologies, are likely to remain largely hypothetical for some time to come.

Regardless of its relevance to any particular industry, the report provides very useful insight into the ICO’s expected medium-term priorities.

 

UK Law & Policy Update

Update on DPDI Bill

The Data Protection and Digital Information Bill had its third reading in the House of Commons on 29 November 2023 and passed by 267 votes to 30. It has now moved to the House of Lords for further scrutiny. The Bill is currently at the Committee Stage in the House of Lords, and further progress is expected during the course of 2024.

The reintroduced Bill added a significant number of new clauses, on which the ICO has been consulted. The ICO's response includes concerns that some of these new clauses “amount to substantive new policy that has not been the subject of wider public consultation.” The ICO says it is “content” with and welcomes most of the changes, including, for example, the removal of Secretary of State approval over statutory ICO codes of practice and the extension of the reporting period for personal data breaches under the Privacy and Electronic Communications Regulations from 24 to 72 hours, to align with the UK GDPR. The ICO does, however, have concerns about the proposed power to require information from banks and other financial institutions relating to individuals for social security purposes (purportedly to prevent benefit fraud), noting in particular that the measure as currently drafted is not drawn tightly enough to provide appropriate safeguards.

We will continue to monitor the progress of the Bill and will keep readers apprised of any significant amendments.

Final Reminder - Deadline of 21 March 2024 to update old EU SCCs

You are required under the UK GDPR to take steps to ensure personal data is appropriately protected when it is transferred outside of the UK. Until 22 September 2022, one of the mechanisms available to put appropriate protection in place was the old European Standard Contractual Clauses (old EU SCCs).

To the extent you are still relying on old EU SCCs to transfer personal data outside of the UK, you have until 21 March 2024 to make alternative arrangements. These alternative arrangements could include putting in place either the UK’s International Data Transfer Agreement (IDTA) or the EU’s new Standard Contractual Clauses together with the international data transfer addendum (UK Addendum). Note that you are also required to carry out a transfer risk assessment when entering into a contract based on the IDTA or the UK Addendum (the ICO has issued guidance on how to carry this out, available here).

If old EU SCCs have not been replaced by 21 March 2024, continued reliance on them will be a breach of data protection law.

Update on EU/UK AI Regulation

EU

The European Union’s new AI legislation has taken a significant step forward, with the Council of the European Union and the European Parliament having reached a provisional agreement on its wording in December. This includes:

  • Safeguards on general purpose AI
  • Limitations on the use of biometric ID systems by law enforcement
  • Bans on AI designed to manipulate/exploit user vulnerabilities
  • Bans on social scoring
  • Consumer rights to launch complaints/receive meaningful explanations
  • Fines capped at between 1.5% of global turnover or €7.5 million and 7% of global turnover or €35 million, depending on the infringement

While the text will need further refinement, it is very likely that the substantive points currently included will form part of the final content, and we would suggest affected organisations start reviewing their activities now, in order to avoid the kind of short-notice compliance changes seen when the GDPR was introduced.

UK

Closer to home, in February the Department for Science, Innovation and Technology released new guidance titled Introduction to AI Assurance, available here: Introduction to AI assurance - GOV.UK (www.gov.uk)

The guidance seeks to provide a grounding in understanding AI, its use and its regulation. It will therefore be particularly useful for executive members of organisations, who may ultimately bear the risk of AI implementation but are not otherwise directly involved in its use. While the Government will no doubt consider further AI regulation in due course, this guidance provides useful context for AI risk briefings in the meantime.

If you have any questions about the issues raised in this update, please contact a member of our Information Law team.
