10/07/2024

ICO Update - Enforcement Action / Reprimands

Action related to security breaches

Monetary penalties

Other ICO News

What is on the ICO’s regulatory radar?

Consultations

New Guidance

Law and Policy Update

Other

 

1.    ICO Update - Enforcement Action / Reprimands

Action related to security breaches

ICO issues reprimands to organisations for inadvertently making personal data publicly accessible

The ICO has issued reprimands to the Mayor’s Office for Policing and Crime (MOPAC) and Clyde Valley Housing Association for inadvertently making personal data, contained within webforms and a customer portal, accessible to the wider public. 

Whilst MOPAC dealt with the breach quickly, Clyde Valley Housing Association was criticised for its delayed response: data remained viewable for five days after the breach was first reported.

These cases highlight the importance of undertaking thorough testing when launching new products and services, as well as providing adequate staff training around granting permissions on webforms, online portals and other similar databases. Organisations should also ensure they have a proper escalation process for breach notifications in place. 

The Central Young Men’s Christian Association receives a reprimand and monetary penalty for using ‘CC’ rather than ‘BCC’ when sending email 

The Central Young Men’s Christian Association (Central YMCA) sent an email to individuals participating in a programme for people living with HIV. The use of “CC” rather than “BCC” meant that 166 individuals could be identified, or potentially identified, from their email address. The Central YMCA has been fined £7,500 and issued with a reprimand.

This should serve as a warning to organisations using mass emails to send information to recipients. In this case, even though the email itself did not contain personal data, the combination of the content of the email along with the recipient list meant it was possible to infer that the recipients were each living with HIV. This could be avoided by undertaking some basic staff training and offering simple guidance.

The level of fine here is relatively low given the severity of the breach, reflecting the ICO’s more lenient approach to enforcement action against public bodies, charities and not-for-profit organisations. 
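
For organisations that send bulk emails programmatically rather than through a mail client, the same principle applies: individual recipient addresses should never appear in the visible ‘To’ or ‘CC’ headers. The sketch below is a minimal, hypothetical illustration using Python’s standard email libraries; the addresses and SMTP server are placeholders and are not drawn from this case.

    import smtplib
    from email.message import EmailMessage

    # Hypothetical recipient list and SMTP details, for illustration only.
    recipients = ["alice@example.com", "bob@example.com"]

    msg = EmailMessage()
    msg["Subject"] = "Programme update"
    msg["From"] = "programme@example.org"
    # Address the message to a generic mailbox; the real recipients are passed
    # only as envelope recipients below, so they never appear in the headers.
    msg["To"] = "undisclosed-recipients@example.org"
    msg.set_content("Dear participant, ...")

    with smtplib.SMTP("smtp.example.org") as server:
        # Passing to_addrs here delivers to each recipient without exposing
        # the full list in the To or Cc headers (the effect of using BCC).
        server.send_message(msg, to_addrs=recipients)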

Chief Constable of Kent Police issued reprimand for not explaining how personal data would be processed

Kent Police have been reprimanded following an incident in which a police officer took a photograph of an individual’s identity document using a personal mobile phone and uploaded the image onto ‘Telegram’, a social media application. It appears the Telegram distribution group, onto which the image was uploaded, was being used by multiple UK police forces and international law enforcement agencies for the purpose of combatting vehicle crime. 

The police officer in question did not inform the individual that further processing of his personal data would take place; how it would be processed; or the purpose for doing so. Additionally, Kent Police ‘failed to ensure officers were adequately informed that the use of personal devices to process data obtained in their official duties was not acceptable.’

It is important to ensure that individuals are made fully aware of how their personal data will be processed. It may not always be possible to explain this to each individual every time, but organisations should ensure they have robust privacy policies in place which clearly set out all the processing activities which are taking place.

A further consideration here should be around the use of personal devices in the workplace. Organisations must have clear policies in place which deal with when and how personal devices may be used. Please get in touch if you would like assistance with putting an appropriate policy in place.


Monetary penalties 

In April 2024, the ICO issued two sets of enforcement notices and monetary penalties for infringement of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). While most practitioners will be acutely aware of the Data Protection Act 2018 (DPA 2018) and the UK General Data Protection Regulation, this set of fines is a useful reminder that PECR remains a significant factor to consider, especially as it is a real focus area for enforcement action by the ICO, with very limited tolerance for non-compliance.

Outsource Strategies

Outsource Strategies was held responsible for just over 1 million unwanted marketing calls made between 11 February 2021 and 22 March 2022 to numbers registered with the Telephone Preference Service (TPS), resulting in 74 complaints to the ICO. Following investigation, Outsource Strategies was unable to provide a sufficient explanation for these breaches, and the ICO issued a fine of £240,000, as well as an enforcement notice requiring Outsource Strategies to i) not call individuals who have opted out of calls or who are registered with the TPS; and ii) ensure caller details are provided where calls are made. In effect, point ii) ensures that the ICO can resort to more significant compliance tools if Outsource Strategies fails to remedy its breaches.

Dr Telemarketing

Between 11 February 2021 and 24 January 2022, Dr Telemarketing carried out 80,240 connected direct marketing calls to TPS subscribers, receiving two complaints in response. Following investigation, it received a fine of £100,000, and an enforcement notice requiring Dr Telemarketing to not call individuals who have opted out of calls or who are registered with the TPS.

Overall, this pair of actions serves as a timely reminder for organisations to be especially careful about compliance in this area and to ensure TPS screening takes place before any marketing calls are made. 
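
By way of illustration only, the screening step can be as simple as checking each number in a campaign list against a suppression set built from the current TPS register and the organisation’s own opt-out records before any call is made. The sketch below is a minimal, hypothetical example and the numbers shown are placeholders.

    def normalise(number: str) -> str:
        """Strip spaces and punctuation so numbers compare consistently."""
        return "".join(ch for ch in number if ch.isdigit())

    def screen_call_list(call_list: list[str], suppressed: set[str]) -> list[str]:
        """Return only the numbers that do not appear on the suppression list."""
        suppressed_normalised = {normalise(n) for n in suppressed}
        return [n for n in call_list if normalise(n) not in suppressed_normalised]

    # Hard-coded example; in practice the suppression set would be built from
    # the latest TPS register and internal opt-out records.
    suppressed = {"020 7946 0000"}
    campaign_list = ["020 7946 0000", "020 7946 0001"]
    print(screen_call_list(campaign_list, suppressed))  # ['020 7946 0001']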

Please get in touch if you would like us to advise your organisation further on its marketing practices and compliance with the complex rules in this area.

 

2.    Other ICO News

What is on the ICO’s regulatory radar?

Information Commissioner calls for senior leads to take FOIA obligations and transparency seriously 

On 4 March 2024, the ICO announced a new wave of FOI actions against: Sussex Police; South Yorkshire Police; the Department for Education (DfE); the Foreign, Commonwealth and Development Office (FCDO); and the Financial Ombudsman Service (FOS). 

Sussex Police and South Yorkshire Police received enforcement notices, with the ICO noting in particular that the latter’s FOI request response rate was “unacceptable on any level”, at less than 18 percent within the initial 20 working day deadline. Sussex Police was reported to have a response rate of 27 percent within the initial 20 working day deadline. Both organisations had significant backlogs, and are required to handle all requests older than 20 working days by 31 August 2024; this is likely to be a significant task for both, especially where their existing FOI teams are likely to be competing for resources with day-to-day demands.

In comparison, the DfE, FCDO and FOS handled under 80 percent, under 55 percent and under 80 percent of requests in time, respectively.

In the wake of these actions, it’s clear that FOI resourcing pressures are being felt across the public sector as a whole. Nevertheless, the ICO will continue to pursue poor performance, and organisations should strongly consider preventative measures now rather than being forced to take more significant and comprehensive steps at short notice and with less opportunity for planning and budget forecasting.

We have significant experience in this area, so please do get in touch to see how we can assist in clearing any FOIA response backlog you may have. 

ICO sets out priorities to protect children’s privacy online

The ICO has confirmed its strategy to improve the protection of children’s privacy online and has set out its focus over the coming months on driving social media and video sharing platforms to make more progress to safeguard children’s personal data, given the significant use of these platforms by children. 

The ICO notes in particular that processing of children’s personal information may not always be in their best interests on social media and video sharing platforms and can increase the potential for significant harm to children. Areas of particular concern include default privacy and geolocation settings, profiling children for targeted advertisements, using children’s information in recommender systems and using information of children under 13 years old. 

As a consequence, the ICO will be aiming to identify the most serious risks to children’s privacy in these areas and working to reduce or eliminate them. This will include gathering evidence to further their understanding of this area and identifying the key social media sites they will engage with, engaging with relevant stakeholders, and using supervision and enforcement powers.

Organisations processing children’s personal data should be mindful of the ICO’s strategy in this area and monitor any findings that the ICO makes to ensure continuing compliance. 

ICO to investigate 23andMe data breach with Canadian counterpart 

The ICO, alongside the Canadian regulator (the Office of the Privacy Commissioner of Canada), is launching a joint investigation into ‘23andMe’.

This follows the data breach that occurred in October 2023, where the genomics testing company was hacked and a significant amount of raw data was dumped onto internet forums – exposing information as sensitive as the full names, sex and ethnic and/or religious background of approximately 1 million ‘23andMe’ customers.

The ICO and the Canadian regulator aim to examine the scope of information exposed by the breach and the potential harms; whether ‘23andMe’ had adequate safeguards in place to prevent such a data breach; and whether ‘23andMe’ provided adequate notification about the breach to both regulators and affected parties.

This story further highlights the necessity for organisations, especially those handling people’s most sensitive data, to invest in information security and compliance. 

 

Consultations

ICO launches “consent or pay” call for views 

On 6 March 2024, the ICO called for views on the controversial “consent or pay” model, which offers individuals the choice of accessing a service at no additional cost if they consent to their personal data being used for personalised advertising or, if consent is not provided, paying a fee (or an increased fee) to access the service. 

The ICO notes in its initial views and guidance that, in principle, ‘consent or pay’ mechanisms are not unlawful, but organisations using this mechanism must ensure that consent to the processing of personal information for personalised advertising is freely given and fully informed, with the ability to withdraw consent. The ICO expects organisations at a minimum to consider the following factors before introducing any ‘consent or pay’ mechanism:

  • The balance of power – Consent is unlikely to be freely given when people have little choice over whether or not to use a service – for example if a provider occupies a dominant market position; 
  • Equivalence – Are the ad-funded service and the paid-for service essentially the same?
  • Appropriate fee – Consent is unlikely to be freely given if the requested fee is disproportionate; and
  • Privacy by Design – Is the choice presented fairly and equally with clear, understandable information?

Note, guidance provided by the ICO on this topic is its ‘emerging thinking’ and cannot at this stage be relied upon to ensure compliance with data protection law. The outcome of this consultation will be of particular interest to those clients that are ad-funded and that rely on online advertising. We will continue to provide updates on the model in this publication.  

Information Commissioner’s Office seeks views on accuracy of generative AI models

In April 2024, the ICO launched its latest consultation, looking at the development and use of generative AI (‘GenAI’). This consultation was particularly focused on accuracy and how it relates to the outputs of GenAI models, and on the requirement for accurate training data.

Accuracy will be one of the key factors for any use of GenAI, in particular for business purposes. An inaccurate GenAI system will either need substantial human monitoring and correction, or could expose an organisation to significant legal liability – such as the US car dealership whose chatbot started selling cars for $1. More practically, GenAI used for activities such as reviewing scans, medical records, files or similar material to draw conclusions will open up significant exposure to liability where key conclusions or findings are missed.

To achieve accuracy, AI developers will have to consider both the quality of the data used to train the AI and the accuracy of the GenAI algorithm itself. Companies looking to implement GenAI systems will need to ensure they have access to sufficient skills (internal or external) to vet and understand the accuracy of those systems. This may include running their own tests, or engaging with developers to understand their reported results and techniques; a simple illustration of the kind of in-house check involved is sketched below. As GenAI rolls out, such skills and understanding will be key to assessing its uses and the risk (or lack thereof) that attaches to them.
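
A minimal sketch of such a test, assuming a hypothetical labelled evaluation set and using a simple exact-match comparison (real-world evaluation will usually need more nuanced scoring), might look like the following:

    from typing import Callable

    def evaluate_accuracy(model: Callable[[str], str],
                          test_cases: list[tuple[str, str]]) -> float:
        """Run the model over a labelled test set and return the share of
        outputs matching the expected answer (exact match, for simplicity)."""
        correct = 0
        for prompt, expected in test_cases:
            if model(prompt).strip().lower() == expected.strip().lower():
                correct += 1
        return correct / len(test_cases) if test_cases else 0.0

    # Placeholder model and test set, for illustration only.
    def dummy_model(prompt: str) -> str:
        return "unknown"

    cases = [("What is the standard rate of VAT?", "20%"),
             ("What is the capital of France?", "Paris")]
    print(f"Accuracy: {evaluate_accuracy(dummy_model, cases):.0%}")

The point is not the code itself but the discipline: agreeing what “correct” means for your use case, keeping a representative test set, and re-running it whenever the model or its configuration changes.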

 

New Guidance

ICO publishes new fining guidance 

On 19 March 2024, the Information Commissioner’s Office published new data protection fining guidance setting out how it decides to issue penalties and calculate fines under the UK GDPR. Publication of the guidance follows a consultation last year, where views were gathered on a draft version. The new guidance replaces the sections about penalty notices in the ICO Regulatory Action Policy published in November 2018. 

The guidance includes a five-step procedure setting out the factors that are taken into account when determining how serious an infringement is, including mitigating and aggravating factors. No concrete figures are included in the guidance, and it is made clear that infringements will continue to be considered on a case-by-case basis, with wide discretion exercised by the ICO rather than a formulaic approach being applied. 

The guidance does, however, provide a good indication of what the ICO might consider significant in increasing or decreasing a fine. For example, the guidance makes clear that:

  • when considering the number of data subjects affected it will also consider those potentially affected;
  • that systematic and extensive profiling could increase the seriousness of the infringement;
  • discrimination is a type of harm that will be considered when levying a fine;
  • the types of data that could increase the seriousness of a breach. Location data, private communications data, passport details and financial data are given as examples of data that may be regarded as particularly sensitive;
  • mitigating factors that arise before the ICO is made aware of a breach are more likely to be taken into account; and
  • delayed responses to the ICO could increase the seriousness of an infringement.

Organisations will likely want to consult this guidance when trying to calculate financial risk in connection with processing activities and breaches. 

ICO publishes guidance to improve transparency in health and social care

On 15 April 2024, the ICO released sector-specific guidance covering transparency in the health and social care sectors. The guidance is particularly significant for organisations in the social care sector, where residential homes have recently faced renewed interest from the ICO, and for the health sector, which is a perennial area of focus for the ICO given the amount of sensitive personal data it relies on.

The guidance includes:

  • Why transparency is important;
  • How to provide transparency and privacy information; and
  • How to assess whether you are being transparent.

The guidance in particular focuses on the need to ensure your transparency obligations are fulfilled, and notes that the effort required to fulfil those obligations will depend on the nature of your processing and the nature of the data subjects in question. In the health and social care sectors, the personal data in question will often be quite sensitive, and data subjects will often be vulnerable (for example children, the elderly or individuals lacking mental capacity to some degree), or will otherwise be distressed and may require simpler and more understandable phrasing. 

Transparency, as well as being a key data protection principle, will often be the most effective way of heading off complaints from data subjects – by ensuring subjects understand how their personal data will be used, they are less likely to complain about or challenge that processing. Complying with this new guidance will therefore be both a regulatory and a pragmatic good for organisations.

Director’s update: the FOI year in review

Warren Seddon, the ICO’s Director of Freedom of Information and Transparency, published an update on 24 April 2024 summarising the last year and the year ahead for the FOI team. The update makes clear the ICO will continue to prioritise FOIA as an essential part of delivering its ICO25 commitments to promote openness and transparency across the public sector. 

Seddon comments on how his team aims to make simple changes to improve its service such as the introduction of the requirement for public bodies to respond more quickly to decisions taken by the ICO in response to public complaints. He states “from now on, where we find a public body has got things wrong, they will have 30 days to comply with a decision, bringing this into line with information notices”.

Seddon notes in particular the sheer volume of cases coming to the team: it received more than 8,000 complaints last year, against a previous high of 6,418 in 2018/19. It is acknowledged that the pressure faced by the ICO in this area is replicated in the volume of requests public authorities are receiving and the effort it takes to support FOI rights and respond to ICO complaints. With this in mind, FOI practitioners are encouraged in the publication to respond to the ICO’s survey and give feedback on the upstream tools and guidance offered by the ICO in this area.

Over the year ahead, and to help manage the significant number of cases, the ICO is looking to continue to use its powers in more innovative ways. Examples include sampling performance in higher education institutions, which led to engagement with a university over unacceptable performance and resulted in swift action by the university to clear a significant backlog without the need for enforcement action. The ICO has also worked with legal colleagues to develop a process for referring a public authority for contempt of court if it fails to comply with a statutory decision or notice, and the ICO is now in a better position to pursue this route if necessary. 

A joint statement by Ofcom and the Information Commissioner’s Office on collaboration on the regulation of online services

On 1 May 2024 the ICO and Ofcom published a joint statement setting out how they will collaborate to implement the vision set out in their 2022 statement for a clear and coherent regulatory landscape for online services which complies with both regimes. 

The ICO and Ofcom plan to identify and regularly review collaboration themes which are relevant to both data privacy and online safety. The statement identifies the initial collaboration themes and confirms that staff from both regulators will meet on a regular basis to discuss them and identify emerging collaboration themes.

The initial collaboration themes are:

  • Age assurance – this includes age verification and age estimation so that services can be tailored to the user and restrictions put in place based on age.
  • Recommender systems – these are systems that use algorithms to curate recommendations of content to users.
  • Proactive tech and relevant AI tools – Proactive tech is defined in the Online Safety Act 2023 (OSA) and essentially covers content identification, user profiling and behavioural identification technologies.
  • Default settings and geolocation settings for children – these cover services’ default settings for children.
  • Online safety privacy duties – the OSA requires services to have regard to the importance of protecting users from a breach of privacy when setting standards and policies.
  • Upholding terms, policies and community standards – both the OSA and ICO’s children code contain requirements as to a service’s terms and conditions. 

These themes indicate where Ofcom and the ICO are likely to focus their attention in the coming months. It would be prudent for online service providers to proactively assess their approach in these areas to ensure that they are ready for scrutiny.

There is also a new joint advisory service, launched by the Digital Regulation Cooperation Forum, to support innovators by providing informal advice on complex regulatory questions that cross more than one regulator’s remit. By submitting queries through the hub, organisations can receive co-ordinated and tailored advice from both the ICO and Ofcom. 

Learning from the mistakes of others – A retrospective review

This ICO review, published on 10 May 2024, uses case studies to highlight the most significant threats to securing information and safeguarding people, and supports organisations in improving their knowledge of common security pitfalls. The review focuses on the main causes of security breaches aside from ransomware: phishing, brute force attacks, denial of service, errors and supply chain attacks. The guidance provides a helpful summary of each of these, including key principles to consider when trying to mitigate or reduce the level of harm, and possible developments that might have an impact in these categories in the future. The full report is available on the ICO’s website.

ICO Statement on its public sector approach trial

In June 2022, the ICO started a two-year trial of a revised approach to working with public bodies, reducing the use of fines and adopting alternative measures so as to lessen the impact on the public purse.

On 26 June 2024, the ICO issued a statement confirming that, while it has continued to issue fines where appropriate, it has also used alternative regulatory tools to enforce the law and ensure that money is not diverted from where it is needed most. 

The ICO is now reviewing the two-year trial and will continue to apply the approach taken over the past two years until the review concludes in autumn. At this point the ICO will make a decision on what approach will be taken going forwards. 

 

3.    Law and Policy Update

Social Tenant Access to Information Requirements (STAIRs) consultation begins

On 20 May 2024, the STAIRs consultation opened. The aim of STAIRs is to achieve greater transparency from registered providers (RPs) of social housing, and to ensure that tenants are able to access information about the management of their housing. The consultation puts forward a proposed standard to achieve this.

The publication scheme: RPs will be required to proactively publish certain information relating to their management of social housing, as set out in the publication scheme. RPs will be required to review and update that information on a regular basis, and to ensure it is accessible to all of their tenants. Additionally, where information is provided to a tenant in response to a request for information (as per the below), RPs should consider whether this should also be published under the publication scheme.

Requests for information: Only tenants and their representatives will be able to access information. Where a representative is being designated, the tenant must identify that representative to their RP. Requests must be made in writing. Information requests must relate to issues relevant to the management of social housing. Requests must be dealt with within 30 calendar days from receipt, except in exceptional circumstances.

Subcontractors: Where information is held by an RP’s subcontractor, the RP will be required to use ‘all reasonable endeavours’ to obtain the information from the subcontractor and provide it to the tenant. Consequently, RPs should review their contracts with subcontractors in order to ensure they contain an obligation to comply with this principle.

Grounds for withholding information: The consultation sets out a number of procedural grounds on which information may be withheld. Additionally, the types of information which would be exempt from disclosure under data protection legislation, including the Freedom of Information Act, would also be exempt from disclosure under STAIRs.

Complaints: Complaints should first be raised with the RP. If the individual is still dissatisfied, this can then be escalated to the Housing Ombudsman. Where appropriate, the Ombudsman will also be able to refer issues indicative of systemic failure to the Regulator to investigate further, as it would for other matters.

TMOs: The consultation recognises that tenant management organisations (TMOs) fall in a middle ground and tenants of TMOs may have difficulties accessing information from their landlord. Consequently, the consultation is seeking views on whether STAIRs should be extended to include information held by TMOs who are contracted to manage housing on behalf of local authorities. Local authority tenants are already able to access information under the Freedom of Information Act.

The consultation closes on 15 July 2024.

Get in touch with our Information Law & Privacy team if you would like to discuss how your organisation can prepare for the introduction of STAIRs.

Social Tenant Access to Information Requirements: consultation

Experian appeal dismissed

The ICO issued an enforcement notice in October 2020 against Experian for violations of the transparency requirements under the GDPR, following a two-year investigation into how the company and two other major credit reference agencies were using the personal information of UK adults for direct marketing purposes. Experian appealed the notice and the First-tier Tribunal (FTT) significantly modified the terms of the enforcement notice, holding largely in favour of Experian. 

The ICO appealed the decision of the FTT to the Upper Tribunal, claiming the FTT had made a number of errors of law in reaching its decision. The Upper Tribunal dismissed the ICO’s appeal on 23 April 2024. Stephen Bonner, ICO Deputy Commissioner, said: “It is regrettable that the flaws identified by the Upper Tribunal [have not extended to] overturning the First-tier Tribunal’s judgment, but there is definite value on the clarity we have now received on several key points. The ICO will take stock of today’s judgment and carefully consider our next steps, including whether to appeal."

The ruling provides guidance on the transparency principle under the UK GDPR and, in particular, how it applies to individuals whose data has been obtained indirectly from third parties. The Upper Tribunal noted that the requirements for transparency will be context-specific from case to case and underpinned by considerations of proportionality. The list of information to provide to data subjects in Articles 13 and 14 of the UK GDPR is just the ‘basic minimum’ that needs to be provided; more information may need to be provided, depending on the circumstances, to comply with the transparency principle. 

DPDI Bill Update

The long-awaited Data Protection and Digital Information (DPDI) Bill has been dropped, following the announcement of the general election. 

The DPDI Bill had passed through the House of Commons and completed the committee stage in the House of Lords, with the report stage due. Because the DPDI Bill had not passed through both Houses, it could not be dealt with in the wash-up period before Parliament was prorogued on 24 May 2024.

Data Matters will continue to monitor the DPDI Bill, and whether it is reintroduced by the next government.

 

4.    Other

Five Top Tips for Data Security

At a time of increasing numbers of cyber-attacks, we’ve put together five top tips that your organisation can put in place now.

1. Update your software

Software updates usually contain a fix for a known vulnerability and can have a significant impact on minimising your risk. Cyber attackers will often use a known weakness in a particular application to get access to your systems, so the sooner you apply the “fix”, the lower the risk of an attack.

2. Know what you’re using, how it’s linked, and whether it’s supported

We are all used to using multiple different applications in our daily lives. Understanding which systems are linked, and what information is used by each application, helps you understand where the high-risk data is and where the crucial entry points are (where someone might be able to access most of your systems if they get in via one particular system), and therefore where to focus your efforts.

3. Train your staff

Phishing, where an email is sent to encourage an individual to click on a link or enter their password so that malware is downloaded or credentials are revealed, is the number one cause of cyber security incidents. Training makes an individual less likely to click a suspicious link, or open a suspicious email, which, in turn, cuts down the risk of a successful phishing attack. 

4. Put a disaster recovery plan in place

A disaster recovery plan (or business continuity plan) sets out what happens in the event that you suffer a major cyber-attack. It should be a practical document that any individual can follow, setting out who needs to be made aware and within what time frames, who is responsible for what, and what steps you will take. Having some simple steps and clearly defined responsibilities can really help when the initial panic sets in.

5. Back up your information

Make sure that your systems are backed up regularly, that you know where the back-up is held, and that you know how to access it. 
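
By way of illustration only, a minimal back-up routine might create a dated archive and then verify that the archive can actually be opened – a back-up you cannot restore is no back-up at all. The paths below are placeholders, and most organisations will rely on dedicated back-up tooling rather than a script like this:

    import datetime
    import pathlib
    import shutil
    import zipfile

    def back_up(source_dir: str, backup_root: str) -> pathlib.Path:
        """Create a dated zip archive of source_dir under backup_root and
        check that the archive opens and passes the built-in CRC test."""
        stamp = datetime.date.today().isoformat()
        archive_base = pathlib.Path(backup_root) / f"backup-{stamp}"
        archive = shutil.make_archive(str(archive_base), "zip", source_dir)
        with zipfile.ZipFile(archive) as zf:
            if zf.testzip() is not None:
                raise RuntimeError("Backup archive failed verification")
        return pathlib.Path(archive)

    # Example call with placeholder paths:
    # back_up("/srv/case-files", "/mnt/backups")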

If you would like our assistance with training, or putting together a response plan, please get in touch.

 
