
Monday, 13 March 2023

Data Protection & Digital Information (No.2) Bill - key changes from 2022 Bill No.1; GDPR comparisons

The UK Data Protection & Digital Information (No.2) Bill's key changes from the 2022 Bill, and how they compare with the EU GDPR, are summarised in the table below.

After the table are some "But why didn't they do that?" questions, and "Will compliance with the EU GDPR really comply with the new Bill?"

Table of Key Changes

  • Only changes from the 2022 version are covered, and only those relating to GDPR (not law enforcement or intelligence services or the DVS trust framework).
  • Clarifications/typos/minor corrections and other minor textual changes are not covered.
  • The table below is also not a full comparison of the entire Bill against the EU GDPR.

Abbreviations

  • ADM - automated decision-making
  • C - controller
  • ICO - UK Information Commissioner's Office
  • P - processor
  • PD - personal data
  • S - UK Secretary of State
  • SRI - senior responsible individual

Each entry below sets out the issue, then the change compared with the 2022 version ("Cf 2022 version"), a comparison with the EU GDPR ("Cf EU GDPR"), and comments/queries.
Personal data

Cf 2022 version: Tighter, as this specifically calls out the role of access protection measures. It's PD if C/P knows or ought reasonably to know that another person obtains, or is likely to obtain, info as a result of C/P processing and the individual is identifiable, or likely to be identifiable, by that person at the time of processing, (added) including if an unauthorised person obtains info due to the C/P not implementing appropriate measures to mitigate the risk of their obtaining the info. Clarifies: identifiability is assessed at the time of processing by C/P.

Cf EU GDPR: Focuses on whether info is PD in the hands of whoever processes it (similar to the position under the DPA 1998).

Comments/Queries: Time of processing - the time of whose processing: processing by C, P, either, or the other person?

If an individual is identifiable to C but not P, or vice versa, does that make them identifiable to both?

Why not also mention measures to mitigate the risk of unauthorised persons identifying individuals (e.g. strong encryption), rather than only measures against their obtaining the info? Surely such measures are equally important: focus on either/or, not just "obtaining"?
Legitimate interests

Cf 2022 version: New Art.6(9) gives examples of types of processing that may be necessary for LI:
- Necessary for direct marketing (defined in both versions as communication (by whatever means) of advertising or marketing material which is directed to particular individuals, and now also to be inserted into Art.4(1)(15A) UK GDPR),
- Intragroup transmission necessary for internal admin, or
- Necessary for security of network and info systems.

Cf EU GDPR: Much has been made of this. But actually it's just based on the last sentence of GDPR Rec.47, plus Recs.48 & 49, putting them into the operative text - just without the "strictly necessary", which in my view is very tight, particularly in relation to ensuring security.

However, "direct marketing" is defined more broadly than in, say, the European Commission's and the Council's approach in the draft ePrivacy Regulation - could it include targeted advertising on websites or mobile apps here?

Comments/Queries: Pity that Rec.47's necessity for preventing fraud wasn't included, or necessity for the security of PD (not just systems).

The scope of "direct marketing" would benefit from clarification, e.g. is "sent" intended, or is displaying personalised ads on web/mobile enough to be "direct marketing"?


Scientific research

Cf 2022 version: Clarified:
- Even commercial activity can be scientific research
- But activities only qualify if they can be "reasonably described as scientific"

Cf EU GDPR: GDPR doesn't define scientific research. The Bill just provides helpful clarifications, e.g. drawing on Rec.159 (GDPR doesn't explicitly exclude commercial research, and Art.89 of course requires safeguards there, which the Bill is changing).

Comments/Queries: Processing PD for studies in the area of public health is "scientific" only if conducted in the "public interest" - clarify "public interest" here? But generally that phrase isn't defined anywhere… and see queries after this table.
Statistical purposes

Cf 2022 version: Includes processing for statistical surveys or production of statistical results resulting in aggregate non-personal data, but (added) only if the controller doesn't use the personal data processed, or the resulting information, to support measures/decisions regarding a particular data subject to whom the personal data relates.

Cf EU GDPR: Just clarifications, reflecting Rec.162.

Comments/Queries: -
ADM

Cf 2022 version: Art.22A(2) no longer states that decisions include profiling. I consider this now to reflect the correct interpretation, rather than a relaxation - see the Cf EU GDPR entry below.

Instead, when considering whether there's meaningful human involvement, the extent to which the decision was reached by profiling must be considered, among other things. That's one way to interpret the profiling reference in Art.22, and it makes some sense.

S may make regulations stipulating that certain cases do, or don't, have meaningful human involvement.

Cf EU GDPR: Clarifies the debated issue of whether Art.22 only gives data subjects rights to object to ADM, or positively prohibits ADM.

Clarifies that decisions "based solely on automated processing" are those with "no meaningful human involvement".

Clarifies the role of profiling, in the debate on whether Art.22 catches profiling per se, or only profiling that leads to ADM (I believe the latter). So Art.22A(2) now reflects what I feel is the correct interpretation.

Comments/Queries: A positive prohibition usefully clarifies the position. Similarly with the meaning of automated decisions.

Data subjects aren't deprived of rights regarding ADM, because the new Art.22C safeguards must enable data subjects to obtain human intervention and to contest decisions, and individuals can no doubt claim compensation for breach of this explicit prohibition.

However, it's unclear why Sch.4 will omit s.14 DPA2018 altogether. Removing the notification requirement may reduce burdens on Cs, but retaining a positive obligation on Cs to consider requests to reconsider decisions could further help to show that data subjects do retain their ADM rights. Perhaps S regulations are intended to address this and other ADM-related issues?
ROPAs (records of processing activities)

Cf 2022 version: Needed only for processing which, taking into account its nature, scope, context and purposes, is likely to result in a high risk to the rights and freedoms of individuals - instead of the 2022 exemption for <250 employees unless processing is likely to result in high risk.

C records need only include the categories of person with whom C shares PD, rather than named persons. However, "recipients" there has been changed to "persons" in third countries/international organisations.

Amends Art.57(1)(k) to require the ICO to produce and publish a document containing examples of types of processing which it considers are likely to result in a high risk to the rights and freedoms of individuals (for the purposes of Articles 27A, 30A and 35) - i.e., senior responsible individuals, ROPAs and assessments of high-risk processing. This helps ensure a consistent view of what is considered "high-risk" across these different areas.

Cf EU GDPR: Required for all Cs and Ps, with an exemption for <250 employees unless the processing is likely to result in a risk to the rights and freedoms of data subjects, is not occasional, or includes special category or criminal-related data.

Comments/Queries: Changing "recipients" to "persons" actually goes broader than GDPR, as under GDPR Art.4(9) certain public authorities (again it's not entirely clear which) aren't considered "recipients", so this should be positive for UK adequacy as any sharing with public authorities must definitely be recorded.

Arguably, catching C/Ps even with <250 employees for high-risk processing would catch non-occasional processing of special category or criminal-related data.

While it's "high-risk" vs. "a risk", the latter catches most C/Ps; some might say it's too strict given realistic risks, especially under the EDPB's broad interpretation of Art.30(5)'s "or". So the Bill is less strict than GDPR, but hopefully that's not significant enough to prejudice UK adequacy.

It's odd that the "categories" issue relates to C records (Cs will surely know those they share PD with), rather than DSARs/privacy notices - could the change have been intended for the latter, but inadvertently got inserted here instead?
DPIAs (assessment of high-risk processing)

Cf 2022 version: Deleted the ICO's Art.35(4)-(5) obligation to publish a list of operations requiring a DPIA and power to publish a list of operations not requiring assessment.

But, see above on the amended Art.57(1)(k), which effectively does the same thing, except that there's no longer power to publish lists of operations not requiring assessment.

Cf EU GDPR: No explicit requirement to consult the DPO. However, arguably this is implicit in the new Art.27B(2)(c), informing/advising on data protection obligations.

No Art.35(3) criteria deeming certain types of processing always to be high risk (ADM, large-scale processing of special category/criminal-related data and large-scale systematic monitoring of publicly accessible areas!).

The related Art.36 makes prior consultation with the ICO optional, but see the LinkedIn discussion in the comments on whether this makes much difference in practice.

Comments/Queries: The legislative aim to require assessments for high-risk processing remains, in substance.

I suspect the ICO's list of high-risk processing will include the Art.35(3) types! In which case, little difference in practice, but more flexibility.

Oddly, there's no explicit power for the ICO to publish lists of activities that are not considered to require assessment as high-risk.

Senior responsible individual

Cf 2022 version: To be designated by public bodies, or where processing is likely to be high-risk, but note the amended Art.57(1)(k) regarding an ICO list to be published of what's high-risk processing for this purpose.

(The ICO's "high-risk" lists could theoretically be different for SRI, high-risk assessment and ROPA purposes, but they may not be - consistency will be helpful here.)

Cf EU GDPR: No more Art.37(1)(b)-(c) criteria deeming certain types of processing always to require a DPO (core activities involving large-scale regular and systematic monitoring, or processing of special category/criminal-related data).

The individual must be part of the organisation's senior management, which arguably goes beyond GDPR. Allowing job-sharing here is enlightened. SRI details must be notified to the ICO.

However, there's no longer any "sharing" of the SRI allowed across different public authorities or a related group.

Comments/Queries: Given that the SRI must be designated in high-risk processing situations, and issues like resourcing and conflicts are clearly covered, is there much difference in practice?

Again, I suspect the ICO's list of high-risk processing here will include the Art.37(1)(b) and (c) types! In which case, again, little difference in practice, but more flexibility.

No SRI sharing could cause practical problems given the difficulties with recruiting people with data protection expertise!

"Outsourcing" of SRI functions might perhaps still be possible, as the SRI can alternatively "secure" that certain tasks are performed by another, taking into account expertise etc. Probably SRIs without sufficient privacy expertise (yet!) will have to secure another person (which doesn't seem limited to internal staff) to perform at least some tasks.
Transfers (data exports)

Cf 2022 version: New transitional provisions to "grandfather" valid transfer mechanisms in place before the relevant Bill provisions take effect.

Cf EU GDPR: Comparing the transfers provisions generally, e.g. "not materially lower" vs "essentially equivalent", merits a note in itself, and will not be discussed here!

Comments/Queries: Not discussed here. And it will be up to the European Commission to assess the extent to which these and other changes may affect UK adequacy!


But why didn't they do that?

While the following are points where the 2022 and 2023 versions of the Bill don't differ, some queries spring to mind:

  1. Research processing of special category/criminal-related data - under DPA2018 Sch.1 para.4, such processing is permitted if it's necessary for archiving purposes, scientific or historical research purposes or statistical purposes, is carried out in accordance with Article 89(1) [to be the new Art.84B i.e. safeguards], and is in the public interest. Here, the UK went beyond GDPR, because the "public interest" requirement doesn't appear in Art.9(2)(j). National law permitting such processing just has to be "proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject". Presumably it's a UK policy decision to require the "public interest" condition in addition? If so, giving examples or defining "public interest" here would be helpful as it's such a vague and broad term.

  2. AI bias and anti-discrimination - the June 2022 consultation response intended to expand the DPA2018 Sch.1 para.8 exemption, allowing processing of special category data and criminal offence-related data for equality of opportunity or treatment, to permit bias monitoring, detection and correction in AI systems. Surely this is a laudable aim that no one should object to, so it's not clear why this update didn't make it into the Bill.

  3. PECR/cookies
    1. Security - the Bill will allow storage/access to ensure security of the terminal equipment, but why not security of networks/data more broadly given the critical importance of security generally?
    2. Analytics - the Bill would allow first-party analytics, but it seems not the use of a third-party analytics service, as sharing with third parties is allowed only to enable them to "assist with making improvements to the service or website" - why not also to enable them to assist with collecting that information? SMEs in particular won't have the technical expertise to install their own in-house analytics solutions, so not including "or collecting that information" there may undermine the legislative objective of easing web/mobile analytics for organisations.

BTW, on DSARs' change from "manifestly unfounded or excessive" to "vexatious or excessive" - the latter phrase has been much discussed in the UK (including at regulatory and judicial level) in the FOI (freedom of information) context, and is therefore well understood. See also the discussion on this on LinkedIn, in the comments section.

Interestingly, the first version of the press release said "Ministers have co-designed the Bill with key industry and privacy partners - including Which? and TechUK..." but the current press release no longer mentions Which?. Input from consumer organisations is obviously important in this context.

Will compliance with the EU GDPR really comply with the new Bill?

I spotted one minor example where strictly, it won't.

Privacy notices will have to include info about the right to complain to the controller, under the Bill. GDPR privacy notices needn't.

But, as per statements at the IAPP UK Intensive on 8 Mar 23, it's very unlikely that the ICO would fine or enforce against Cs lacking that one line (it'll just say, add that in)! And obviously including that extra info won't cause any issues under the EU GDPR.