Mastodon Kuan0: 2023

Wednesday, 23 August 2023

Age assurance/verification technologies & privacy/data protection

Key ICO resources and UK info/standards on age checking/assurance & the Children's Code are below.

ICO work to date on children's privacy and age estimation/verification:

Also relevant:

Added May/June 2024:

Sunday, 23 July 2023

Windows: try local LLMs easily

1. Download koboldcpp.exe from https://github.com/LostRuins/koboldcpp/releases (I picked the latest version)

2. Download the GGML BIN file for the model(s) you want to use - you can get Llama 2 models from https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML. Check which large language models (LLMs) are compatible with Kobold, then go to the Files tab to find and download the one(s) you want.


3. For command line avoiders, just double-click koboldcpp.exe. A command-line window and a GUI window open up

4. In the GUI window, click Model then Browse, select one of the downloaded GGML BIN files, then click Launch

5. A new tab should open in your default browser; if not, just open a tab yourself, go to http://localhost:5001/, and prompt away!


6. The command line window stays open, with info on the input prompts, output, processing time etc. Just close it and the browser tab when done. All data, inputs and outputs included, stays local to your computer.
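Once the server is running, you can also script prompts against it instead of using the browser tab. Below is a minimal sketch, assuming koboldcpp's KoboldAI-compatible HTTP endpoint /api/v1/generate on the default port 5001 and a response shaped like {"results": [{"text": "..."}]} - check your koboldcpp version's API docs, as field names may differ between releases.

```python
# Minimal sketch: query a locally running koboldcpp instance over HTTP.
# Endpoint path, port and response shape are assumptions - verify against
# your koboldcpp version's API documentation.
import json
import urllib.request


def build_payload(prompt, max_length=80, temperature=0.7):
    """Assemble the JSON body the generate endpoint expects (assumed fields)."""
    return {
        "prompt": prompt,
        "max_length": max_length,
        "temperature": temperature,
    }


def generate(prompt, host="http://localhost:5001"):
    """Send the prompt to the local model; everything stays on your machine."""
    req = urllib.request.Request(
        host + "/api/v1/generate",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Assumed response shape: {"results": [{"text": "..."}]}
    return body["results"][0]["text"]


# Example usage (with koboldcpp already running and a model loaded):
#   print(generate("Explain the GDPR in one sentence:"))
```

Because the whole exchange goes to localhost, nothing leaves your computer, consistent with the point above about data staying local.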

NB: you need a lot of RAM, especially for the bigger models.

Thanks to Autumn Skerritt's helpful blog (which also covers Mac & Linux and has other useful info) - I just added info on the GUI and other possible downloadable models I found.

Monday, 13 March 2023

Data Protection & Digital Information (No.2) Bill - key changes from 2022 Bill No.1; GDPR comparisons

The UK Data Protection & Digital Information (No.2) Bill's key changes from the 2022 Bill, compared with the EU GDPR, are summarised in the table below.

After the table are some "But why didn't they do that?" questions, and "Will compliance with the EU GDPR really comply with the new Bill?"

Table of Key Changes

  • Only changes from the 2022 version are covered, and only those relating to GDPR (not law enforcement or intelligence services or the DVS trust framework).
  • Clarifications/typos/minor corrections and other minor textual changes are not covered.
  • The table below is also not a full comparison of the entire Bill against the EU GDPR.

Abbreviations

ADM - automated decision-making
C - controller
ICO - UK Information Commissioner's Office
P - processor
PD - personal data
S - UK Secretary of State
SRI - senior responsible individual

Issue | Cf 2022 version | Cf EU GDPR | Comments/Queries
Personal data Tighter, as this specifically calls out the role of access protection measures.
It’s PD if C/P knows/ought reasonably to know another person obtains/is likely to obtain info as result of C/P processing and the individual is identifiable/likely to be identifiable by that person at the time of processing, (added) including if an unauthorised person obtains info due to the C/P not implementing appropriate measures to mitigate the risk of their obtaining the info.
Clarifies: identifiability is assessed at the time of processing by C/P.

Focuses on whether info is PD in the hands of whoever processes it (similar to the position under DPA 1998).
Time of processing - time of whose processing, processing by C, P, either, the other person?

If an individual is identifiable to C but not P, or vice versa, does that make them identifiable to both?

Why not also mention measures to mitigate the risk of unauthorised persons identifying individuals (e.g. strong encryption), vs. their obtaining the info? Surely such measures are equally important: focus on either/or, not just “obtaining”?
Legitimate interests New Art.6(9) gives examples of types of processing that may be necessary for LI:
- Necessary for direct marketing (defined in both versions as communication (by whatever means) of advertising or marketing material which is directed to particular individuals, and now also to be inserted into Art.4(1)(15A) UK GDPR),
- Intragroup transmission necessary for internal admin, or
- Necessary for security of network and info systems
Much has been made of this. But actually it's just based on the last sentence of GDPR Rec.47, plus Recs.48 & 49, putting them into the operative text - just without the "strictly necessary", which in my view is very tight, particularly in relation to ensuring security.

However, "direct marketing" is defined more broadly than in say the European Commission and Council's approach in the draft ePrivacy Regulation - could it include targeted advertising on websites or mobile apps here? 
Pity that necessity for preventing fraud (Rec.47) wasn't included, or necessity for the security of PD (not just systems).

The scope of "direct marketing" would benefit from clarification, e.g. is "sent" intended or is displaying personalised ads on web/mobile enough to be "direct marketing"?


Scientific research Clarified:
- Even commercial activity can be scientific research
But activities only qualify if they can “reasonably be described as scientific”
GDPR doesn’t define scientific research. The Bill just provides helpful clarifications, e.g. drawing on Rec.159 (GDPR doesn’t explicitly exclude commercial research and Art.89 of course requires safeguards there, which the Bill is changing). Processing PD for studies in the area of public health is “scientific” only if conducted in the “public interest” – clarify “public interest” here? But generally that phrase isn’t defined anywhere… and see queries after this table.
Statistical purposes Includes processing for statistical surveys or production of statistical results resulting in aggregate non-personal data, but (added) only if the controller doesn’t use the personal data processed, or the resulting information, to support measures/decisions regarding a particular data subject to whom the personal data relates Just clarifications, reflecting Rec.162.
ADM Art.22A(2) no longer states that decisions include profiling.  I consider this to now reflect the correct interpretation, rather than a relaxation - see the next cell.

Instead, when considering whether there's meaningful human involvement, the extent to which the decision was reached by profiling must be considered among other things. That's one way to interpret the profiling reference in Art.22 and it makes some sense.

S may make regulations stipulating that certain cases do, or don't, have meaningful human involvement.
Clarifies the debated issue of whether Art.22 only gives rights to data subjects to object to ADM, or positively prohibits ADM.

Clarifies that decisions “based solely on automated processing” are those with “no meaningful human involvement”.

Clarifies role of profiling, in the debate on whether Art.22 catches profiling per se, or only profiling that leads to ADM (I believe the latter). So, Art.22A(2) now reflects what I feel is the correct interpretation.

A positive prohibition usefully clarifies the position. Similarly with the meaning of automated decisions.

Data subjects aren't deprived of rights regarding ADM, because the new Art.22C safeguards must enable data subjects to obtain human intervention and to contest decisions, and individuals can no doubt claim compensation for breach of this explicit prohibition. 

However, it's unclear why Sch.4 will omit s.14 DPA2018 altogether. Removing the notification requirement may reduce burdens on Cs, but retaining a positive obligation on Cs to consider requests to reconsider decisions could further help to show that data subjects do retain their ADM rights. Perhaps S regulations are intended to address this and other ADM-related issues?
ROPAs (records of processing activities) Needed only for processing which, taking into account its nature, scope, context and purposes, is likely to result in a high risk to the rights and freedoms of individuals - instead of the 2022 exemption for <250 employees unless likely to result in high risk

C records need only include categories of person with whom C shares PD, rather than named persons. However, "recipients" has been changed to "persons" in relation to third countries/international organisations.

Amends Art.57(1)(k) to require the ICO to produce and publish a document containing examples of types of processing which it considers are likely to result in a high risk to the rights and freedoms of individuals (for the purposes of Articles 27A, 30A and 35) - i.e., senior responsible individuals, ROPAs and assessments of high-risk processing. This helps ensure a consistent view of what is considered "high-risk" across these different areas.
Required for all Cs and Ps, with an exemption for <250 employees unless processing is likely to result in a risk to rights and freedoms of data subjects, is not occasional, or includes special category or criminal-related data.

Changing "recipients" to "persons" actually goes broader than GDPR, as under GDPR Art.4(9) certain public authorities (again it's not entirely clear which) aren't considered "recipients", so this should be positive for UK adequacy as any sharing with public authorities must definitely be recorded.
Arguably, catching C/Ps even with <250 employees for high-risk processing would catch non-occasional processing of special category or criminal-related data.

While it's "high-risk" vs. "a risk", the latter catches most C/Ps; some might say it's too strict given realistic risks, especially under the EDPB's broad interpretation of Art.30(5)'s "or". So the Bill is less strict than GDPR, but hopefully that's not significant enough to prejudice UK adequacy.

It's odd that the "categories" issue relates to C records (Cs will surely know those they share PD with), rather than DSARs/privacy notices - could the change have been intended for the latter, but inadvertently got inserted here instead?
DPIAs (assessment of high-risk processing) Deleted ICO's Art.35(4)-(5) obligation to publish list of operations requiring DPIA and power to publish list of operations not requiring assessment.

But, see above on the amended Art.57(1)(k) which effectively does the same thing, except that there's no longer power to publish lists of operations not requiring assessment.
No explicit requirement to consult DPO. However, arguably this is implicit in new Art.27B(2)(c), informing/advising of data protection obligations.

No Art.35(3) criteria deeming certain types of processing always to be high risk (ADM, large-scale processing of special category/criminal-related data and large-scale systematic monitoring of publicly accessible areas!)

The related Art.36 makes prior consultation with ICO optional, but see LinkedIn discussion in comments on whether this makes much difference in practice.

The legislative aim to require assessments for high-risk processing remains, in substance.

I suspect the ICO's list of high-risk processing will include the Art.35(3) types! In which case, little difference in practice, but more flexibility.

Oddly, there's no explicit power for the ICO to publish lists of activities that are not considered to require assessment as high-risk.

Senior responsible individual To be designated by a public body, or where processing is likely to be high-risk, but note the amended Art.57(1)(k) regarding an ICO list to be published of what's high-risk processing for this purpose.

(The ICO's "high-risk" lists could theoretically be different for SRI, high-risk assessments and ROPA purposes, but they may not be - consistency will be helpful here.)
No more Art.37(1)(b)-(c) criteria deeming certain types of processing always to require a DPO (core activities involve large-scale regular and systematic monitoring or processing special category/criminal-related data).

The individual must be part of the organisation’s senior management which arguably goes beyond GDPR. Allowing job-sharing here is enlightened. SRI details must be notified to the ICO.

However, there's no longer any "sharing" of the SRI allowed across different public authorities or a related group.
Given the SRI must be designated in high-risk processing situations, and issues like resourcing and conflicts are clearly covered, is there much difference in practice?

Again, I suspect the ICO's list of high-risk processing here will include the Art.37(1)(b) and (c) types! In which case, again, little difference in practice, but more flexibility.

No SRI sharing could cause practical problems given the difficulties with recruiting people with data protection expertise!

"Outsourcing" of SRI functions might perhaps still be possible as the SRI can alternatively "secure" that certain tasks are performed by another, taking into account expertise etc. Probably SRIs without sufficient privacy expertise (yet!) will have to secure another person (which doesn't seem limited to internal staff) to perform at least some tasks.
Transfers (data exports) New transitional provisions to "grandfather" valid transfer mechanisms in place before the relevant Bill provisions take effect. Comparing the transfers provisions generally, e.g. "not materially lower" vs "essentially equivalent", merits a note in itself, and will not be discussed here! And it will be up to the European Commission to assess the extent to which these and other changes may affect UK adequacy!


But why didn't they do that?

While the following are points where the 2022 and 2023 versions of the Bill don't differ, some queries spring to mind:

  1. Research processing of special category/criminal-related data - under DPA2018 Sch.1 para.4, such processing is permitted if it's necessary for archiving purposes, scientific or historical research purposes or statistical purposes, is carried out in accordance with Article 89(1) [to be the new Art.84B i.e. safeguards], and is in the public interest. Here, the UK went beyond GDPR, because the "public interest" requirement doesn't appear in Art.9(2)(j). National law permitting such processing just has to be "proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject". Presumably it's a UK policy decision to require the "public interest" condition in addition? If so, giving examples or defining "public interest" here would be helpful as it's such a vague and broad term.

  2. AI bias and anti-discrimination - the June 2022 consultation response intended to expand the DPA2018 sch.1 para.8 exemption, allowing processing of special category data and criminal offence-related data for equality of opportunity or treatment, to permit bias monitoring, detection and correction in AI systems. Surely this is a laudable aim that no one should object to, so it's not clear why this update didn't make it into the Bill?

  3. PECR/cookies
    1. Security - the Bill will allow storage/access to ensure security of the terminal equipment, but why not security of networks/data more broadly given the critical importance of security generally?
    2. Analytics - the Bill would allow first party analytics, but it seems not the use of a third party analytics service, as sharing with third parties is allowed only to enable them to "assist with making improvements to the service or website" - why not also to enable them to assist with collecting that information? SMEs in particular won't have the technical expertise to install their own on-premises, in-house analytics solutions, so not including "or collecting that information" there may undermine the legislative objective of easing web/mobile analytics for organisations.

BTW, on DSARs' change from "manifestly unfounded or excessive" to "vexatious or excessive" - the latter phrase has been much discussed (including at regulatory and judicial level), and therefore is well understood in the UK, in the FOI (freedom of information) context. See also the discussion on this in LinkedIn, in the comments section.

Interestingly, the first version of the press release said "Ministers have co-designed the Bill with key industry and privacy partners - including Which? and TechUK..." but the current press release no longer mentions Which?. Input from consumer organisations is obviously important in this context.

Will compliance with the EU GDPR really comply with the new Bill?

I spotted one minor example where strictly, it won't.

Privacy notices will have to include info about the right to complain to the controller, under the Bill. GDPR privacy notices needn't.

But, as per statements at the IAPP UK Intensive on 8 Mar 23, it's very unlikely that the ICO would fine or enforce against Cs lacking that one line (it'll just say, add that in)! And obviously including that extra info won't cause any issues under the EU GDPR.

Friday, 24 February 2023

Key points: EDPB transfers & territorial scope final guidance

We now have the final version of the EDPB's Guidelines 05/2021 on the Interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR.

1. Generally, it makes useful clarifications to the draft guidance, rather than substantive changes. There are 5 extra examples, a new Annex with diagrams for all examples, and a new Exec Summary. Maria and George remain the same (not Alice or Bob!), but specific third-country names were removed.

2. Most clarifications aren’t surprising e.g. remote viewing/access of/to EEA-hosted personal data from outside EEA whether for support/admin etc. is a “transfer”, including by a processor; EEA platform passing personal data to non-EEA controller is making a “transfer” (“controller” seems a misnomer if the non-EEA entity isn’t subject to GDPR, but the platform is making a transfer whether it is or isn’t)

3. Helpful: controller disclosing personal data to EEA-incorporated processor (with non-EEA parent) – not a “transfer”. If processor discloses to third-country authority, it does so as independent controller. So controllers must assess circumstances for sufficient guarantees before engaging such processors.

4. Also helpful: 

  • when data subjects directly provide personal data to third country controller not subject to GDPR, that’s not a transfer
  • when data subjects directly provide personal data to third country controller that IS subject to GDPR under Art.3(2) offering/monitoring (added: “specifically targets the EU market”), that’s not a transfer but the controller must comply with GDPR (practical enforceability against it is a different issue of course)
  • when data subjects directly provide personal data to third country processor for third country controller, they don’t make transfers, but the controller “transfers” to the processor

5. Note: still not a transfer if EEA company employee travels to third country with laptop or remotely accesses EEA-hosted data – it’s within the same entity. New: if the employee in his capacity as such sends or makes available data to another entity in the third country, then that’s a transfer by the company.

6. Non-“transfers”:

  • New section on safeguards when processing personal data outside the EEA even if technically there’s no “transfer”. Pay “particular attention” to the third country’s legal framework, as there may still be “increased risks” because “it takes place outside the EU, for example due to conflicting national laws or disproportionate government access in a third country”. These risks must be considered for compliance e.g. Art.5 principles, 24 controller responsibility, 32 security, 35 DPIA, 48 transfers not authorised under EU law: “a controller may very well conclude that extensive security measures are needed – or even that it would not be lawful – to conduct or proceed with a specific processing operation in a third country although there is no transfer situation.”
  • Privacy notices for non-transfers outside EEA!: when a controller intends to process personal data outside the EU (although no transfer takes place), this information should as a rule be provided to individuals as part of the controller’s transparency obligations, e.g. “to ensure compliance with the principle of transparency and fairness, which also requires controllers to inform individuals of the risks in relation to the processing”. Non-binding, strictly…

7. Still unaddressed:

  • Not a “transfer” if it’s within the same legal entity, so e.g. EEA branch of US corp sending personal data to HQ isn't making a transfer, but an EEA subsidiary sending to US parent IS. Obviously the EEA branch would be subject to GDPR, with easy enforceability due to its EEA presence.
  • Art.3(1) can apply directly to non-EEA “established” entities e.g. in the Costeja case, but EDPB focuses mainly on 3(2), mentioning 3(1) only in relation to processors used by EEA-established controllers. Presumably direct provision of personal data by data subjects to Art.3(1) non-EEA controllers would also not be “transfers”, but the controller is caught by GDPR? (practical enforceability…?)
  • EEA subprocessor to non-EEA processor – by analogy with processor-to-controller transmissions, this must be a “transfer”, but no SCCs exist to allow this… (workaround – adapt P2C SCCs, hey we tried our best!)
  • The “conflicting laws” issue applies equally to EEA-established organisations that expand to third countries. Remember SWIFT, where using its own US data centre was a “transfer”? Presumably now that use alone is not a “transfer”, but disclosure to third-country entities would be.

8. My speculations about possible new options for non-EEA controllers: 

  • will some non-EEA controllers just directly collect personal data from EEA data subjects now? They may still be subject to GDPR under Art.3(2) or even 3(1), but practical enforceability…
  • will some non-EEA groups set up non-EEA subsidiaries to operate branches in the EEA, that can send data “back” outside the EEA without making “transfers”? Of course, those subsidiaries are subject to GDPR, and their disclosure to non-EEA parents will be “onward transfers” that need SCCs etc, but that might be easier for some…

9. Puzzling: most of us share common views on what “made available” involves, but I didn’t follow “embedding a hard drive or submitting a password to a file” – what does that mean, how do they involve “making available” data?