

Wednesday 23 August 2023

Age assurance/verification technologies & privacy/data protection

Key ICO resources and UK info/standards on age checking/assurance & the Children's Code are below.

ICO work to date on children's privacy and age estimation/verification:

Also relevant:


Sunday 23 July 2023

Windows: try local LLMs easily

1. Download koboldcpp.exe from https://github.com/LostRuins/koboldcpp/releases (I picked the latest version)

2. Download the GGML BIN file for the model(s) you want to use - you can get Llama 2 models from https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML. Check which large language models/LLMs are compatible with Kobold, then go to the Files tab to find and download the one(s) you want.


3. For command line avoiders, just double-click koboldcpp.exe. A command-line interface window and a GUI window will open.



4. In the GUI window, click Model, then Browse, select one of the downloaded GGML BIN files, then click Launch



5. A new tab should open in your default browser; if not, just open a tab yourself and go to http://localhost:5001/ and prompt away!


6. The command-line window stays open, showing info on the input prompts, output, processing time etc. Just close it and the browser tab when done. All data (inputs, outputs etc.) stays local to your computer.
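The steps above can also be driven programmatically once KoboldCpp is running: it serves an HTTP API on the same local port as the GUI. The sketch below is a minimal, hypothetical example - the /api/v1/generate path, payload fields and response shape are assumptions based on the KoboldAI-style API, so check the KoboldCpp docs for your version before relying on them:

```python
# Minimal sketch: prompt a locally running KoboldCpp instance over HTTP.
# Assumptions (verify against your KoboldCpp version's documentation):
# default port 5001, a KoboldAI-style /api/v1/generate endpoint, and a
# response shaped like {"results": [{"text": ...}]}.
import json
import urllib.request

API_URL = "http://localhost:5001/api/v1/generate"

def build_payload(prompt: str, max_length: int = 80) -> bytes:
    """Encode the JSON request body; nothing leaves your machine."""
    return json.dumps({"prompt": prompt, "max_length": max_length}).encode("utf-8")

def ask(prompt: str) -> str:
    """Send a prompt to the local model and return the generated text."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["results"][0]["text"]

# Example (requires KoboldCpp to be running with a model loaded):
#   print(ask("Explain the UK Children's Code in one sentence."))
```

Because everything talks to localhost:5001, prompts and outputs stay on your own machine, just as with the browser GUI.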

NB. you need a lot of RAM, especially for the bigger models.

Thanks to Autumn Skerritt's helpful blog (which also covers Mac & Linux and has other useful info) - I just added info on the GUI and other possible downloadable models I found.




Monday 13 March 2023

Data Protection & Digital Information (No.2) Bill - key changes from 2022 Bill No.1; GDPR comparisons

The UK Data Protection & Digital Information (No.2) Bill's key changes from the 2022 Bill, compared with the EU GDPR, are summarised in the table below.

After the table are some "But why didn't they do that?" questions, and "Will compliance with the EU GDPR really comply with the new Bill?"

Table of Key Changes

  • Only changes from the 2022 version are covered, and only those relating to GDPR (not law enforcement or intelligence services or the DVS trust framework).
  • Clarifications/typos/minor corrections and other minor textual changes are not covered.
  • The table below is also not a full comparison of the entire Bill against the EU GDPR.

Abbreviations

ADM: automated decision-making
C: controller
ICO: UK Information Commissioner's Office
P: processor
PD: personal data
S: UK Secretary of State
SRI: senior responsible individual

Issue | Cf 2022 version | Cf EU GDPR | Comments/Queries
Personal data Tighter, as this specifically calls out the role of access protection measures.
It’s PD if C/P knows/ought reasonably to know another person obtains/is likely to obtain info as result of C/P processing and the individual is identifiable/likely to be identifiable by that person at the time of processing, (added) including if an unauthorised person obtains info due to the C/P not implementing appropriate measures to mitigate the risk of their obtaining the info.
Clarifies: identifiability  is assessed at the time of processing by C/P.

Focuses on whether info is PD in the hands of whoever processes it (similar to the position under DPA 1998).
Time of processing: whose processing is meant? Processing by C, P, either, or the other person?

If an individual is identifiable to C but not P, or vice versa, does that make them identifiable to both?

Why not also mention measures to mitigate the risk of unauthorised persons identifying individuals (e.g. strong encryption), vs. their obtaining the info? Surely such measures are equally important: focus on either/or, not just “obtaining”?
Legitimate interests New Art.6(9) gives examples of types of processing that may be necessary for LI:
- Necessary for direct marketing (defined in both versions as communication (by whatever means) of advertising or marketing material which is directed to particular individuals, and now also to be inserted as Art.4(1)(15A) UK GDPR),
- Intragroup transmission necessary for internal admin, or
- Necessary for security of network and info systems
Much has been made of this. But actually it's just based on the last sentence of GDPR Rec.47, plus Recs.48 & 49, putting them into the operative text, just without the "strictly necessary", which in my view is very tight, particularly in relation to ensuring security.

However, "direct marketing" is defined more broadly than in, say, the European Commission's and Council's approach in the draft ePrivacy Regulation - could it include targeted advertising on websites or mobile apps here?
Pity that necessity for preventing fraud (Rec.47) wasn’t included, or necessity for the security of PD (not just systems).

The scope of "direct marketing" would benefit from clarification, e.g. is "sent" intended or is displaying personalised ads on web/mobile enough to be "direct marketing"?


Scientific research Clarified:
- Even commercial activity can be scientific research
But activities only qualify if they can be “reasonably described as scientific”.
GDPR doesn’t define scientific research. The Bill just provides helpful clarifications, e.g. drawing on Rec.159 (GDPR doesn’t explicitly exclude commercial research, and Art.89 of course requires safeguards there, which the Bill is changing). Processing PD for studies in the area of public health is “scientific” only if conducted in the “public interest” – clarify “public interest” here? But generally that phrase isn’t defined anywhere… and see queries after this table.
Statistical purposes Includes processing for statistical surveys or production of statistical results resulting in aggregate non-personal data, but (added) only if the controller doesn’t use the personal data processed, or the resulting information, to support measures/decisions regarding a particular data subject to whom the personal data relates. Just clarifications, reflecting Rec.162.
ADM Art.22A(2) no longer states that decisions include profiling.  I consider this to now reflect the correct interpretation, rather than a relaxation - see the next cell.

Instead, when considering whether there's meaningful human involvement, the extent to which the decision was reached by profiling must be considered among other things. That's one way to interpret the profiling reference in Art.22 and it makes some sense.

S may make regulations stipulating that certain cases do, or don't, have meaningful human involvement.
Clarifies the debated issue of whether Art.22 only gives rights to data subjects to object to ADM, or positively prohibits ADM.

Clarifies that decisions “based solely on automated processing” are those with “no meaningful human involvement”.

Clarifies role of profiling, in the debate on whether Art.22 catches profiling per se, or only profiling that leads to ADM (I believe the latter). So, Art.22A(2) now reflects what I feel is the correct interpretation.

A positive prohibition usefully clarifies the position. Similarly with the meaning of automated decisions.

Data subjects aren't deprived of rights regarding ADM, because the new Art.22C safeguards must enable data subjects to obtain human intervention and to contest decisions, and individuals can no doubt claim compensation for breach of this explicit prohibition. 

However, it's unclear why Sch.4 will omit s.14 DPA2018 altogether. Removing the notification requirement may reduce burdens on Cs, but retaining a positive obligation on Cs to consider requests to reconsider decisions could further help to show that data subjects do retain their ADM rights. Perhaps S regulations are intended to address this and other ADM-related issues?
ROPAs (records of processing activities) Needed only for processing which, taking into account its nature, scope, context and purposes, is likely to result in a high risk to the rights and freedoms of individuals - instead of the 2022 exemption for <250 employees unless likely to result in high risk

C records need include only categories of person with whom C shares PD, rather than named persons. However, "recipients" there has been changed to "persons" in third countries/international organisations.

Amends Art.57(1)(k) to require the ICO to produce and publish a document containing examples of types of processing which it considers are likely to result in a high risk to the rights and freedoms of individuals (for the purposes of Articles 27A, 30A and 35) - i.e. senior responsible individuals, ROPAs and assessment of high-risk processing. This helps ensure a consistent view of what is considered "high-risk" across these different areas.
Required for all Cs and Ps, with an exemption for <250 employees unless processing is likely to result in a risk to rights and freedoms of data subjects, is not occasional, or includes special category or criminal-related data.

Changing "recipients" to "persons" actually goes broader than GDPR, as under GDPR Art.4(9) certain public authorities (again it's not entirely clear which) aren't considered "recipients", so this should be positive for UK adequacy as any sharing with public authorities must definitely be recorded.
Arguably, catching C/Ps even with <250 employees for high-risk processing would catch non-occasional processing of special category or criminal-related data.

While it's "high-risk" vs. "a risk", the latter catches most C/Ps; some might say it's too strict given realistic risks, especially under the EDPB's broad interpretation of Art.30(5)'s "or". So the Bill is less strict than GDPR, but hopefully that's not significant enough to prejudice UK adequacy.

It's odd that the "categories" issue relates to C records (Cs will surely know those they share PD with), rather than DSARs/privacy notices - could the change have been intended for the latter, but inadvertently got inserted here instead?
DPIAs (assessment of high-risk processing) Deleted ICO's Art.35(4)-(5) obligation to publish list of operations requiring DPIA and power to publish list of operations not requiring assessment.

But, see above on the amended Art.57(1)(k) which effectively does the same thing, except that there's no longer power to publish lists of operations not requiring assessment.
No explicit requirement to consult DPO. However, arguably this is implicit in new Art.27B(2)(c), informing/advising of data protection obligations.

No Art.35(3) criteria deeming certain types of processing always to be high risk (ADM, large-scale processing of special category/criminal-related data and large-scale systematic monitoring of publicly accessible areas!)

The related Art.36 makes prior consultation with ICO optional, but see LinkedIn discussion in comments on whether this makes much difference in practice.

The legislative aim to require assessments for high-risk processing remains, in substance.

I suspect the ICO's list of high-risk processing will include the Art.35(3) types! In which case, little difference in practice, but more flexibility.

Oddly, there's no explicit power for the ICO to publish lists of activities that are not considered to require assessment as high-risk.

Senior responsible individual To be designated by public bodies or for likely high-risk processing, but note the amended Art.57(1)(k) regarding an ICO list to be published of what's high-risk processing for this purpose.

(The ICO's "high-risk" lists could theoretically be different for SRI, high-risk assessments and ROPA purposes, but they may not be - consistency will be helpful here.)
No more Art.37(1)(b)-(c) criteria deeming certain types of processing always to require a DPO (core activities involve large-scale regular and systematic monitoring or processing special category/criminal-related data).

The individual must be part of the organisation’s senior management which arguably goes beyond GDPR. Allowing job-sharing here is enlightened. SRI details must be notified to the ICO.

However, there's no longer any "sharing" allowed of the SRI across different public authorities or a related group.
Given the SRI must be designated in high-risk processing situations, and issues like resourcing and conflicts are clearly covered, is there much difference in practice?

Again, I suspect the ICO's list of high-risk processing here will include the Art.37(1)(b) and (c) types! In which case, again, little difference in practice, but more flexibility.

No SRI sharing could cause practical problems given the difficulties with recruiting people with data protection expertise!

"Outsourcing" of SRI functions might perhaps still be possible as the SRI can alternatively "secure" that certain tasks are performed by another, taking into account expertise etc. Probably SRIs without sufficient privacy expertise (yet!) will have to secure another person (which doesn't seem limited to internal staff) to perform at least some tasks.
Transfers (data exports) New transitional provisions to "grandfather" valid transfer mechanisms in place before the relevant Bill provisions take effect. Comparing the transfers provisions generally, e.g. "not materially lower" vs "essentially equivalent", merits a note in itself, and will not be discussed here! Not discussed here. And it will be up to the European Commission to assess the extent to which these and other changes may affect UK adequacy!


But why didn't they do that?

While the following are points where the 2022 and 2023 versions of the Bill don't differ, some queries spring to mind:

  1. Research processing of special category/criminal-related data - under DPA2018 Sch.1 para.4, such processing is permitted if it's necessary for archiving purposes, scientific or historical research purposes or statistical purposes, is carried out in accordance with Article 89(1) [to be the new Art.84B i.e. safeguards], and is in the public interest. Here, the UK went beyond GDPR, because the "public interest" requirement doesn't appear in Art.9(2)(j). National law permitting such processing just has to be "proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject". Presumably it's a UK policy decision to require the "public interest" condition in addition? If so, giving examples or defining "public interest" here would be helpful as it's such a vague and broad term.

  2. AI bias and anti-discrimination - the June 2022 consultation response intended to expand the DPA2018 Sch.1 para.8 exemption, allowing processing of special category data and criminal offence-related data for equality of opportunity or treatment, to permit bias monitoring, detection and correction in AI systems. Surely this is a laudable aim that no one should object to, so it's not clear why this update didn't make it into the Bill.

  3. PECR/cookies
    1. Security - the Bill will allow storage/access to ensure security of the terminal equipment, but why not security of networks/data more broadly given the critical importance of security generally?
    2. Analytics - the Bill would allow first-party analytics, but seemingly not the use of a third-party analytics service, as sharing with third parties is allowed only to enable them to "assist with making improvements to the service or website" - why not also to enable them to assist with collecting that information? SMEs in particular won't have the technical expertise to install their own on-premises, in-house analytics solutions, so not including "or collecting that information" there may undermine the legislative objective of easing web/mobile analytics for organisations.

BTW, on DSARs' change from "manifestly unfounded or excessive" to "vexatious or excessive" - the latter phrase has been much discussed (including at regulatory and judicial level) in the FOI (freedom of information) context, and is therefore well understood in the UK. See also the discussion on this on LinkedIn, in the comments section.

Interestingly, the first version of the press release said "Ministers have co-designed the Bill with key industry and privacy partners - including Which? and TechUK..." but the current press release no longer mentions Which?. Input from consumer organisations is obviously important in this context.

Will compliance with the EU GDPR really comply with the new Bill?

I spotted one minor example where strictly, it won't.

Privacy notices will have to include info about the right to complain to the controller, under the Bill. GDPR privacy notices needn't.

But, as per statements at the IAPP UK Intensive on 8 Mar 23, it's very unlikely that the ICO would fine or enforce against Cs lacking that one line (it'll just say, add that in)! And obviously including that extra info won't cause any issues under the EU GDPR.

Friday 24 February 2023

Key points: EDPB transfers & territorial scope final guidance

We now have the final version of the EDPB's Guidelines 05/2021 on the Interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR.

1. Generally, it makes useful clarifications to the draft guidance, rather than substantive changes. There are 5 extra examples, a new Annex with diagrams for all examples, and a new Executive Summary. Maria and George remain the same (not Alice or Bob!), but specific third-country names were removed.

2. Most clarifications aren’t surprising e.g. remote viewing/access of/to EEA-hosted personal data from outside EEA whether for support/admin etc. is a “transfer”, including by a processor; EEA platform passing personal data to non-EEA controller is making a “transfer” (“controller” seems a misnomer if the non-EEA entity isn’t subject to GDPR, but the platform is making a transfer whether it is or isn’t)

3. Helpful: controller disclosing personal data to EEA-incorporated processor (with non-EEA parent) – not a “transfer”. If processor discloses to third-country authority, it does so as independent controller. So controllers must assess circumstances for sufficient guarantees before engaging such processors.

4. Also helpful: 

  • when data subjects directly provide personal data to third country controller not subject to GDPR, that’s not a transfer
  • when data subjects directly provide personal data to third country controller that IS subject to GDPR under Art.3(2) offering/monitoring (added: “specifically targets the EU market”), that’s not a transfer but the controller must comply with GDPR (practical enforceability against it is a different issue of course)
  • when data subjects directly provide personal data to third country processor for third country controller, they don’t make transfers, but the controller “transfers” to the processor

5. Note: still not a transfer if EEA company employee travels to third country with laptop or remotely accesses EEA-hosted data – it’s within the same entity. New: if the employee in his capacity as such sends or makes available data to another entity in the third country, then that’s a transfer by the company.

6. Non-“transfers”:

  • New section on safeguards when  processing personal data outside the EEA even if technically there’s no “transfer”. Pay “particular attention” to the third country’s legal framework, as there may still be “increased risks” because “it takes place outside the EU, for example due to conflicting national laws or disproportionate government access in a third country”. These risks must be considered for compliance e.g. Art.5 principles, 24 controller responsibility, 32 security, 35 DPIA, 48 transfers not authorised under EU law: “a controller may very well conclude that extensive security measures are needed – or even that it would not be lawful – to conduct or proceed with a specific processing operation in a third country although there is no transfer situation.”
  • Privacy notices for non-transfers outside EEA!: when a controller intends to process personal data outside the EU (although no transfer takes place), “this information should as a rule be provided to individuals as part of the controller’s transparency obligations, e.g. to ensure compliance with the principle of transparency and fairness, which also requires controllers to inform individuals of the risks in relation to the processing”. Non-binding, strictly…

7. Still unaddressed:

  • Not a “transfer” if it’s within the same legal entity, so e.g. EEA branch of US corp sending personal data to HQ isn't making a transfer, but an EEA subsidiary sending to US parent IS. Obviously the EEA branch would be subject to GDPR, with easy enforceability due to its EEA presence.
  • Art.3(1) can apply directly to non-EEA “established” entities e.g. in the Costeja case, but EDPB focuses mainly on 3(2), mentioning 3(1) only in relation to processors used by EEA-established controllers. Presumably direct provision of personal data by data subjects to Art.3(1) non-EEA controllers would also not be “transfers”, but the controller is caught by GDPR? (practical enforceability…?)
  • EEA subprocessor to non-EEA processor – by analogy with processor-to-controller transmissions, this must be a “transfer”, but no SCCs exist to allow this… (workaround – adapt P2C SCCs, hey we tried our best!)
  • The “conflicting laws” issue applies equally to EEA-established organisations that expand to third countries. Remember SWIFT, where using its own US data centre was a “transfer”? Presumably now that use alone is not a “transfer”, but disclosure to third-country entities would be.

8. My speculations about possible new options for non-EEA controllers: 

  • will some non-EEA controllers just directly collect personal data from EEA data subjects now? They may still be subject to GDPR under Art.3(2) or even 3(1), but practical enforceability…
  • will some non-EEA groups set up non-EEA subsidiaries to operate branches in the EEA, that can send data “back” outside the EEA without making “transfers”? Of course, those subsidiaries are subject to GDPR, and their disclosure to non-EEA parents will be “onward transfers” that need SCCs etc, but that might be easier for some…

9. Puzzling: most of us share common views on what “made available” involves, but I didn’t follow “embedding a hard drive or submitting a password to a file” – what does that mean, how do they involve “making available” data?

Sunday 16 October 2022

Automated Decision Making (ADM) & GDPR - Flowchart

ADM under GDPR - I produced this flowchart after noticing that my Imperial AI MSc students were struggling to parse Art.22. Admittedly it's been termed the worst-drafted of all GDPR provisions, rightly, by someone I used to work with, who knows who she is :) I hope it will be useful, and as always all comments are welcome!

Saturday 9 July 2022

UK NIS Regulations: enforcement, & future

For both OESs and DSPs the UK NIS Regulations have barely been enforced, but change is coming,  including to bring MSPs within scope. (OESs are operators of essential services, basically critical infrastructure service providers, while DSPs are "digital service providers": cloud computing service providers, online marketplaces or online search engines only, not other providers of digital services in the broad sense). 

The Second Post-Implementation Review of the Network and Information Systems Regulations 2018 (PDF), 4 July 2022, revealed this and other interesting information:

  1. NIS incident reporting hasn't actually been happening: “…the system does not appear to be working. As of this review, competent authorities have received little-to-no reports, despite other sources of information, such as the Breaches Survey, indicating a prevalence of incidents within the wider economy and society.”

  2. NIS enforcement has been minimal; no NIS fines (penalty notices) have been imposed so far: 
    1. Only 2 competent authorities have enforced to date, which raises the question "is the enforcement regime appropriate?" But, “NCSC has also been informed of one very successful instance of a competent authority carrying out enforcement, which had very positive outcomes, suggesting that the enforcement regime may be appropriate."
      1. Note: it's unclear if the UK ICO, which regulates DSPs under the NIS Regulations, was one of those two authorities.
    2. “…there is evidence from competent authorities to suggest that there are cases where enforcement activities were merited but no action was taken. The use of enforcement tools overall, is much lower than the reported need and so far competent authorities appear to have been less inclined to make use of their regulatory powers." Why? The reasons are not stated.
    3. "There is also a reported concern from regulators that the grounds for enforcement (either via enforcement notices or penalty notices) is not clear enough”…
    4. “NIS competent authorities... have additionally reported being very restrictive with their regulatory powers, relying more on regular engagements, inspections, and information notices rather than any binding provisions of the regulations, such as enforcement notices, civil proceedings, or penalty notices.”
    5. "Of those who felt the enforcement regime wasn't proportionate, 44% gave other reasons including there is no clear link between the fine levied and the actions that operator of essential services took prior to the incident and the fact that fines result in double jeopardy as there is already a cost relating to a cyber breach."
      1. Note: it's interesting that the double jeopardy cited was not the possibility of fines under both GDPR and NIS, which is the key double jeopardy risk in my view (to be addressed in the EU's NIS 2 Directive). The breach costs point is, of course, also relevant to GDPR fines too, but cited only sometimes (in conjunction with remediation costs) in GDPR supervisory authority decisions.
    6. The only relevant DSPs who indicated the enforcement regime was not proportionate to the risk of disruption reported feeling that the Regulations were incorrectly applied to DSP organisations in general. (This I agree with, see later below.)
    7. DCMS will aim to collect annual data from the competent authorities e.g. the number of incidents per year, the number of independent audits of the Cyber Assessment Framework, the number of improvement plans as a result of the Cyber Assessment Framework, the number of information notices issued by the competent authorities, the number and nature of enforcement notices issued by competent authorities, and the number of organisations regulated by sector and also the number of SMEs regulated by sector.

  3. NIS Regs' Cyber Assessment Framework: this has allowed experts in competent authorities to review organisations' cyber security arrangements and ensure improvements are made. 67 known operators have received improvement plans (including updating legacy systems and software to reduce vulnerabilities), highlighting the Regulations' role in improving cyber security.
    1. Note: the reference was only to "operators". This suggests no DSPs were asked to make any improvements to their cybersecurity under NIS.

  4. NIS Regs generally: effective to drive good cyber security behaviours; "...strong indication that without NIS, cyber security improvements across essential services in the UK would proceed at a much slower pace. ...added benefit of covering a large number of sectors, which is expected to address some of the inconsistencies of managing risks to networks and information systems across sectors...". But, areas of improvement remain, thought to be most appropriately tackled through regulatory intervention, to strengthen and future-proof the regulatory framework.
    1. Other regulations or standards mentioned as drivers for improvements in cyber security included: UK General Data Protection Regulations (GDPR) (13 or 86% of relevant digital service providers, 68 or 78% of operators of essential services); ISO27001 (28% of operators of essential services); Cyber Essentials and Cyber Essentials Plus (11% of operators of essential services); as well as other industry standards (33% of operators of essential services).

  5. Areas needing improvement, and future plans: Then-Minister Lopez's associated statement to Parliament on 4 July noted that recommended changes to the NIS Regs were included in the Department for Digital, Culture, Media & Sport's Jan 2022 consultation, Proposal for legislation to improve the UK’s cyber resilience (summarised in my Linkedin post). The outcome of that consultation is to be published "later this year", i.e. later in 2022. Recent UK political events, including her resignation on 6 July, may of course result in delays to the initially-planned timescale. The key areas are:

    1. DSP registration and guidance: 54% of responding DSPs stated it was not easy to identify that their organisations are in scope (this deters registration, and ICO won't be aware of their activities to advise them!).
      1. "Further work is required to ensure that the guidance makes it easy to identify whether firms are in or out of scope of the Regulations and to ensure that organisations that need to be included in the regulations are designated."
      2. "Registration of digital service providers cannot be left to digital service providers alone... The Government will continue to support the ICO in the work it is already carrying out to identify firms that should be under the Regulations and support them in notifying those organisations of their responsibilities. Both the government and the Information Commissioner, should consider ways to increase awareness of the NIS Regulations with all potential digital service providers." The government should consider options to provide the Information Commissioner with increased information-seeking powers (similar to existing ones available to competent authorities of operators of essential services) to ascertain whether an organisation qualifies as a relevant DSP under the NIS Regulations.

    2. Ensuring the right sectors are caught: managed service providers (MSPs) are not caught currently, but under the Jan 22 consultation they will be. (For other subsectors discussed e.g. BPO, SIEM, analytics & AI, see my Linkedin post, but it seems "While this Post-Implementation Review has not identified any other sectors that need to be included at this time, it has underlined a need for the government to maintain the powers to make such additions in the future.")

    3. Supply chain security: OESs can't monitor supply chains due to lack of supplier cooperation and lack of resources. Action is needed to increase operators’ ability to manage security risks arising from supply chains, particularly suppliers critical to provision of essential services.
      1. Proposed power to designate critical dependencies to identify, impose duties, and then regulate certain supply chain organisations that present systemic risks to OESs, due to their market concentration, reliance on those services, or other factors.
        1. Comment: could IaaS/PaaS, perhaps even some SaaS providers, be caught both as DSP and as critical dependency? - highest common denominator of compliance required there. Also, could IaaS/PaaS providers that are critical enough, simply be designated as OESs themselves (legislative rules permitting)?
      2. DCMS will consider options such as amending guidance to tackle supply chain security concerns, including using standards and certification, such as Cyber Essentials and Cyber Essentials +, to address this issue. But cross-government consultation is needed.
      3. Note: see also the Government response to the call for views on supply chain cyber security, Nov 2021.

    4. Capability & capacity of OESs, DSPs, competent authorities: lack of finance/funding or of general resources; more variable among authorities, particularly a lack of cyber-regulator-specific training or centralised NIS training (as opposed to GDPR training). Competent authorities also need more resources for effective enforcement. On authorities' resources:
      1. DCMS will "commit to persuading those departments to ensure that they meet their legal obligations to fund their NIS oversight. For these, plus those regulators that are not central government departments, DCMS aims to ensure that competent authorities are able to recover the costs of regulation from those being regulated, in line with government policy."
      2. Additional ways to improve resource-efficiency will be considered, e.g. promoting collaboration across authorities and with non-NIS authorities such as banking and financial services regulators (for designation of critical dependencies), exploring existing frameworks like CBEST and TBEST to test assumptions and highlight areas for further development.

    5. Incident reporting: thresholds (in statutory guidance) are too high, and the base criterion for a reportable incident is too narrow (disruption to the service, cf. impact on NIS) to capture the highest-risk incidents. To ensure that the right incidents are captured:
      1. Authorities should review reporting thresholds and lower if necessary.
      2. OESs and DSPs will be required to report all incidents that have a material impact on the confidentiality, integrity, and availability of NIS [note: the well-known CIA triad], and [note: I think "or" is intended here?] that have a potential impact on service continuity.

    6. Enforcement: DCMS needs to conduct work to assess why the enforcement regime is not being utilised where it is merited.

    7. Consistency and more robust oversight: greater consistency in regulatory implementation across sectors is required, alongside creation of performance metrics to better measure the impact and effectiveness of the Regulations.
      1. DCMS should issue revised and updated guidance to competent authorities, setting out the requirement for a common approach to assessment and performance indicators; explore ways to make such guidance more binding on authorities; and establish a process by which competent authorities report against performance indicators and are held accountable for their performance (indicators could be linked to the delivery of the National Cyber Strategy and its performance framework). 
Note also the related consultation on Data storage and processing infrastructure security and resilience - call for views (press release), including data centre infrastructure, cloud platform infrastructure and MSP infrastructure, which expires at the end of Sunday 24 July 2022.
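To make the proposed reporting test in point 5 above concrete, here is a minimal Python sketch of that decision logic. The class and field names are hypothetical (nothing in the consultation defines them), and it assumes "or" rather than "and" is the intended connective, per my note above.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    """Hypothetical model of a NIS incident, for illustration only."""
    material_cia_impact: bool           # material impact on confidentiality, integrity or availability
    potential_continuity_impact: bool   # potential impact on service continuity

def is_reportable(incident: Incident) -> bool:
    """Sketch of the proposed test: either limb alone triggers a report.

    The consultation text reads "and", but "or" seems intended -
    an incident materially affecting CIA should be reportable even if
    service continuity is unaffected, and vice versa.
    """
    return incident.material_cia_impact or incident.potential_continuity_impact

# e.g. a breach of confidentiality with no outage would still be reportable:
# is_reportable(Incident(material_cia_impact=True, potential_continuity_impact=False))
```

If "and" were really intended, the `or` would become `and`, and incidents affecting CIA but not continuity would escape reporting, which is exactly the narrowness the review criticises.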

The next UK NIS Regulations review isn't due for another 5 years.

Comments

Below are my personal views only, but they're based on my practical experience of advising clients (both their legal and their technical/security teams) on the UK NIS Regulations and EU NIS Directive.
  • Incident reporting:
    • "There is a lot of uncertainty around the incident response, and which incidents need to be reported...". In my view, this uncertainty is a contributing factor, and guidance is sorely needed, alongside the planned steps mentioned above regarding lowering reporting thresholds and requiring reporting of incidents materially affecting NIS CIA even if not affecting the service.
    • However, there's a risk of a tsunami of reports that regulators may not be able to cope with, if every incident "materially" impacting C, I or A has to be notified. It's important to bear this factor in mind when setting the reporting test/thresholds. Again, guidance on "materiality" will be vital.
  • Awareness, scope, DSPs and non-registration: I hope the government will take the opportunity, post-Brexit, to reconsider the scope of the NIS Regulations beyond just bringing MSPs into scope. In particular, please consider whether and to what extent SaaS providers should be caught by the NIS Regulations.
    • The NIS Regulations were binding from 10 May 2018. Guess what else there was in May 2018? Yep, the GDPR. No surprises then that most organisations focused their resources on GDPR rather than NIS compliance, especially with the huge publicity about GDPR fines and hardly anything being said about NIS.
    • It's understandable that IaaS/PaaS providers should be subject to the Regulations as DSPs, because many organisations build their own technology infrastructure or customer-facing services on top of those cloud services. I.e., many organisations create their own SaaS services based on third party IaaS/PaaS services, which do constitute technology infrastructure-type services.
    • However, automatically and unthinkingly copying out the NIST definition of cloud computing is not the right approach here. Applying NIS laws to SaaS is like applying certain laws to "all websites" when they should actually apply to "website hosting platforms/services". SaaS involves the provision of specific applications or services to end users (like a word processing application online, instead of via an application installed on a local computer). Those applications/services can vary hugely in their scope and purpose. The applicability of NIS requirements ought to depend on the specific type of application/service and its importance to the economy or society (e.g. is the service critical to the provision of an OES's essential service?) - and not just because of its general nature as SaaS. Currently, all SaaS services are technically caught, whether they're used for bill payments or as a forum for pet lovers to discuss their animals. To me, that doesn't seem to make sense.
    • As I've previously pointed out, SaaS providers don't always register with the ICO for various reasons.
      • Registering puts their heads firmly above the parapet for possible enforcement. Especially as, since Jan 2021, the top £17m tier of fines could be imposed based on serious service outages alone, whereas previously the top tier only applied if the service was important to the economy. If I provided a SaaS service for pet lovers' discussions, which no one could think would harm the economy or society if it went down, I wouldn't want to register and make my service known to the ICO either.
      • Saying that SaaS services are caught "only to the extent that they provide a scalable and elastic pool of resources to the customer" just parrots the definition without providing any useful guidance. All cloud services are, by definition, meant to be scalable and elastic. They're not infinitely scalable or elastic, of course; even IaaS/PaaS services impose practical commercial limits on customers' usage, so SaaS services' lack of infinite scalability/elasticity should be a non-point too. But some SaaS providers do argue they're not caught because their service doesn't enable access to a "scalable" and "flexible" pool of shareable computing resources. I have some sympathy here, not because the services really aren't scalable/flexible, but because (as above), given the legislative objective of NIS laws, I feel that it's simply not sensible to try to catch all SaaS services just because they're SaaS, regardless of the exact nature of their services or customers served. Business models are increasingly moving to SaaS, away from software licensing: but there's no legal requirement to have security measures or report vulnerabilities or security issues affecting all software applications regardless of their nature (although many might think that would be sensible). And I've always thought it odd that flexible/scalable services are subject to NIS, when inflexible, non-scalable "classic" hosting platforms are not, even though with the latter their customers are more at risk from availability issues (due to their inflexibility and non-scalability!). Surely it should be the other way round?
      • And making all SaaS services register is akin to making all software application manufacturers/distributors register their software. The ICO receives fees from controllers who register for data protection purposes, so there's a benefit to the ICO from that registration. But is the benefit of finding out about all online software applications of whatever type or importance worth the administration and other costs?
      • Would introducing a fine for non-registration help? I don't think so, because of the underlying issue I've emphasised regarding the inappropriateness and disproportionality of bringing all SaaS services within scope regardless of their importance to society or the economy (and see later below).
      • In my experience, SaaS providers may register if they provide important services to operators. Otherwise, they tend to keep their heads down, and I don't blame them.
      • The lack of publicly-reported enforcement of the Regulations is another reason for relative lack of awareness of NIS. 
  • Capability and enforcement
    • Certainly as regards DSPs, I've found that many ICO staff aren't familiar with NIS and need NIS training as well as more resources for NIS, e.g. those staffing the helpline number given on the ICO's NIS webpage. As flagged above, some DSPs consider the Regulations were incorrectly applied to DSPs in general, and I agree, possibly because of awareness and/or knowledge issues.
    • The reluctance of many SaaS providers to register, never mind report incidents, is fuelled by the factors I've outlined above, and fear of being subject to the maximum possible fine even though their service may be of minor importance to society or the economy. If they have to bear the costs of ICO investigations too, as is planned, that may drive even more SaaS providers to decide not to register. 
    • The bigger risks for non-registering DSPs are monetary penalties for not reporting incidents when they should have, and/or not having the appropriate security measures in place. If they haven't registered and haven't notified incidents, that of course reduces those risks, because the ICO won't know about them! The main risk then is if they report a personal data breach under GDPR and the ICO says, "Aha! We will fine you under NIS too, because you should have reported the incident under NIS!". But, this depends on the ICO's NIS and GDPR enforcement divisions being sufficiently joined up and also trained up (again, the skills/knowledge issue flagged earlier).
  • Summary: personally, I would recommend:
    • Reconsidering the extent to which SaaS providers should be in scope under NIS, if at all. For example, consider introducing specific thresholds or criteria for SaaS providers to be in scope. (Obviously, if they are critical suppliers to OESs, or OESs themselves, they should be caught under those proposed changes and be exposed to possible designation as OESs, but that's a separate matter.)
    • Reconsidering the extent to which SaaS providers should be subject to the different tiers of NIS monetary penalties or other enforcement, if at all (with the same caveat). Again, consider if different types/tiers of fines or other enforcement should be applicable to SaaS providers or indeed DSPs that aren't OESs or critical suppliers.
    • These would help save the ICO's resources too, so they can be directed towards IaaS/PaaS and truly important SaaS providers.
    • If less radical changes are to be made, provide much clearer guidance on if/when SaaS providers will be caught by the Regulations and therefore need to register with the ICO.
    • Making publicly available the annual data DCMS aims to collect from regulators, particularly enforcement information and levels of fines imposed. This would help to raise awareness and incentivise compliance.
    • Requiring the ICO and other regulators to publish the full text of their NIS enforcement and monetary penalty etc notices, but redacted as necessary (including as to OES/DSP names), ideally also listing and linking to them on a centrally-maintained webpage of NIS enforcement action. That would also help raise awareness and incentivise compliance.