Kuan0: 2020

Sunday 23 August 2020

Chrome highlights text on webpages: how to disable

This blog explains how to stop Chrome's highlighting of text on some webpages you visit after clicking on Google search results snippets. 

The problem: Chrome now scrolls directly to the highlighted text ("text fragments", meant to reflect your search terms) on the webpage in question. It also mangles the URL in the browser, appending something like "#:~:text=your%20search%20terms" to the web address.

For anyone who's managed to escape this new feature, here's a direct example. (Apparently it also does this in Apple's Safari.) Not everyone wants that behaviour in their browser; some find it annoying and unhelpful. This new Chrome feature was introduced by Google in early June 2020. Websites can opt out, but it's much more difficult for Chrome end users to disable it.

Disabling this via Chrome flags is no longer possible. Most people won't be able (or willing) to set enterprise policies or mess with the registry (which even stopped Chrome from working for one person), or to install the Redirector extension and fiddle with its settings.

So, here's my own relatively easy fix, which you can use to change the webpage back to what it should be, rather than preventing or getting rid of the new feature. My solution to this issue:

  • "Reverts" you to the webpage you were trying to view, without highlighting the text fragment or scrolling to it.
  • Cleans up the URL in the address bar too, removing the # and all the stuff after it.
  • (Optional - even copies the "clean" URL to your clipboard for easy sharing.)
It involves setting up a new bookmarklet or favelet with some JavaScript, which you can just click (or reach via a hotkey) to sort out the issue quickly. If that description fazes non-coders, not to worry, here's a very simple step-by-step:


Solution to remove unwanted text highlighting

  1. Make your bookmarks bar visible in Chrome if it's not already (click the three-dots menu at top right > Bookmarks > Show bookmarks bar, or press Ctrl-Shift-B).
  2. Bookmark any webpage you like (e.g. Ctrl-d and Enter), but drag it so it's visible in the bar.
  3. Right-click the new bookmark in the bar, and select Edit.
  4. In the Name box, change it to e.g. Cleanup, or even just 1 (I'll explain the latter later), ideally starting with a letter which your existing bookmarks don't start with.
  5. In the URL box, clear what's there, and copy and paste the following text in there instead, exactly as is (don't add spaces etc.), then click Save:
    javascript:var url=window.location.href; var cleanurl=url.split('#')[0]; window.location.replace(cleanurl);
  6. In future, if you find yourself on a webpage with highlighted text fragments and the long URL after clicking on Google search results, to clean it up just:
    1. Click that new bookmark in the bookmarks bar, or
    2. (For those who like keyboard shortcuts) Press Alt-e then b then 1 (or whatever was the first letter of the new bookmark's name) then, if necessary, Enter - which has the same effect as clicking it.
  7. Optional: if you also want to be able to copy the clean URL automatically to your clipboard for pasting into an email etc. then, in step 5 above, instead of pasting what was shown there just paste the following exactly as is:
    javascript:var url=window.location.href; var cleanurl=url.split('#')[0]; var input=document.body.appendChild(document.createElement("input")); input.value=cleanurl; input.select(); document.execCommand('copy'); input.parentNode.removeChild(input); window.location.replace(cleanurl);
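For the curious, the core of both bookmarklets is just string-splitting on '#'. Here's the same logic as a plain, standalone function (the function name is my own, purely for illustration):

```javascript
// Strip the fragment - everything from '#' onwards - from a URL string.
// This removes Chrome's "#:~:text=..." text-fragment directive, along
// with any ordinary anchor that may be present.
function cleanUrl(url) {
  return url.split('#')[0];
}

// cleanUrl('https://example.com/page#:~:text=some%20words')
//   gives 'https://example.com/page'
```

One caveat (which applies to the bookmarklets too): splitting on '#' also strips legitimate anchors, so the page will reload at the top rather than at any named section.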
I hope that's helpful.

Monday 17 August 2020

Schrems II additional safeguards: confidential computing

The highest EU court (CJEU) in the Schrems II ruling said that standard contractual clauses (SCCs) can, in principle, be used to legitimise transfers of personal data outside the EU/EEA, provided "additional safeguards" are implemented where appropriate (or "supplementary measures", as the European Data Protection Board or EDPB has called them).

I previously blogged about providing additional safeguards through encryption. And indeed Amazon, regarding its AWS cloud service and Privacy Shield's invalidation, emphasised its "technical and physical controls designed to prevent unauthorized access or disclosure of customer and partner content", and "advanced encryption and key management service".

I also noted that data could be encrypted in storage and in transmission ("at rest" and "in transit"), but there were difficulties with operating on encrypted data, although work was proceeding in areas such as homomorphic encryption.
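As a toy illustration of what "operating on encrypted data" means: textbook (unpadded) RSA happens to be multiplicatively homomorphic, i.e. multiplying two ciphertexts yields the encryption of the product of the two plaintexts, so someone holding only ciphertexts can compute on the underlying data without ever seeing it. A sketch using the classic small textbook parameters (never use these numbers, or unpadded RSA, in a real system):

```javascript
// Modular exponentiation for BigInt: (base ** exp) % mod
function modPow(base, exp, mod) {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

// Textbook RSA with tiny demo parameters (p=61, q=53, n=3233)
const n = 3233n, e = 17n, d = 2753n;
const enc = (m) => modPow(m, e, n);
const dec = (c) => modPow(c, d, n);

// Multiplicative homomorphism: Enc(a) * Enc(b) mod n = Enc(a*b)
const a = 7n, b = 9n;
const productCipher = (enc(a) * enc(b)) % n;
// dec(productCipher) recovers 63n, i.e. a * b - computed by a party
// that never decrypted a or b individually
```

Fully homomorphic schemes generalise this to arbitrary computations, which is precisely what makes them so slow in practice today.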

I just wanted to expand on that to point out that, in fact, it is a reality today that secure operations on data are possible in practice - without the third party cloud provider or other service provider being able to "spy" on the computing operations or access intelligible data that they could then give to third country intelligence or security agencies.

The main development here is "confidential computing", as it's become known. This involves protecting data in use, within a "trusted execution environment" (TEE) which safeguards the data from outside viewing or interference. TEEs, or enclaves as they're also termed, can be implemented via hardware, e.g. Intel's SGX (Software Guard Extensions) which seeks to protect areas of memory running the relevant application code on the relevant data, or via software. Edited: to clarify, yes, this isn't really working on encrypted data, it's working on unencrypted or decrypted data, but only when (effectively) it's within a secure hardware "box" that the cloud or other service provider can't peek into. This means they can't see what the data in the box is, or what operations are being conducted on that data, so they can't spy on the data or processing or tell any authorities what it is.

What's exciting is that the Confidential Computing Consortium, spearheaded by the Linux Foundation, was formed just last year, in Oct 2019, with members including Alibaba Cloud, Arm, Google, Huawei, Intel, Microsoft and Red Hat and also Baidu, Bytedance, Fortanix, Oasis Labs, Oracle, Swisscom, Tencent and VMWare. By the end of June 2020, it had been joined by companies such as Accenture, AMD, Facebook, NVIDIA, and R3 (see Members list and FAQ, CCC overview and White Paper).

Open source projects under the CCC's umbrella include an SGX SDK for Linux, an Open Enclave SDK to build TEE apps that can run across multiple TEE architectures (Microsoft) and Enarx, a platform for TEEs to create and run “private, fungible, serverless” applications (Red Hat). 

In addition, Microsoft already offers confidential computing on Azure, i.e. in the cloud using SGX, while Google offers (although not under CCC) Asylo, an open source framework for confidential computing. AliCloud uses Fortanix, also mentioning SGX. As we're talking cloud, what about Amazon's AWS, you may ask? Well, interestingly AWS is absent from the CCC membership list, but it too, in Dec 2019, had launched "Nitro enclaves" for customers to create "isolated compute environments" to process "highly sensitive data", initially in preview phase.

UPDATE: after I posted this blog, I found further articles on confidential computing so I want to add the links here: about IBM's offering of confidential computing in its public cloud and its launch of fully homomorphic encryption toolkits, and about the use of its confidential computing services (another article), again in the public cloud, by the likes of Bank of America, Daimler and (for healthcare data) Apple.

Use of technologies like confidential computing should stymie or at least deter third party vendors' and/or third country authorities' attempted monitoring or surveillance of data in use - regardless of where the servers conducting the processing are geographically located, i.e. there's no need for data localization in order to protect personal data (or other sensitive data) properly! Couple the use of confidential computing with strong encryption of data at rest and strong encryption of data in transit, and Roberta's your auntie.

What's the catch? This is all relatively new still, so there might well be teething issues. And no doubt confidential computing will be more expensive than "normal" computing, but costs should come down in future as is common with new technologies. I wonder if the CCC will go for a certification for confidential computing under the GDPR's Arts.42-43 in future? Or even Art.46(2)(e) or (f), codes or certifications enabling transfers? (as I've argued before, there shouldn't be a need for "binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards" if they can't even access the data in question. But Art.46 says what it says...).

(Another area worth considering is secure multiparty computation or MPC, but I didn't want to hold up this blog post while awaiting more info on that. There seems to be an industry consortium there too, the Multi-party Computation Alliance.)

Saturday 25 July 2020

Schrems II: data localization, encryption & the bigger picture

The Schrems II decision by the EU's highest court (CJEU) invalidated the EU-US Privacy Shield. It declared valid, just about, SCCs (standard contractual clauses between data sender and recipient) for transfers or data exports outside the EU - but only if there are enough practical checks and controls for "adequate protection" of personal data. The 64 million Euro question being: exactly what checks/controls will be considered good enough?

All that's well known and well discussed. We're eagerly awaiting the promised collective regulatory guidance from the European Data Protection Board (EDPB) on what these additional safeguards or additional measures, which the EDPB calls "supplementary measures", might consist of. Hopefully their guidance will help to avoid fragmentation among national data protection supervisory authorities (SAs).

But let's dissect another proposed "solution": "data localization" (vs. "transfers").

Germany's Berlin SA in particular has taken a very strict approach, stating that:
  • If a "third country" receiving transferred personal data has laws permitting governmental access to the data that goes beyond what EU law permits, then
    • SCCs can't be used for that export, and 
    • Personal data already transferred to that country must be "retrieved" or recovered - i.e. onshoring / reshoring or "repatriation" of that data, if you like; and
  • Controllers who transfer personal data to the United States of America, especially when using cloud services, are now "required" to switch "immediately" to service providers "in the EU" or "in" a country with an adequate level of data protection.
The first point is pretty much what Austria's SA said after the CJEU's 2015 invalidation of Privacy Shield's predecessor, the EU-US Safe Harbour scheme. In fact, the Austrian SA specifically referred to using a company-owned server, a server in an EU member state or another country with an adequate level of data protection. In other words, data localization - storing data only in servers physically located within the territory of an EU or other "adequate" country.

There are actually two separate aspects to the Berlin SA's statements above. They're related but they're definitely not the same thing:
  • Localization of personal data, in terms of the geographical location of physical storage / hosting, i.e. local location of servers and other equipment used to process personal data; and
  • Using only providers "in" the EU or another "adequate" country to host or process personal data. (We'll assume that "in" means "incorporated in", i.e. registered under the laws of an EU Member State or other "adequate" country.)
Let's consider both of these in detail now, by way of a two-way mental debate...

We must store personal data only in servers physically located in the EU or other "adequate" territory. That's the best way to protect it against over-intrusive third countries! And also let's retrieve any data already stored in those third countries, data is a-comin' home!

But - this is the 21st century. There's this thing called the Internet. Organisations physically located in geographic location A can remotely access data stored in geographic location B. In fact, we'd have been in an even worse fix during the Covid-19 crisis if they couldn't. And errr, digital data is actually quite easily copiable. If the third country's already grabbed it (strictly, made a copy of it), then deleting it from your third country storage afterwards or "repatriating" it afterwards isn't actually going to magically delete their copy. Though you clearly think it might stop them from getting their hands on it subsequently, if they haven't already.

OK, so we insist that personal data can only be stored or processed, within EU borders, by EU-incorporated or -registered service providers. And not by processors from over-intrusive third countries, who could be forced by their national laws to access that data remotely from EU servers for disclosure to the third country's authorities. Avoid those risky US providers, away, away with them!

But - even EU-incorporated service providers might want to expand outside the EU. In fact, the EU would quite like them to be successful enough to be able to sell their goods and services abroad, and make money from non-EU countries. Go EU businesses! And some third countries may say: if you want to do business in our country, then you have to comply with our laws. Including remotely accessing personal data physically located in the EU, and giving it to us if we require that. If not, we'll issue criminal proceedings against your directors in our country. They have effective jurisdiction over those EU providers. As does the provider's EU Member State.

OK, so let's tell EU-incorporated businesses not to leave the EU then. Indeed, why not stay within the borders of just Germany? It's a big bad world out there. It's so much safer to shrink down and withdraw, just retreat home.

But - are you sure data insularization (yes I made that up) is a good thing? Digital isolation, cutting ourselves off from the rest of the world? Are you advocating the deglobalization that some have touted since the pandemic? And is that even possible? The way the Internet works, personal data travelling from one EU Member State to another, or even within the same EU Member State, might well transit somewhere outside the EU.

Yes, telecommunications networks' cables carrying Internet traffic in transit could be tapped by over-intrusive countries to intercept data. But if the law of Australia can trump the laws of mathematics, however "commendable" the latter may be, surely the laws of Europe can trump Internet routing, so let's just ban Internet data from transiting outside the EU! Even better, hey, let's just build a Great Firewall of Europe like a recent report for a European Parliament committee recommended, eat local stay local. Like the report says, if we do that: "It would drive competition and set standards, similar to what has happened in China in the past 20 years. The foundations of such a European cloud are democratic values, transparency, competition and data protection." That's what we want, and that's how we want it!

So you really really think that would protect personal data best? And that, umm, this would actually work to stop third countries' intelligence agencies from getting hold of intelligible EU data?

Of course. Data is safest on EU soil, in EU hands. Because third country nation states and cybercriminals would never be able to hack into or otherwise access EU-located data hosted by EU organisations. And EU intelligence agencies would never seek to access EU data so broadly. Or if they did, surely they'd never share any of it with US or other third country authorities. Nuh huh.


Sure, the above is a bit tongue in cheek. And it's certainly not trying to support over-broad US surveillance laws.

But we mustn't lose sight of what should be the first, and ultimate, goal of data protection laws: protection of personal data and privacy/security, particularly confidentiality.

Is the best way to do this really to have the highest EU court tell another (sovereign) country to, effectively, change its own national laws because they're not good enough by another region/country's standards, and put EU (and non-EU) organisations with international or cross-border operations in the impossible position of having to choose which country's laws to break?

The core issue should be, not the adequacy of a third country's laws, but the adequacy of protection for personal data there. Again, those two concepts aren't actually the same thing, even though the first can affect the second.

The GDPR's predecessor, the Data Protection Directive, also restricted "transfers" except to third countries that ensured an "adequate level of protection".

Now, under the UK implementation of the Directive, the UK Information Commissioner's Office (ICO) had previously allowed controllers to conduct their own assessment of the adequacy of protection in a third country:
"Organisations exporting data may be able to ensure that the personal data are protected by means of technical measures (such as encryption or the adoption of information security management practices such as those in ISO27001/ISO27002)".
Art.25(2) of the Directive listed particular factors to be considered when assessing the adequacy of protection. It explicitly mentioned "security measures which are complied with in that country". So inadequate laws did not necessarily mean inadequate protection - adequate security measures like encryption, perhaps along with other practical measures, might be enough to overcome inadequate laws in a third country.

In particular, if a third country authority can't access intelligible personal data because it's been encrypted and the authority can't decrypt it, then the risk to data subjects' rights from third country governmental access has surely been mitigated, if not eliminated. The data has been adequately protected against access by those third country authorities. And if those authorities can't even read the data, they can't use it or do anything with it against the data subjects' interests - so data subjects don't need rights against them.

In Schrems II the CJEU said:
"It is therefore, above all, for that controller or processor to verify, on a case-by-case basis and, where appropriate, in collaboration with the recipient of the data, whether the law of the third country of destination ensures adequate protection, under EU law, of personal data transferred pursuant to standard data protection clauses, by providing, where necessary, additional safeguards to those offered by those clauses" (para.134).
So, it's gone full circle. We're back to each transferring entity that wants to use SCCs (or Art.47 binding corporate rules i.e. BCRs, according to the EDPB) to send or transmit personal data to, or make it accessible from, a third country, effectively having to make its own adequacy assessment or adequate protection assessment ("APA", to coin an acronym, as "AA" may be a bit ambiguous).

Now, that vital sentence could have been a lot more clearly phrased (can other language versions assist?). How can a transferor "verify" "whether" third country law ensures adequate protection "by providing, where necessary, additional safeguards"? You can't verify laws' adequacy by implementing additional safeguards. But, you can sometimes counteract "inadequate" laws by providing additional safeguards or measures, whether technical or organisational - i.e. by using suitable technologies, policies and/or processes and practices. Like encryption. Surely that must be what the CJEU meant.

When assessing the "appropriate safeguards" under Art.46 GDPR (including use of SCCs) needed to provide Art.45(1) "adequate protection", the CJEU in Schrems II also stated:
"The assessment required for that purpose in the context of such a transfer must, in particular, take into consideration both the contractual clauses agreed between the controller or processor established in the European Union and the recipient of the transfer established in the third country concerned and, as regards any access by the public authorities of that third country to the personal data transferred, the relevant aspects of the legal system of that third country. As regards the latter, the factors to be taken into consideration in the context of Article 46 of that regulation correspond to those set out, in a non-exhaustive manner, in Article 45(2) of that regulation." (para.104).
Guess what? GDPR Art.45(2)(a), on factors that the European Commission must consider when assessing the adequacy of the level of protection in a third country, also mentions "security measures... which are complied with in that country". Just as in the Directive's Art.25(2).

This means that the door is potentially open for the EDPB to allow encryption as a "security measure" that could ensure adequate protection in a third country with otherwise "inadequate" laws. I reiterate, the goal isn't adequacy of laws, it's adequacy of protection for personal data.

So it's heartening that the EDPB in its FAQs on Schrems II stated:
"The EDPB is currently analysing the Court's judgment to determine the kind of supplementary measures that could be provided in addition to SCCs or BCRs, whether legal, technical or organisational measures, to transfer data to third countries where SCCs or BCRs will not provide the sufficient level of guarantees on their own. The EDPB is looking further into what these supplementary measures could consist of and will provide more guidance."
I'm not saying encryption is a panacea. Not at all. Some intelligence agencies undoubtedly have the capability to decrypt certain encrypted data. Or the encryption applied could be weak: a cracked or flawed algorithm, or too short a key. Even with strong encryption, the service provider might hold the key because it needs it in order to process the data as expected by the customer. Most data operations can't be performed on encrypted data, not yet anyway, or if they can it would currently be unfeasibly slow, so the data has to be decrypted before those operations can be performed. At which point the provider has access to intelligible data. Which can then be disclosed to authorities. And obviously the proper implementation of encryption in practice in concrete situations is very important, along with other security measures like access controls. [Added: however, confidential computing, which allows encryption "in use" i.e. during computation, is now available from many cloud providers.]

But, if everyone everywhere strongly encrypts data as much as possible, including in transmission as well as in storage, that would make it a lot more difficult for intelligence agencies - admittedly EU as well as third country - to obtain intelligible, usable data. Indeed, in its cloud computing guidance, the Article 29 Working Party (the EDPB's predecessor) recommended that "Encryption of personal data should be used in all cases when “in transit” and when available to data “at rest”". Not just when using non-EU cloud service providers.

To curb excessive state surveillance in practice, those who are against such surveillance should be promoting encryption or other strong forms of pseudonymisation, rather than forcing transferors to analyse and assess the "adequacy" of third country laws. As others have noted, many transferors who use US or other non-EEA service providers are SMEs. Most of them aren't lawyers and can't afford lawyers, let alone expensive lawyers expert in the surveillance laws of all relevant third countries. And what if regulators or courts then disagree with their good faith assessments?

This highlights one problem with the GDPR's drafting, and some lawmakers/regulators' approach to it. To paraphrase Maslow, if all you're used to is nails, then you'll tend to think that a hammer must be the only, or at least the best, tool to use. But, for screws, wouldn't you want to use a screwdriver?

Lawmakers, regulators and judges are used to the tools of law: legislation, regulation, contract, legal obligations and liabilities. Many, dare I say most, of them aren't as familiar or comfortable with the tools of technology, and may not be inclined to trust the efficacy of tools that aren't tools of law.

But my point is that, to protect transferred personal data adequately, we ought to use all the tools available. We can't just rely on third country laws alone (or dismiss their laws as inadequate, so that transfers there are prohibited absolutely).

We need to use technical and organisational measures too to protect personal data, whether the data's physically located in the EU or transferred outside it. Saying that only the tools of law are good enough for transfers, and that technical tools like encryption don't count, would be like trying to protect transferred personal data with one arm tied behind your back, to use yet another analogy.

For transfers, the GDPR should be encouraging, and interpreted as encouraging, the use of technical and organisational safeguards - not just the tools of law. If technical and/or organisational measures can provide adequate protection in a third country, enough to provide "essentially equivalent" guarantees of protection for the relevant personal data and data subjects, then it shouldn't be necessary to make transferors analyse the laws of the third country too!

Read literally, Art.46 states that appropriate safeguards "may" be provided for by SCCs or the other safeguards listed in Art.46(2)(a)-(f). It doesn't actually say that Art.46(2) is exhaustive; it doesn't say that the safeguards listed there are the only possible safeguards.

However, Art.46(1) requires all safeguards to be "on condition that enforceable data subject rights and effective legal remedies for data subjects are available". Even though, logically, they wouldn't need rights or remedies against authorities who can't access intelligible data. That condition isn't necessary to protect data subjects from those who can't access usable data, but it reflects the hammer/nail issue above.

Similarly, if codes of conduct or certification mechanisms get approved for transfers under Art.46(2), "binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects' rights" are still required. Even if the personal data has been encrypted or otherwise pseudonymised in such a way that the third country recipient itself can't access intelligible data, e.g. for pure hosting or storage only. If they want the business, they'll just have to suck it up and agree to those "binding and enforceable commitments", so third country recipients who offer such codes or certifications may still have to sign something like SCCs. (And, to be fair, those commitments could cover other security measures such as backups for integrity, not just confidentiality.)

Let's hope that when the GDPR is updated, I fear probably in another 20 years, Art.46 is amended to delete "and on condition that enforceable data subject rights and effective legal remedies for data subjects are available" and "together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects' rights".

To be technologically neutral, which the GDPR was intended to be, data protection laws should simply require adequate protection of personal data and data subjects' rights by whatever means, wherever the data are physically located. In some situations (not all, but some), technical and organisational measures may well provide sufficient protection in practice, so in those situations tools of law like contractual or other obligations on the recipient should not be mandatory as well. Especially if homomorphic encryption, conducting useful operations on encrypted data without needing to decrypt it, ever becomes feasible and fast enough to be workable.

As I hope the above has illustrated, in the absence of "adequate protection" in a third country, even data localization ("Data should not only be stored but also administered elsewhere than in the U.S." as the EDPB put it) will not necessarily be enough to protect personal data, without appropriate technical and organisational measures too. We need a multidisciplinary, cross-disciplinary approach to data protection. Don't tie organisations' hands on this. Tools of law are not the only effective tools. Don't make organisations spend more and more money on lawyers and documentation; share the love and let them spend on technical security too, which often might actually protect personal data better!

Kuan Hon, Data Localization Laws and Policy
Most of the above and more, is discussed in my book on Data localization laws and policy - the EU data protection international transfers restriction through a cloud computing lens. It predated Schrems II, but discusses the aftermath of the Safe Harbour invalidation in greater detail than above (e.g. German fines for controllers who relied on Safe Harbour for US transfers, but decided to wait and see and didn't implement SCCs immediately).

More importantly, the basic themes, concepts and arguments in the book still remain the same. The EU's approach to international transfers of personal data dates from the 1970s. It needs to be modernised to take account of tech, not just laws. (And, by the way, the transfers restriction was in fact initially intended to prevent controllers from circumventing EU data protection laws by using third country processors. But it's taken on a life of its own to become a Frankenrule, it's been and is being repurposed far beyond its original legislative objective to take potshots at all sorts of things like "inadequate" third country surveillance laws.)

Long story short - dear EDPB, please please don't say that data localization alone is the answer (the Commission itself no longer equates data localization with data protection, quite the contrary). And please please explicitly recognise strong encryption or other forms of strong pseudonymisation as a potential technical supplementary safeguard allowing the use of SCCs (and BCRs) for transfers, even to "inadequate" third countries!

Saturday 11 July 2020

The technical challenges of recording socially-distanced music videos during lockdown!

Now I know why movies / TV shows need large film crews - respect!

Click on the cartoon for a bigger one.

With many thanks to the talented artist who drew this cartoon for me, as I am artistically challenged myself!

Monday 29 June 2020

Children’s consent GDPR Art.8 - Member State differences

Table showing the age below which parental consent is needed (and above which the child’s consent is acceptable) for the offer of information society services (i.e. online services) directly to a child, where the legal basis is consent:

Age | Member States
  • 13: Belgium, Denmark, Estonia, Finland, Latvia, Malta, Portugal and Sweden (and the UK, but we don't count anymore)
  • 14: Austria, Bulgaria, Cyprus, Italy, Lithuania and Spain
  • 15: Czech Republic, France and Greece
  • 16: Croatia, Germany, Hungary, Ireland, Luxembourg, the Netherlands, Poland, Romania and Slovakia

These differences are allowed by the GDPR, but the Commission's Staff Working Document, accompanying its 2-year evaluation of the GDPR, comments that "Such differences lead to situations where the Member State in which the controller is established provides for another age limit than the Member States where the data subjects are residing." You can say that again!

Note: the table above is based on the helpful info provided in the SWD, p.17, and hasn’t been independently confirmed. No info on Iceland, Liechtenstein and Norway was provided in the SWD – presumably because they’re EEA, not EU.

Countries influenced by GDPR

According to the Commission's Staff Working Document accompanying its 2-year report on the GDPR, the GDPR has acted as "a catalyst" for many third countries around the world to consider "introducing modern privacy rules":
Brazil, California, Chile, India, Indonesia, Japan, Kenya, South Korea, Taiwan, Tunisia

A map of those countries is below. Click on Larger for the larger version.

Also mentioned in the SWD for "promising developments" regarding privacy legislation, and therefore "third countries" that are possible candidates for future "adequacy" discussions with the Commission:
Malaysia, Sri Lanka, Thailand; Africa (e.g. Ethiopia, Kenya) and the European Eastern and Southern neighbourhood (e.g. Georgia).