NOTA BENE : This is not a final version (San Francisco, April 3rd 2016)
By Max SCHREMS
In the past weeks I was repeatedly asked by policy makers, MEPs, DPAs and interested lawyers and individuals about a written summary of my assessment of the proposed “Privacy Shield” system. This document is a quick response to these requests. Due to the limited time it may contain some typos and minor errors.
The debate on “Privacy Shield” is ongoing and a full proper academic review of the more than 120 page draft Commission decision, in the context of the European and US laws and decisions, is a substantive project outside of the scope of this document, which was written as a citizen over the course of a weekend. This document can therefore only highlight some potential issues identified in an initial examination of the proposed “Privacy Shield” and does not constitute a final or deep review.
The European Commission and the US government, as well as some lobby groups, have extensively promoted the positive sides of “Privacy Shield” and the improvements compared to the previous “Safe Harbor” system. I will not repeat these points in this document. Instead this document focuses on possible problems, shortcomings and issues of the proposed system, to allow an overall balanced view.
The level of knowledge varies between persons requesting this document. Unfortunately this means that some elements may be irrelevant, too generalized or explained in very simple terms for experts in the field of data protection and/or EU law.
In the following comments I am primarily (but not exclusively) focusing on a legal analysis. As an initial political comment, I would therefore like to highlight that I am of the view that the EU and the US should reach an agreement that replaces “Safe Harbor”. The aim of case C-362/14 was to create a situation where the political leaders on both sides of the Atlantic have to work towards a new deal that remedies the obvious problems disclosed by Snowden. I unfortunately feel that the current policy makers within the European Commission have not seen this situation as an opportunity to work towards an improved framework that would protect the fundamental right to privacy, but instead as a problem, that shall now be swept under the rug.
1. PRIVATE SECTOR / PRIVACY SHIELD PRINCIPLES
1.1. System of Article 25 and 26 of Directive 95/46/EC
Directive 95/46/EC is based on a simple system: Free flow of data, but only within a sphere of minimal data protection standards. The directive implements this approach within the EEA, but also allows third countries that provide “adequate protection” to “join”.
In simple terms: It would not make sense to block data transfers to countries that have EU-like data protection laws (like Switzerland), but data protection laws are also useless, if data can be transferred to countries where they are not adequately protected.
1.2. “Two Basket” Approach
Consequently, directive 95/46/EC implements a “two basket” system to generally allow data transfers: (1) Controllers in third countries that provide “adequate” protection can access the common market without any restrictions. (2) Controllers in other countries must ensure similar protection though (more complicated) contractual arrangements that basically expand EU privacy protections to third countries. The core question is therefore in which of these two baskets the US should be placed. Up until now, only five non-European countries have been granted direct access to the European market.
Independent from these two options for data transfers, the directive foresees a number of derogations in Article 26 when data must be sent to a third country, for example to fulfil a contract. Despite the common rhetoric, these cases are not examples of cases that can only be solved through a new adequacy decision, like the proposed “Privacy Shield”.
Example: A hotel booking in the US or financial transaction will always be covered by the derogations in Article 26 of Directive 95/46/EC.
1.3. New Standard of “Adequacy” in C-362/14
Article 25(1) of Directive 95/46/EC allows third country transfers only when the third country in question provides an “adequate level of protection”. Some original drafts of the directive used the word “equivalent” protection. In C-362/14 the CJEU has defined the word “adequate” in Article 25(1) to mean “essentially equivalent” protection in the light of the directive and the Charter. This is the primary benchmark for any new Commission decision under Article 25.
“Essentially equivalent” is generally seen as providing a much higher standard than “adequate”. “Adequate” had little legal meaning, while equivalence now allows simply comparing any law or set of rules with Directive 95/46/EC and the guarantees of Articles 7 and 8 CFR. If, put side-by-side, the two sets of rules are not “essentially equivalent”, the Commission cannot make an adequacy finding.
There is an element of flexibility, as the CJEU has highlighted that a third country may use e.g. different means of implementing the level of protection. However, the overall level may not allow circumventing the level of protection within the EEA.
1.4. The “Privacy Shield Principles” (PSPs)
1.4.1. Initial Comments
Given the CJEU’s definition of “adequate” as “essentially equivalent” it would have been reasonable to base new self-certification principles on Directive 95/46/EC or the “Standard Contractual Clauses”. Instead the proposed “Privacy Shield Principles” (hereinafter called “PSPs”) are obviously based on the previous “Safe Harbor Principles” and have even used the exact same wording as the “Safe Harbor Principles”. This approach of basically upgrading “Safe Harbor”, instead of redrafting a system based on Directive 95/46/EC and the judgement C-362/14, seems to have led to a fundamentally inadequate proposal.
In addition, the short-sighted approach by the Commission is further highlighted by the fact that the upcoming European Data Protection Regulation (GDPR) will in many ways raise the level of protection within the EEA. It will therefore be necessary to revisit the question of adequate protection in about two years when the GDPR enters into force. The GDPR uses in essence the same third country transfer rules as the current Directive 95/46/EC and equally requires “adequate protection”. The CJEU judgement in C-362/14 will therefore equally apply.
This short-sighted approach is especially troublesome, given that the negotiators had every leeway to adopt more restrictive PSPs. Contrary to the “surveillance” issue, there seems to be no legitimate conflicting interest of the US or conflicting laws in the private sector. It leaves the impression that the United States has succeeded in obtaining an unfair competitive advantage for its IT industry and the European Commission was unable to create a level playing field.
1.4.2. Review by Prof. Boehm
In the course of the procedure C-362/14, Prof. Franziska Boehm made a detailed assessment that compared the “Safe Harbor Principles” with Directive 95/46/EC. As most elements of the “Safe Harbor Principles” were copy/pasted and renamed PSPs, I would assume that most assessments done by Prof. Boehm will also be helpful to assess the “essential equivalence” of the PSPs. The review is available at http://www.europe-v-facebook.org/CJEU boehm.pdf.
1.4.3. Examples of inadequate Protection
The PSPs and the attached explanations (previously called “FAQs” to the “Safe Harbor Principles”) must in all details be compared to the relevant rules in Directive 95/46/EC, which would exceed the scope of this paper. Therefore I would like to only highlight some core differences in the two systems:
1.4.3.1. Example 1: Processing Principles
Definition of the “Purpose”
The notion of a “purpose based processing” is the backbone of EU data protection law. Almost all material limitations under Directive 95/46/EC relate to the initial purpose of a processing operation. The key element under EU law is that the purpose has to be “specified, explicit and legitimate”. An overly broad purpose would render the whole system of EU data protection ineffective, as the principles and limitations are based on specific processing operations for specific purposes. The precise definition of a specific purpose is the backbone of the law.
The PSPs do not foresee a specific purpose, but allow for any kind of purposes,1 including absolutely generic purposes like “We use all of the information we have to help us provide and support our Services.”2 All other principles are consequently easily undermined: Data will e.g. always be “relevant” for an overly broad purpose, and never be “incompatible” with such a purpose which renders the first two elements of the “integrity” principle meaningless.
Further Principles of Article 6
If the “data integrity and purpose limitation” principle is further compared with Article 6 of Directive 95/46/EC it seems obvious that the following crucial elements are simply missing:
- The “fair and lawful” processing,
- The “specific, explicit and legitimate” purpose (see above),
- The requirements of “adequate” and “non-excessive” processing,
- The minimization principles (“no longer than is necessary”)
Summary – Processing Principles
Overall the PSPs therefore resemble some elements of Directive 95/46/EC in this respect, but as the PSPs allow the definition of “generic” purposes these limitations are easily undermined.
The vast majority of elements in Article 6 of Directive 95/46/EC are however missing in the PSPs. Only the elements of Article 6(1)(d) seem to be essentially included in the PSPs.
1.4.3.2. Example 2: Legitimacy of Processing
The PSPs are based on an assumption that a controller has an absolute right to process data of any data subject without the need to legitimize any processing operation. Processing of non-sensitive3 data is always legitimate. The PSPs are missing any general requirement to legitimize processing. In other words: The PSPs operate on the assumption that “everything is allowed” and that “unlimited processing” is the rule.
Contrary to the PSPs, Article 7 of Directive 95/46/EC and Article 8 CFR are based on the concept of a general processing ban, unless one of the limited requirements of Article 7 is met. The PSPs are consequently missing one of the core elements of EU data protection law and Article 8(2) of the Charter of Fundamental Rights.4
Notice & Choice
The PSPs are trying to cover this fundamental non-regulation by introducing a so-called “Notice & Choice” system. However, the limitations in the “Notice & Choice” system used in the PSPs to limit data processing are extremely narrow, so that the vast majority of processing operations that would be prohibited under Article 7 of Directive 95/46/EC are not even regulated under the proposed PSPs.
The “notice” principle merely requires that the data subject be roughly informed about the processing operations, but has no limiting factor. It is comparable to the notice duties in Section IV of Directive 95/46/EC. Similar duties exist under some US state laws. It is therefore irrelevant in respect to the legitimacy of processing.
The relevant “choice” principle allows an “opt-out” of only two specific processing operations (“disclosure” and a “change of purpose”). All other typical processing operations (e.g. collection, usage, storage) that are subject to Article 7 of Directive 95/46/EC are not even remotely subject to consent or any other requirement set forth in Article 7 of Directive 95/46/EC or Article 8(2) CFR. In other words: The vast majority of day-to-day processing operations are absolutely not regulated under the “Notice & Choice” principle of the PSPs.
Practical Issue 1: Undermining “Notice & Choice” even in these two situations
In practice, the “Choice” principle was undermined by the privacy policies of “Safe Harbor” companies. It is to be expected that “Privacy Shield” participants will continue to use the same techniques to undermine the PSPs:
1) Generic Purpose – As explained before, the PSPs do not require a “specific” purpose. US industry is therefore typically using very generic and broad purposes like “We use all of the information we have to help us provide and support our Services.”5
Under the PSPs, this means that the data subject will typically not have any “choice”, as the purpose is so broad as to include any possible future use of personal data. In other words: The purpose is so wide that a “change of purpose” will never be necessary.
Practical Issue 2: Cases with no Direct Contact
Some commentators have also raised another practical question: A US controller that directly engages with EU data subjects typically falls directly under Directive 95/46/EC – this is even more true under the “market principle” of the upcoming GDPR. The PSPs will therefore be mainly applicable in B2B data transfers, where the US-based entity has no direct contact with the data subject. This raises the question how a “Notice & Choice” system will operate in practice, when there is no direct contact between the controller or processor and the data subject that should give “notice” and exercise “choice”.
Summary – “Notice & Choice”
Consequently there is absolutely no limitation under the PSP that would be even remotely “equivalent” to the limitations foreseen in Article 8 CFR and Article 7 of Directive 95/46/EC.
1.5. Other Issues
In addition to the core problems identified above, there are some minor issues that could potentially undermine the Privacy Shield.
1.5.1. Scope: EEA loophole
According to the PSPs, only data originating from EU member states is covered by the PSPs. For example: The “Notice” principle makes clear that the commitment of certified companies only covers “personal data received from the EU“ and the definition of “personal data” in I(8)(a) of the PSPs is limited to data “received … from the European Union”.
At the same time the proposed adequacy decision by the European Commission would cover any transfer from the entire EEA, including Norway, Liechtenstein and Iceland.
If data is sent e.g. from a Norwegian controller to a “Privacy Shield” company in the United States the principles clearly do not apply while the Commission decision does legitimize the transfer. This obvious mistake makes a challenge to the Commission decision on purely formal grounds possible and can be fixed by amending the PSPs accordingly.
1.5.2. Anti “Right to be Forgotten” Clause
In the “Supplemental Principles” the PSPs foresee “journalistic exceptions” (see Section III.2 of the PSPs) that basically target the CJEU ruling in C-131/12 (“Google Spain”).
While the PSPs are limited whenever US law collides with the PSPs, this exception limits the application of the PSPs even further and gives certified companies a “wild card” to ignore the PSPs whenever they can claim protection under the First Amendment of the US constitution. Given the very broad interpretation of the First Amendment, which also covers companies’ right to “speak”, this would mean that certified companies could in many cases violate the right to privacy and data protection guaranteed under EU law. For example, an organization may disclose personal information, whenever it wants to exercise its “freedom of expression” within the rather broad meaning of the First Amendment.
The text of the principles indicates that the PSPs would otherwise conflict with US law. This is not correct. A certified company could easily limit its exercise of the First Amendment in a private contractual arrangement like the PSPs. Many US legal instruments, such as the widely used “non-disclosure agreements”, are based on such contractual limitations.
I doubt that such a “First Amendment wildcard” can function in a system that needs to provide “essentially equivalent” protection as EU law – which naturally includes the CJEU case law in C-131/12 (“Google Spain”).
2. PRIVATE SECTOR REDRESS
2.1. Overall Approach
The enforcement system of “Privacy Shield” seems impressive at first sight. The data subject has about six options to choose from. However, if these options are reviewed in detail it becomes apparent that all but one option do not lead to a right to an enforceable decision.
To reach this one option that leads to a right to an enforceable decision, the data subject has to first make use of all other remedies. The system is therefore more adequately described as a labyrinth that lets the data subject jump through a large number of hoops.
[Figure: The steps necessary to obtain a binding decision against Facebook Inc. Non-binding or non-enforceable steps in grey. Numbers [1, 2, 3…] relate to the following text.]
2.2. Duty to Communicate
First, the data subject has to address the self-certified company [1]. The company has to respond within 45 days (= 1.5 months), which seems an extremely long period for a simple response. Obviously the company can send back any response it pleases. This is therefore not an enforcement option, but a mere duty to communicate by both sides.
2.3. Alternative Dispute Resolution
Secondly, the data subject can refer to an ADR body [2], chosen and paid for by the company. This obviously creates an incentive for the ADR body to be company-friendly, which means that the ADR bodies are clearly not “independent”. The procedure does not have to be held in the language of the data subject.
There are very limited requirements for the ADR procedure. Dispute resolution or arbitration is generally not allowed in European consumer contracts (see Letter q of the Annex to the “Unfair Terms” Directive 93/13/ECC). In addition, the ADR body has no right or option to investigate a participating organization. In data protection disputes this would effectively mean that a procedure will be merely based on allegations of both parties, without any realistic option to obtain the relevant evidence e.g. through an on-site investigation of an organization. Typically this means that the data subject will be unable to prove his claims.
In any event, the outcome of the ADR procedure is not binding and not enforceable. Recital 33 of the Commission Decision instead only states that the ADR body must “notify” non-compliance with a decision to the Department of Commerce (DoC) or the FTC [4 and 5].
The FTC has no duty to take action in each individual case. According to Recital 38 the DoC shall only remove an organization from the “Privacy Shield” list as a consequence.
In summary, the ADR option does not lead to a binding remedy for the data subject, and there is no effective detection and supervision mechanism of which the data subject could make use.
Thirdly, a data subject may refer to the local DPA [3] to raise a matter with the DoC [4] or the FTC [5]. Given the potentially numerous requests for referral, the DPAs may have an incentive not to be the “post box” for data subjects. In the underlying procedure to C-362/14, the Irish DPA has for example not referred to the FTC, despite a “Memorandum of Understanding” between the DPA and the FTC8.
Recital 45 of the Commission Decision highlights that the data subject can seek judicial redress within the Member State [3a] if the DPA does not, under national law, address a complaint by the data subject. This is however not necessarily reflected in Member States’ laws.
For example, the Austrian DPA could only be addressed through an informal “petition” under § 30(1) DSG 2000. It seems questionable if the Austrian DPA has any duty that goes beyond the mere information on how it has dealt with the petition, such as a duty to refer the case to the FTC or DoC.
It seems that Recital 45 of the Commission Decision is not in line with the relevant Article 28(4) of Directive 95/46/EC. It may very well be that certain DPAs are not adequately referring cases to the DoC and FTC and the data subject has no option to force the DPA to take such an action.
2.4.1. Department of Commerce
The DoC [4] has pledged to monitor the formal registrations of “Privacy Shield” organizations. But neither the Recitals nor the letters in Annex I to the proposed Decision indicate that the DoC has any powers or intention to monitor the actual privacy practices “on the ground”. The DoC seems to limit its supervision to attempts to keep the “Privacy Shield List” in order, which is a purely formal supervision of a list – not a supervision of the actual processing operations of certified companies.
In addition, none of the documents seem to indicate that the DoC has any real enforcement or investigative powers under the “Privacy Shield” system. The only power of the DoC seems to be to delist organizations that have persistently failed to comply with the PSPs. This does not in itself prevent organizations from further using the relevant data and, above all, does not constitute a remedy for the individual data subject, but is, at most, a disciplinary measure.
In summary, the DoC has neither investigative nor enforcement powers to supervise and monitor the factual data processing, to enforce the PSPs against the relevant organizations or to provide for individual remedies. Obviously the DoC is also not “independent” but has e.g. the mission to “expand the U.S. economy through increased exports”9 or “support a data-enabled economy”.10
2.4.2. Federal Trade Commission
The FTC [5] has pledged to review complaints received via DPAs, while complaints by data subjects will only go to a database without any individual review. The FTC has enforcement and investigative powers, but has consistently maintained that it cannot and will not enforce each “Privacy Shield” complaint.11 Neither the DPAs nor the data subject have a right or remedy that would ensure that the FTC addresses complaints about “Privacy Shield” companies.
The FTC is often cited as the US enforcement authority that has enforced countless “Safe Harbor” violations. If this claim is reviewed in detail, it is apparent that the FTC has so far primarily enforced “false claims” of participation – purely formal violations. Under “Privacy Shield” this will primarily be done by the DoC. In three cases, the FTC has found material violations – all of them were primarily based on Section 5 FTC Act. In all three cases “Safe Harbor” violations were essentially more of a footnote than the core of the cases. All cases were violations “on the face” of the product and did not need any factual investigation. All cases were settled.
In addition, at first view, the powers of the FTC seem to be limited to subpoenas and testimonies to uncover facts.12 The FTC seems to lack the power to undertake “raids” or on-site audits, which would allow the FTC to investigate, supervise and monitor the factual processing operations of participating “Privacy Shield” organizations. This aspect seems to require further review to see if the FTC has similar powers as DPAs under Article 28(3) of Directive 95/46/EC. It was however impossible to find any case or law that would indicate that the FTC has powers to do on-site investigations, which are crucial in any data protection case.
Especially in comparison with the GDPR, but also under Directive 95/46/EC, it remains questionable whether FTC enforcement meets the standards of European law.13 Even if the FTC were to grant data subjects a right to complain, the FTC’s limitations are considerable.
In summary the FTC may now at least get knowledge of complaints related to the “Privacy Shield” that are referred via DPAs. However, the FTC maintains that it does not grant any right to data subjects to have complaints investigated – let alone remedied.
2.5. Privacy Shield Panel
Fourthly, once the data subject has made use of all the options above, the data subject can finally address the “Privacy Shield Panel” [6]. Given the very complicated multiple steps above, it is very likely that the vast majority of data subjects (who are typically not represented) will have given up at this point. In any case, a data subject will only reach the “Privacy Shield Panel” after roughly a year, given the numerous deadlines of 45 or 90 days in the proposal and the unclear deadlines for actions of European DPAs in many member states.
While there are again a number of procedural hurdles and limitations (e.g. location in the United States, only an indirect “video link” for consumers, no compensation of attorney fees for the data subject, no financial compensation awarded, no investigative powers) the main issue with the “Privacy Shield Panel” is (again) that there is no directly enforceable decision or remedy. Again, the only consequence of non-compliance is the duty of the panel to inform the DoC and the FTC, which may take actions that do not remedy the rights of the data subject (e.g. delisting).
2.6. Federal Arbitration Act
Finally, recital 47 of the proposed Decision refers to the option to have the decision brought before a US court [7] under the US Federal Arbitration Act, which would then finally lead to an enforceable decision if an organization otherwise fails to comply with the decision. It is unclear how long, costly and complicated this final step would be for a European consumer.
2.7. Summary – Redress
If reviewed in detail, the redress options of “Privacy Shield” do not give the data subject a variety of options to choose from, but send the data subject on a hugely complicated path through at least seven different steps at different institutions.
Especially in comparison to the solution for human resource data, where US organizations are voluntarily subjecting themselves to the jurisdiction of European DPAs, or “Standard Contractual Clauses” that equally try to expand the reach of European DPAs to third countries, it seems questionable why the Commission was unable to reach a system that provides for a direct supervision, investigation and redress mechanism.
The system seems to be designed mainly to make it complicated for data subjects to obtain a right to enforceable redress. It is everything but “effective” as required by the CJEU in C-362/14. It violates the spirit of EU consumer protection laws, which have prohibited arbitration in the consumer context for 23 years. It does not seem to reflect any element of Article 8(3) or Article 47 CFR, nor is it by any means “essentially equivalent” to the system of Directive 95/46/EC.
It may very well be that “Privacy Shield” organizations comply with decisions of an arbitrator in practice to avoid “delisting” by the DoC or a possible enforcement action by the FTC, but from a comparative standpoint this “indirect” system is simply not legal redress or remedy for the individual data subject.
More importantly only the FTC seems to have the necessary powers to even remotely investigate organizations, but the FTC seems to lack a right to do on-site investigations. Even if the FTC had ways to investigate the factual situation, the FTC has refused to address all individual complaints under “Privacy Shield”, which makes this option irrelevant from the perspective of a right to individual redress. None of the other proposed panels, bodies and departments seem to even remotely have the same investigative powers (CJEU: “detection”) of European DPAs under Article 28(3) of Directive 95/46/EC.
The “Privacy Shield” system therefore obviously lacks “effective detection and supervision mechanisms”14, especially of factual processing operations, as required by the CJEU in C-362/14, and does not provide for any “control by an independent authority” as required under Article 8(3) CFR. The proposed system provides for one way to pursue a legal remedy in the private sector (CJEU, Paragraph 95, C-362/14: “not providing for any possibility”) and it seems very questionable if this way is “effective”. It also seems unlikely that the seven steps to be taken by a European data subject are in any way “essentially equivalent” to EU law.
3.1. Initial Comments
There are a number of arguments that were continuously raised in the debate over “Privacy Shield”. The following paragraphs should address them before diving into the core issues.
3.1.1. Two Requirements
In addition to the “essentially equivalent” test that is derived from the word “adequate” in Article 25 of Directive 95/46/EC, any EU legal provision has to be interpreted in the light of the primary law – in this case, mainly the Charter of Fundamental Rights (CFR). According to Article 53 CFR, the rights under the Charter are to be interpreted in line with the ECHR. In summary, laws of a third country, which violate EU standards of the CFR and the ECHR, cannot be interpreted to be “essentially equivalent” or “adequate” in the light of higher ranking law.
3.1.2. Review on CJEU and ECtHR Case Law
Furthermore, the European Commission and a large number of commentators have primarily focused on the CJEU’s judgement in C-362/14. However, a new instrument like “Privacy Shield” must be reviewed in the light of the much broader pool of European case law on surveillance. If a law in the United States is e.g. not violating the essence of the right to privacy (as in C-362/14), but is simply disproportionate (as in the joined cases C-293/12 and C-594/12 on “data retention”), a decision would be equally invalid under the law. Equally, the ECtHR case law must be taken into account (e.g. the recent judgements in Szabó and Vissy v. Hungary or Zakharov v. Russia). A valid adequacy decision requires comparison with the CJEU’s (and ECtHR’s) case law.
3.1.3. EU Member States Irrelevant
On a regular basis, proponents of the “Privacy Shield” try to ignore this element of the EU legal order and basically argue that if some member states (e.g. the UK, Germany or France) have surveillance laws that are “essentially equivalent” to the laws of the United States, the requirements of the CJEU judgement would be fulfilled. This approach ignores the fact that EU fundamental rights do not directly apply to member states in the area of national security and that the CJEU has interpreted the fundamental rights under the CFR more restrictively than many courts have interpreted constitutional protections in the member states. The current situation in the European Union is similar to the situation in the United States, where the “Bill of Rights” only covered the federal government until the Supreme Court expanded the applicability to the individual states step by step since the 1920s.
In other words: Surveillance systems of some (by far not all) member states may not pass the CJEU’s case law or the CFR requirements, but as the EU lacks jurisdiction in the area of national security, member states cannot be held directly accountable at the CJEU.15 Other member states or the EU as a whole cannot be made responsible for the misconduct of these member states.
3.1.4. Abstract Review
Given that (1) the CJEU has been criticized for basing its findings on a primarily abstract review in C-362/14, mainly holding that the European Commission has not made sufficient findings, (2) the generally very murky world of intelligence and (3) the fact that the proposed “Privacy Shield” again lacks any clear description of US surveillance programs, it seems necessary to highlight that European law (see ECtHR case law) is generally assessing such laws in the abstract. The ECtHR assumes that basically everything a law allows is done. This is the only way secret surveillance can be challenged and avoids the “standing” issue known from US law. In addition “internal” and non-public guidelines or rules are not recognized under these standards.
In the absence of detailed factual information from the US government and the abstract approach of European law, I would therefore like to primarily focus on the abstract legal situation under US law for this document.
3.1.5. Factual Access / Definitions
The US government does not clarify the exact factual activities it takes in the area of surveillance. Terms like “collection”, “acquisition”, “targeting”, “collection in bulk”, “use” or “search” are used contrary to their ordinary meaning. For example, “collection” seems to be used as the final search operation of an official within a system, not the initial “collection” of data from a service provider.
It seems that, under US definitions, a surveillance system that collects massive amounts of data is described as “targeted”, as long as an official has to “search” for individual data within the system through a “selector”, like an email-address, telephone number or name.
Under this definition even Google Search of a person would be “targeted”, despite the fact that it gives a user the ability to access almost the entire public internet.
In contrast, Article 8 CFR uses the word “processing”, which is to be interpreted in line with Article 2(b) of Directive 95/46/EC. In the case of private-public mass surveillance, this especially includes “making [data] available”. A data subject need not prove that their data has in fact been “pulled” or even reviewed, used or analyzed by the US authorities, but merely that it has been “made available” (e.g. through a “back door” or some other software interface as required under 50 USC § 1881a (h)(1)(a) “facilities … necessary to accomplish the acquisition”). This is also in line with the rationale of the CJEU case law in C-293/12 and C-594/12, where merely the “collection” (the step before “making [the data] available”) was seen as a disproportionate violation of the CFR.
Contrary to the US Fourth Amendment and US definitions, European data protection and privacy law covers a very wide field of processing operations. This allows avoiding the need to speculate about details of US surveillance programs, which the US government does not disclose and limits any issues around standing. Within this paper the common European terminology will be used.
3.1.6. Factual Number of Targets
In recital 69 of the draft decision, the European Commission noted that “access requests through NSL and under FISA … only concern a relatively small number of targets when compared to the overall flow of data on the internet.” On page 11 of Annex VI to the draft decision, Robert Litt (General Counsel of the ODNI) refers to information released by the US government of about 90,000 individual targets. At the same time the FISC noted on page 29 of a disclosed ruling that the “NSA acquires more than two hundred fifty million Internet communications each year pursuant to Section 702”.16
In any event, the legal value of such numbers seems limited: First, the CJEU has found that even the Austrian retention of meta data, which concerned only 326 access requests17 and 312 actual accesses per year, was disproportionate. Secondly, the statement is limited to FISA and NSL accesses, which means it does not cover programs under EOs or any other legal bases. Thirdly, it remains unclear what kind of processing (e.g. “making available”, “storage” or access by a person) is counted.
3.1.7. Legal Value of the “Letters”
The legal quality of the “letters” attached to the draft decision seems questionable. Annex VI in particular does not seem to amount to a “commitment” or a document that “ensures” the protection of data subjects within the meaning of Article 25(6) of Directive 95/46/EC. When the Commission portrays Annex VI as “assurances” (Recital 69) it is clearly misrepresenting the text.
The letter by the General Counsel of the Office of the Director of National Intelligence (basically a lawyer of the DNI) seems to be better described as a legal summary or a legal opinion. Consequently the letter by Robert Litt clarifies on page 1 of Annex VI that the document “summarizes the information that has been provided” by the United States. There is obviously a substantial difference between “information” that is provided to the Commission and “commitments” or “assurances” under domestic law.
The letters seem to mainly function as a “filter”: Only the positive elements of US laws and practices are mentioned in them and attached to the Commission decision.
Like any government, the US is not highlighting criticism and loopholes in its laws and practices. Equally, the letters from the US government often make rather generic and misleading claims about US laws and practices. This includes for example statements about the Fourth Amendment,18 which does not apply to non-US persons, or one-sided representations of the proposed “Privacy Shield” and US surveillance laws.19
For a solid assessment, it is therefore unfortunately necessary to review the exact laws and executive orders, and not to rely on the representations of the US government alone. It is necessary to “add” the non-existent protections in certain situations or obvious limitations of the mentioned protections to get an adequate picture.
3.1.8. Other Laws
It is necessary to highlight that the US legal order is much more complex and includes the laws of 50 states and many different areas of law. Many other laws may allow or even require data sharing that could potentially go beyond what would be acceptable under European law. Examples include the “Cybersecurity Information Sharing Act” or recent plans of the US government to allow the sharing of raw data intercepted by the NSA with other government agencies.20 It is unclear if any of these other laws or executive orders and practices may conflict with EU law, but it seems that a much broader review would be necessary to ensure that “Privacy Shield” will not be subject to a challenge in the light of any of these US provisions.
It is the core shortcoming of a “self-certification” – and indeed any contractual system – that this approach requires a full review of any possible conflicting legal obligations. Especially the lack of any baseline privacy rights (e.g. under constitutional law) makes this task almost impossible with respect to the United States.
3.2. General Exception in “Privacy Shield”
Just like the “Safe Harbor Principles”, the proposed PSPs include the very same general exception for any conflicting US law, regulation or court order: “Adherence to these Principles may be limited: (a) to the extent necessary to meet national security, public interest, or law enforcement requirements; (b) by statute, government regulation, or case law that creates conflicting obligations or explicit authorizations”.21 It is crucial to highlight that the limitation in this clause (“…limited to the extent necessary…”) only covers “authorizations”, not “obligations” under US law. This means that whenever there are any obligations under US law (even just a “city ordinance”), the PSPs do not apply. It follows from this provision that all US statutes, government regulations and case law must be reviewed to assess whether the proposed “Privacy Shield” is compliant with Article 25 of Directive 95/46/EC and the CFR.
3.3. Section 702 & EO 12.333
This document focuses on 50 U.S.C. § 1881a (“Section 702”),22 relevant for the “PRISM” program, and EO 12.333, the primary basis for surveillance outside of the US territory. It remains unclear if there are any further legal bases for mass surveillance by the US government.
The US government and the European Commission have partly highlighted improvements in other areas, such as the changes to “Section 215” surveillance in the “USA Freedom Act”. However, these laws were not primarily focused on data stemming from the EU and were not the main concern from a European perspective. Similarly, the “Umbrella Agreement” and the “Judicial Redress Act” are not concerned with the programs and actions under Section 702 or EO 12.333. While improvements in all areas are surely to be welcomed, these changes are largely irrelevant for “Privacy Shield”.
Section 702 for example continues to allow the US government to directly “tap” into the servers of “electronic communication service providers” within the United States. Under § 1881a (g), the Attorney General and the Director of National Intelligence can provide the FISC with a so-called “certification” that describes a surveillance program, of which “a significant purpose of the acquisition is to obtain foreign intelligence information” (see § 1881a(g)(2)(A)(v)) but which is not “required to identify the specific facilities, places, premises, or property at which an acquisition … will be directed or conducted” (see § 1881a(g)(4)). Under this general “certification” of a surveillance program, the US government can issue a “directive” (see § 1881a(h)) that requires a service provider to “immediately provide the Government with all information, facilities, or assistance necessary to accomplish the acquisition” (e.g. a “tap” or “back door”). The term “foreign intelligence information” is very wide and includes even “information … that relates to … the conduct of the foreign affairs of the United States”, in other words: espionage (see § 1801(e)).
It is crucial to understand that the law does not target “suspects” but “foreign intelligence information” – no matter where it is stored or which person it concerns. The law is not designed around traditional concepts like “probable cause” or “reasonable suspicion”. Limitations in the law (“targeting and minimization”) are designed to “filter” information concerning US persons. There is no limitation on access to data of non-US persons. The judicial oversight via the FISC approves entire surveillance programs, not access to the data of individual suspects.
3.4. PPD-28
The core difference in US law that the European Commission relied on in the proposed “Privacy Shield” decision is the coming into force of PPD-28.23 According to the European Commission, this document is something like a “game changer” that renders the initial assessment of the European Commission from November 2013 in COM (2013) 846 final irrelevant.
3.4.1. Formal Status of PPD-28
First, I would like to address common doubts about the legal quality of PPD-28. As far as I was informed by US lawyers, PPD-28 is legally binding on the relevant executive bodies and also creates directly enforceable third-party rights. This would need further confirmation, but if this information is correct, a data subject could fully rely on this document in a court procedure.
The fact that the United States is currently almost unable to pass new laws, and the US government is therefore relying on “executive orders”, does not seem to be problematic from an Article 25 or CFR standpoint. Executive decisions can be changed just like national laws in most parliamentary systems. Despite much criticism of the formal value of PPD-28, I therefore do not see any real issue with the formal quality of PPD-28.
As the CJEU accepted “self-certification” under Article 25 of Directive 95/46/EC, it seems reasonable to accept an executive order that provides third-party rights as well. The case law of the ECtHR equally focuses on the availability, the precision and other elements of a binding text in these cases – not on its status as a “law” or another legally binding rule.
3.4.2. Aspirational and Irrelevant Content of PPD-28
In its analysis of “limitations”, starting at recital 56 of the draft decision, the European Commission does not seem to separate merely aspirational elements of PPD-28 that do not limit the operations of the US intelligence services, but are rather general political statements with no legal value, from elements that do create certain limitations.
Statements like “all persons should be treated with dignity and respect”, “all persons have legitimate privacy interests”, “the United States shall consider the availability of other information” or “civil liberties shall be integral considerations” are not “limitations” but political verbiage. None of these provisions, cited for example in recital 57 of the draft decision, prevents US intelligence authorities from conducting limitless bulk surveillance.
Equally, the European Commission and the US government take great pride in highlighting that US surveillance activities “must be based on statute or Presidential authorization” and that the US government must operate in line with the US constitution.
In fact this only restates a very basic requirement of any constitutional system. But taking into account that the Bill of Rights does not apply to non-US persons and the broad and sweeping authorizations in Section 702 or EO 12.333, this statement has practically no value. PPD-28 could equally say “US law allows us to collect virtually all data, and we stick to these laws”. The core problem uncovered by Edward Snowden was not only that the US is using massive surveillance programs, but that all of them are in fact legal under current US law. Restating this fact is not a “limitation”, but a confirmation of the concerns that led to the decision in C-362/14.
Finally, the numerous unsettling programmatic aims expressed in PPD-28 (like “the collection of signals intelligence is necessary for the United States to advance its … foreign policy interests” or “the United States must … continue to develop a robust and technologically advanced signals intelligence capability”) did not find their way into the assessment of the European Commission. Not taking these elements into account could raise questions about a failure to adequately justify the decision.
3.4.3. Definition of “Bulk Collection” / Exception for “Bulk Storage”
Recital 59 of the draft decision seems to mention the definition of “collection in bulk” in Footnote 5 of PPD-28, but does not seem to take the limited definition into account.
According to Footnote 5 of PPD-28 “the limitations contained in this section do not apply to signals intelligence data that is temporarily acquired to facilitate targeted collection. References to signals intelligence collected in “bulk” mean the authorized collection of large quantities of signals intelligence data which, due to technical or operational considerations, is acquired without the use of discriminants (e.g., specific identifiers, selection terms, etc.).”
This means that the United States confirms that it collects, acquires and/or stores content data in the total absence of the limitations in PPD-28. While the scope of this definition is not further explained, its existence raises serious doubts about “limitless” collection. This comes with an unspecified time frame (“temporarily”). It could be interpreted in the sense that all data collected in bulk could be stored for a period of time until some form of “targeting” is conducted.
As the “limitations” of PPD-28 do not apply to “bulk storage” of content data within the reach of the US government, it seems that Section 2 of PPD-28 does not even limit Section 702 collection as the US government maintains that the data is at some stage “targeted” by “selectors”. Under this definition Section 702 collection is not even “collection in bulk” and not limited to the six cases listed in Section 2 of PPD-28.
In any event, Section 2 of PPD-28 does not limit Section 702, EO 12.333 or any other legal basis in a way that they would be in line with the CJEU rulings in the joined cases C-293/12 and C-594/12. In these cases the mere collection and storage of meta data outside of the reach of European governments (with later possibilities of access) was found to be disproportionate under the CFR. Section 2 of PPD-28 does furthermore allow unlimited acquisition of content data.
3.4.4. Limitations on “Bulk Collection”
Even in the cases that are not exempt from Section 2 of PPD-28 through Footnote 5 (non-temporary bulk acquisition of data), the US government reaffirmed that it collects this data in bulk. PPD-28 further affirms that “the collection of signals intelligence in bulk may consequently result in the collection of information about persons whose activities are not of foreign intelligence or counterintelligence value.”24
While there is obviously no limitation on the collection and storage of such data, the “use” (and only the “use”) of this type of information is limited to the purposes of detecting and countering:
- espionage and other threats and activities directed by foreign powers or their intelligence services against the United States and its interests;
- threats to the United States and its interests from terrorism;
- threats to the United States and its interests from the development, possession, proliferation, or use of weapons of mass destruction;
- cybersecurity threats;
- threats to U.S. or allied Armed Forces or other U.S. or allied personnel; and
- transnational criminal threats, including illicit finance and sanctions evasion related to the other purposes named in this section.
Many of these six purposes seem to be very broad and not precisely defined and the aim of “detection” of “threats” allows relatively unlimited pre-emptive use of anyone’s data.
In addition to this positive list of permissions, PPD-28 also lists a couple of prohibitions:
- suppressing or burdening criticism or dissent; disadvantaging persons based on their ethnicity, race, gender, sexual orientation, or religion;
- affording a competitive advantage to U.S. companies and U.S. business sectors commercially (while according to footnote 4, espionage for some economic purposes is allowed);
- or achieving any purpose other than those identified in this section.
PPD-28 further strives to limit the storage period to 5 years, “unless the DNI expressly determines that continued retention is in the national security interests of the United States” and sets forth some limitations on the dissemination of data.
None of these limitations apply to “bulk collection” that is later targeted as described in footnote 5. Given this obvious situation under US law and the clear wording of PPD-28, it seems absolutely inexplicable how the Commission can find in recital 69 of the draft decision that “insofar as personal data to be transferred under the EU-U.S. Privacy Shield are concerned, these authorities equally restrict public interference to targeted collection and access” when Annex VI says that there is admitted “bulk collection” in six cases and more data gathering under Footnote 5.
3.5. Violation of the Essence / US Assurances
Given the clear wording of FISA, EO 12.333 and PPD-28 under which (A) the US government obviously has access to all data stored with the relevant service providers and transmitted through the internet, (B) the US government confirms “bulk collection” and “bulk use” of this data for at least six purposes and (C) collects data and later “targets” for any other purpose, the situation under US law falls clearly within the “essence” of Article 7 CFR:
94 . In particular, legislation permitting the public authorities to have access on a generalised basis to the content of electronic communications must be regarded as compromising the essence of the fundamental right to respect for private life, as guaranteed by Article 7 of the Charter (see, to this effect, judgment in Digital Rights Ireland and Others, C-293/12 and C-594/12, EU:C:2014:238, paragraph 39).
The US has confirmed that the US authorities have “access” on a “generalized basis” (“bulk collection and bulk usage” under Section 2 of PPD-28) to the “content of electronic communications”. All elements of the CJEU judgement are clearly fulfilled in these cases.
Even in the cases that only fall under Footnote 5 of PPD-28, it seems that, if a US official can simply “target” e.g. an email-address or phone number to receive the relevant content data, he would be considered to have “access on a generalized basis” to the content of electronic communications. Personal data is simply put at the fingertips of the US government.
3.6. CFR Proportionality Test / US Assurances
The European Commission seems to have ignored the finding of the CJEU in paragraph 94 of C-362/14 and only assesses the “proportionality” of US law. For the sake of completeness this view of the Commission is analyzed below.
In recital 63 of the draft decision the Commission finds that the principles of PPD-28 and US law “capture the essence of the principles of necessity and proportionality”.
In recital 75 the Commission starkly “concludes that there are rules in place in the United States designed to limit any interference for national security purposes … to what is strictly necessary to achieve the legitimate objective in question”.
The following elements would have to be fulfilled under the long-standing CJEU case law:
- There must be a legitimate aim for a measure
- The measure must be suitable to achieve the aim (“effective”)
- There cannot be any less onerous way of doing it (“necessary”)
- The measure must be reasonable (“proportionate”)
It seems questionable if some of the “aims” are legitimate, such as espionage on the EU and the member states. There is also a broad debate on the suitability of mass surveillance for some of the aims of the United States. Both issues would go far beyond the scope of this document.
More concretely, the relevant US law does on its face neither (1) limit any interference to the minimum necessary nor (2) fulfill the “reasonableness” benchmark of the CJEU case law.
The necessity test is obviously not implemented in US law. In relation to Section 702 collection, 50 U.S.C. § 1801(e) holds the key definition of “foreign intelligence information”:
(e) “Foreign intelligence information” means (…)
(2) Information with respect to a foreign power or foreign territory that relates to, and if concerning a United States person is necessary to (…)
(B) the conduct of the foreign affairs of the United States.
Section 702 (50 U.S.C. § 1881a) does not seem to have any further limitation. While the “necessity” test is implemented for US persons, data must only “relate” to the “conduct of the foreign affairs of the United States” if the data concerns non-US persons.
In a similar fashion, PPD-28 requires that data collected in bulk is only used for certain purposes – it does however not require that the data is “necessary” for these purposes. The US government may use totally irrelevant data in totally irrelevant ways under each of these provisions, as long as the irrelevant data is not used for other purposes.
In recital 63, the Commission claims that a more targeted form of surveillance “is clearly prioritized” under Section 1, Paragraph d) of PPD-28. If PPD-28 is adequately analyzed, it does not implement a “necessity” test: bulk surveillance should only be “as tailored as feasible”, the US should “consider the availability of other information”, and more targeted surveillance should (not must) be prioritized. In addition, this provision obviously applies to the overall design of “signals intelligence activities” – but not necessarily to individual cases.
When the Commission equally claims that the limitation to certain (very broad) purposes for usage of data that was collected in bulk would “capture the essence of the principles of necessity and proportionality”, it obviously confuses the “legitimate aims” (the first level of a proportionality test) with the next three steps of a proportionality test.
The mere (1) limitation to six aims and (2) prioritization of more targeted surveillance programs is miles away from “necessity” within the meaning of the CFR. However, the Commission relies on only these two elements when making this finding in recital 63.
The issues become even more obvious when each stage of the surveillance measures (like acquisition, storage and search) is analyzed individually.
The “reasonableness” test is typically a very subjective undertaking. If one is convinced that US law passes all previous levels of a proportionality test, it would be necessary to compare US surveillance with previous case law of the CJEU and ECtHR to reach an objective result. For this the following key points would be relevant:
- The measures are aiming at very broad pre-emptive purposes such as “the conduct of the foreign affairs of the United States”, “cybersecurity threats” or “transnational criminal threats”.
- US laws require “mass collection” of content data or at least “making [the data] available” on a massive scale via programs like PRISM or UPSTREAM.
- In six cases, Section 2 of PPD-28 even confirms “mass use” of private communications in “bulk”. In these six cases, there seem to be few, if any, limitations. To the contrary, the US government even admits “bulk usage” of private content data in these cases.
- There is no limitation under Section 2 of PPD-28 in the “collection” and “storage” phase.
- The law is not limited to meta data, but includes all personal data, like private messages, content data of all kinds or sensitive account data like passwords.
- PPD-28 generally limits storage of data to five years, with a vague exception.
- Further usage of such data for other purposes is explicitly allowed in a variety of cases (see Section 4(a)(i) of PPD-28 and Section 2.3. of EO 12.333).
If only these elements (there may be many more) are e.g. compared to the CJEU ruling in the joined cases C-293/12 and C-594/12 on “data retention”, it seems impossible to take the view that US law “captures the essence of the principles of necessity and proportionality” – let alone that it is actually pursuing a legitimate aim and is necessary and proportionate under the CFR.
If one would take this view, it would automatically mean that the European Union and EU member states can pass similar laws, even when they fall under EU jurisdiction and the CFR. The view expressed by the Commission in the draft decision therefore raises considerable concerns, as it could be used as a legal precedent for the legality of such far reaching surveillance practices, even where European law applies.
5. SURVEILLANCE REDRESS
5.1. Review versus Redress
The US government and the European Commission went to great lengths to highlight various forms of oversight of US surveillance programs, spanning the executive branch, the legislative branch and the judiciary. Many of these oversight mechanisms are considerably overstated in the relevant documents. As an example: the FISC (“FISA Court”) only approves entire surveillance programs on an annual basis and does not decide on individual targets. The oversight by the political branches in the US is in practice mainly concerned with surveillance of US persons and does not represent non-US persons.
Even if these oversight mechanisms were to produce meaningful limitations of US foreign surveillance, the CJEU was concerned with judicial redress of an individual, highlighted especially in Paragraph 95 of the judgment in C-362/14. None of the “oversight” systems allows a data subject whose data was transferred from within the EEA (for example a Chinese person who uses a European service) to obtain any form of judicial redress. To be clear: even US persons typically do not have any such rights or are unable to invoke them because of “standing” issues.
Instead of any judicial redress, a tribunal or some other form of independent body, the United States has proposed an “Ombudsperson” within the State Department (the US foreign ministry). The Ombudsperson can refer matters to the various oversight bodies within the US government and raise issues internally. She does not have any power to decide or enforce matters herself.
A data subject cannot directly address the Privacy Shield Ombudsperson. The DPAs do not have a legal duty to refer matters to the Ombudsperson under Article 28(4) of Directive 95/46/EC (see above 3.4). If a DPA does not feel that a case is to be referred, it seems that a data subject does not have any remedy under European law. Consequently a complaint by a data subject may in certain cases not even reach the Privacy Shield Ombudsperson.
The Ombudsperson (Catherine Novelli25) remains Under Secretary of the State Department26, which is simply a third-layer head of department in the US foreign ministry. While the draft decision highlights that the Ombudsperson is independent from the intelligence services, she is obviously a normal element of the US administration. This is not just miles away from the “control by an independent authority” required under Article 8(3) CFR, but could in no way be considered a system that provides the “right to effective judicial protection, as enshrined in Article 47 of the Charter” as required by the CJEU in paragraph 95 of C-362/14.
5.2.3. Response by the Ombudsperson
The most obvious shortcoming of the Ombudsperson is the response it will provide to the individual data subject via the relevant DPA. The response consists of three elements:27
First, the Ombudsperson will confirm that the complaint has been properly investigated.
Secondly, it will neither confirm nor deny whether the individual has been the target of surveillance nor will the Ombudsperson confirm the specific remedy that was applied.
Thirdly, the Ombudsperson will confirm that the U.S. law, statutes, executive orders, presidential directives, and agency policies, providing the limitations and safeguards described in the ODNI letter, have been complied with or, in the event of non-compliance, that such non-compliance has been remedied.
Interestingly, Paragraph 4(e) of Annex III can be read to say that the Ombudsperson will not even say whether the US authorities have complied with the law, but merely that either they have, or the situation was remedied. This “response” therefore amounts to a simple “standard letter”, no matter what the Ombudsperson and the US government have or have not done. There is no way to exercise scrutiny or to review such a letter. No matter what the situation may have been, the Ombudsperson will always respond in the same way.
5.2.4. Summary – Ombudsperson
The “Ombudsperson” may be able to forward concerns to the relevant US agencies and thereby remedy issues in practice. However, the overall function is best described as a mere “post box” that forwards concerns. Under the provision in Paragraph 4(e) of Annex III, the response does not amount to more than a “standard letter” – no matter what the facts of a case may have been. If this is compared to the requirements expressed by the CJEU in C-362/14, there is no doubt that “Privacy Shield” is not in line with the judgment:
95 Likewise, legislation not providing for any possibility for an individual to pursue legal remedies in order to have access to personal data relating to him, or to obtain the rectification or erasure of such data, does not respect the essence of the fundamental right to effective judicial protection, as enshrined in Article 47 of the Charter. The first paragraph of Article 47 of the Charter requires everyone whose rights and freedoms guaranteed by the law of the European Union are violated to have the right to an effective remedy before a tribunal in compliance with the conditions laid down in that article. The very existence of effective judicial review designed to ensure compliance with provisions of EU law is inherent in the existence of the rule of law. (…)
5.3. Right to Access
In addition to the right to redress, the CJEU highlighted the need for “legal remedies in order to have access to personal data relating to him” (C-362/14, paragraph 95).
The proposed “Privacy Shield” seems to address this matter by reference to the US “Freedom of Information Act” (FOIA) – it is not a “right to access” as foreseen under EU law, but allows access to government documents in practice. However, Annex III to the draft “Privacy Shield” decision also clarifies that the FOIA has a number of limitations to the right to obtain government information. Typically these exceptions will apply in any case of US surveillance. The right to access is subject to limitations in Directive 95/46/EC as well, but must be limited in a proportionate way, in line with the CFR. It therefore seems questionable if “Privacy Shield” is in line with C-362/14 in this respect and further review of the US FOIA would be required.
6. RIGHTS OF EUROPEAN DPAs
In recital 44 and Article 3 of the draft decision, the European Commission seems to grant European DPAs unfettered rights to suspend data flows under Article 28(3) of Directive 95/46/EC.
If individual DPAs can suspend data flows under “Privacy Shield” based on their individual assessment of the level of protection afforded in the United States, the whole rationale of an adequacy decision under Article 25(6) of Directive 95/46/EC, namely legal certainty and consistent rules for third-country transfers, would be undermined.
Some representatives of the Commission have argued at public hearings that this right is limited to the suspension of transfers on grounds other than inadequate protection in the United States under “Privacy Shield” – a limitation that does not seem to be reflected in the text of the draft decision. On other occasions, this exception was instead argued to implement the CJEU ruling in C-362/14 (see especially paragraphs 51 to 66).
It may be helpful to understand that the fact pattern in C-362/14 resulted from an unfortunate combination of national Irish law (which allowed the DPA to reject a complaint it found to be “unsustainable in law”) and EU law.
To my personal understanding, the CJEU quite clearly held that a DPA must be able, and indeed has a duty, to investigate a complaint in which a Commission decision is challenged. Equally, a Commission decision can allow DPAs to suspend data flows in specific situations (as foreseen in the very narrow Article 3 of the “Safe Harbor” decision, as well as in other adequacy decisions). However, a DPA remains bound by the decision until it is declared invalid by the CJEU. This is apparent from paragraph 52 of the CJEU judgment in C-362/14.
Including an exception in adequacy decisions for specific extreme situations (called an “emergency exit” at the CJEU hearing) does not seem to be required by law, but is clearly a wise choice. Such an “emergency exit” allows DPAs to cope with extreme situations without the need to invalidate the entire adequacy decision. Leaving this “emergency exit” unregulated and unlimited, however, calls the sense and stability of the whole adequacy decision into question.
2 Facebook’s Data Policy
3 For reasons of simplicity the limitations on sensitive data (opt-in) are not discussed here.
4 Article 8(2): “data must be processed … on the basis of the consent of the person concerned or some other legitimate basis laid down by law“
5 Facebook’s Data Policy
6 Facebook’s Data Policy
7 See PSPs, Clause 9(b)(i)
11 Multiple statements by Julie Brill (FTC) in the Privacy Shield debate.
13 It is important to mention that some EU member states may currently not meet the enforcement standards set forth by the CJEU in C-362/14.
14 C-362/14, Paragraph 81.
15 There are other remedies and legal paths. See the numerous cases pending at the ECtHR in Strasbourg.
16 https://www.eff.org/sites/default/files/filenode/fisc_opinion_-unconstitutional_surveillance_0.pdf
18 For example page 1 and 2 of Annex VII
19 For example in Annex IV
21 Annex II, Paragraph I.5
23 https://www.whitehouse.gov/sites/default/files/docs/2014sigint_mem_ppd_rel.pdf
24 PPD-28, Page 3.
25 https://en.wikipedia.org/wiki/Catherine_A._Novelli
26 https://en.wikipedia.org/wiki/United_States_Under_Secretary_of_State
27 Annex III, Paragraph 4(e)