The data protection regime applying to the EU inter-agency cooperation and future architecture of the EU criminal justice and law enforcement area

(*) The full study for the European Parliament LIBE Committee can be downloaded HERE

By Professor PAUL DE HERT

EXECUTIVE SUMMARY

From a data protection perspective, fragmentation is the main characteristic of the legal framework governing the agencies in the EU criminal justice and law enforcement area.
A multitude of EU agencies operate, each under its own individual legal framework, with little regard for harmonization, consistency or even compatibility in their personal data processing, while the basic text that would supposedly set the common standard in the field, the Data Protection Framework Decision, expressly excuses itself from assuming this role.
Each of the EU bodies and agencies operating within the EU criminal justice and law enforcement area is to this day governed by its own constituting legal text(s), which customarily address data protection issues but do so in a piecemeal and introverted way: supervision of data protection practices is vested in each agency’s internal mechanisms and management. This architecture, which reflects the pre-Lisbon third-pillar environment, has been preserved until today despite the fact that inter-agency cooperation has in the meantime proliferated: not only have formal bilateral cooperation agreements been concluded among all EU agencies, but cooperation also takes place beyond EU borders, through chartered, or unchartered, personal data exchanges with third countries and international organisations.
Adequate data protection supervision, in the sense of a single, coordinated monitoring authority, is emphatically missing from all such exchanges.

The ratification of the Treaty of Lisbon is a milestone that affected the EU criminal justice and law enforcement area in more than one way. Among others, the elevation of data protection to a standalone individual right and the involvement of the European Parliament in any decision-making in the field are crucial factors that enabled an, admittedly much needed, change. Such change came in the form of a series of Commission proposals released over the past couple of years which, if implemented, will completely restructure the current EU data protection architecture in the criminal justice and law enforcement area.
The Commission proposals originate from Article 16 TFEU, which introduces a new right to data protection and requires new rules on personal data processing by EU agencies, as well as independent monitoring, but also from Declaration 21, which allows for “specific rules” in the field.

To this end, the Commission introduced both general and agency-specific texts.
At a general level, a Police and Criminal Justice Data Protection Directive is intended to replace the Data Protection Framework Decision. At agency-specific level, the Europol and Eurojust draft Regulations are intended to replace the respective Decisions in force today; at the same time a new Regulation is aimed at introducing the European Public Prosecutor’s Office (EPPO) while work has been promised by the Commission also on amending Regulation 45/2001.
Such a law-making process entails herculean efforts by all the bodies involved (the Commission, the Parliament and the Council) to keep the overhaul of the data protection rules in force today in the EU criminal justice and law enforcement field synchronized and coordinated.

Although none of the above legislative proposals is yet finalized (in fact, only one has reached “trilogue” stage), the Commission’s preferred data protection architecture has by now become evident: the draft Directive is to replace the Framework Decision but not to affect any agency-specific personal data processing. This task will be undertaken by Regulation 45/2001 (or its successor) and the European Data Protection Supervisor (EDPS).

This architecture is basically taken for granted for the purposes of this analysis: regardless of its merits or drawbacks, the Parliament, like the Commission, has shown no substantial objection to it.

Therefore, the interplay of the instruments involved (the Police and Criminal Justice Data Protection Directive, Regulation 45/2001 or its successor, and the Europol, Eurojust and EPPO Regulations) is sketched in the six scenarios that follow, each in turn assessed in terms of legal and pragmatic plausibility under the current environment:
• A “unified model” scenario, under which the Police and Criminal Justice Data Protection Directive would regulate all the EU criminal justice and law enforcement area (including therefore the EU agencies operating therein);
• A “segregated model” scenario, whereby the Police and Criminal Justice Data Protection Directive would leave EU agencies’ personal data processing outside of its scope (as is currently the situation under the Data Protection Framework Decision);
• An “interim segregated model” scenario, under which the above segregated approach would only last for a few years, after which EU agencies would have to bring their personal data processing under the Police and Criminal Justice Data Protection Directive;
• An “alternative unified model” scenario that, as originally suggested by the Commission, would use Regulation 45/2001 as a common standard-setting text for all EU agencies, whose individual constituting legal instruments would subsequently supplement and further specify its provisions;
• A scenario whereby the current architecture is preserved and consequently neither the Police and Criminal Justice Data Protection Directive nor Regulation 45/2001 (or its successor) affects in any way the agency-specific (revised) texts; and
• A scenario, unfortunately likely in the immediate future, whereby Regulation 45/2001 is not amended in time and the Europol, Eurojust and EPPO Regulations, when adopted, will all supplement and further specify its provisions, which are outdated and unsuitable for the criminal justice and law enforcement area.

Statewatch Analysis:  The Proposed Data Protection Regulation: What has the Council agreed so far?

By Steve Peers, Professor of Law, University of Essex
Twitter: @StevePeers 8 December 2014

Introduction

Back in January 2012, the Commission proposed a new data protection Regulation that would replace the EU’s existing Directive on the subject. It also proposed a new Directive on data protection in the sphere of law enforcement, which would replace the current ‘Framework Decision’ on that subject.
Nearly three years later, there has been some gradual progress on discussing these proposals. The European Parliament (which has joint decision-making power on both proposals) adopted its positions back in the spring. For its part, the EU Council (which consists of Member States’ justice ministers) has been adopting its position on the proposed Regulation in several pieces. It has not yet adopted even part of its position on the proposed Directive.
For the benefit of those interested in the details of these developments, the following analysis presents a consolidated text of the three pieces of the proposed Regulation which the Council has agreed to date, including the parts of the preamble which have already been agreed. I have left intact the footnotes appearing in the agreed texts, which set out Member States’ comments.
The underline, italics and bold text indicate changes from the Commission proposal. I have added a short summary of the subject-matter of the Chapters and Articles in the main text which have not yet been agreed by the Council.
For detailed analyses of some parts of the texts agreed so far, see the links to the two blog posts.
The Council might always change its current position at a later point, and of course the final text of the new legislation will also depend on negotiations between the Council and the European Parliament.

Background documents:
• ‘Public sector’ provisions, agreed by Dec. 2014 JHA Council
• Chapter IV, agreed by Oct. 2014 JHA Council
• Rules on territorial scope, agreed by June 2014 JHA Council
• Proposal from Commission
• Position of European Parliament
• Analysis of agreed territorial scope rules
• Analysis of agreed ‘privacy seals’ rules

SEE THE FULL CONSOLIDATED TEXT ANALYSIS (85 pages) HERE

Towards a Declaration of Internet Rights

by Professor Stefano RODOTA’ (FREE Group member) (*)

For many years there has been a wide discussion about the possibility of adopting an Internet Bill of Rights, and debates have produced a considerable number of proposals. The Berkman Centre at Harvard University counted 87 such proposals, to which we can add the Internet Magna Charta that Tim Berners-Lee is working on and, lastly, the Declaration of Internet Rights drafted by a Committee established by the President of the Italian Chamber of Deputies. The novelty of the latter is that, for the first time, the proposal of an Internet Bill of Rights has been produced not by scholars, associations, dynamic coalitions, enterprises, or groups of stakeholders, but by an institutional entity.

It is necessary to recall that the debate on this topic dates back to the World Summit on the Information Society organised in 2005 by the UN in Tunis, where the need for an International Convention on Internet rights was explicitly underlined. This subject was explored further in the subsequent UN Internet Governance Forums. But the international debate was progressively turned into precise rules within the European Union, even before the issue of the Internet Bill of Rights appeared in the international arena. These are not, however, parallel situations destined never to meet. The European Union progressively brought to light the constitutional basis of the protection of personal data, which found its full recognition in Article 8 (**) of its Charter of Fundamental Rights. Here a strong similarity with the Internet Bill of Rights can be identified, and it concerns precisely the constitutional scope of rules.

We are going through a phase of deep change in the way in which we are facing the problems highlighted by Internet dynamics, in the passage from Web 1.0 to Web 2.0 and now to Web 3.0. It is not just a matter of following technological changes by adjusting legal provisions to suit them. A new definition is being developed of the rationale driving actions in this area, through a radical U-turn as regards the dynamics of the latest phase. A possible historical turning point is ahead of us, whose opportunities must be seized.

It seemed that an approach had become consolidated which left little room for rights. From Scott McNealy’s abrupt statement of 1999 – “You have zero privacy. Get over it” – up to Mark Zuckerberg’s recent hasty conclusion about the end of privacy as a “social rule”, a line emerged characterised by the intertwining of two elements: technological irresistibility and the primacy of economic logic. On the one side, it was highlighted how technological innovations and new social practices made it increasingly difficult, not to say impossible, to safeguard one’s private life and public liberties; on the other side, the statement on the “death of privacy” became the argument for claiming that personal information had to be considered the property of those who collected it.

These certainties were radically challenged by Edward Snowden’s disclosures on the magnitude of the National Security Agency’s Prism programme and by the judgments of the European Court of Justice on data retention and Google. The idea that the protection of fundamental rights should give way to the interests of security agencies and enterprises was rejected.

A new hierarchy has been established, with fundamental rights as the first and starting point. The US President had to admit the inadmissibility of the procedures provided for by the Prism programme, and the Court of Justice, with its decision of 8 April, declared the Data Retention Directive invalid. And in the Google case the same Court explicitly stated that “the fundamental rights under Articles 7 and 8 of the Charter (…) override, as a norm (…) the economic interest of the operator of the search engine”, in a perspective broadening the European Union’s jurisdiction beyond its borders.

We are faced with a true “resurrection of privacy” and, more generally, with the primacy of the need and legitimacy of rules effectively protecting the rights of Internet users. Making reference to article 8 of the Charter, the Court of Justice was acting as a true constitutional court, opening a new and wide perspective.

The Italian initiative

This is the framework within which the Italian initiative on the Declaration of Internet Rights was adopted. Its goal is not limited to having a text to be used for national debate only.

The establishment of the Committee that drafted the document, in fact, was preceded by an international conference gathering some of the authors of the Brazilian Marco Civil, the representatives of European Institutions, and several experts from different Countries.

The text drafted by the Committee was presented on 13th October during a meeting at the Chamber of Deputies with the Presidents of the Parliamentary Committees of Member Countries in charge of fundamental rights.

The present draft is now submitted to a four-month public consultation on the Internet, at the end of which the Committee will draft the final text. Such consultation, however, is also being carried out at a European and international level, as shown by the contacts with other European Parliaments and by the video conference that will be held at the beginning of December between the Italian and the French Committees. Consultations are also taking place with experts and associations from non-European Countries.

An ambitious target was set: drafting a text allowing a common international debate, accompanied by constant monitoring by the Chamber of Deputies. The goal is not limited to working towards the complex and remote prospect of an international convention. Short-term and feasible results can be achieved, concerning the strengthening of the European system, its developments and its relationships with other countries, and most of all the consolidation of a culture highlighting common dynamics across different legal systems. In this way, the debate around a future Internet Bill of Rights may lead to the awareness that several elements already exist in the different legal systems which, once connected to one another, establish an informal Internet Bill of Rights. Evidence of this trend is found in the decisions of the courts of different Countries and in the choice of legislative models, as shown by the clear influence of the European model on the Brazilian Marco Civil.

The Italian Declaration is characterised by a fundamental choice. Unlike almost all the others, it does not contain a specific and detailed wording of the different principles and rights already stated by international documents and national Constitutions. Of course, these are generally recalled as an unavoidable reference. But the attempt of the Declaration was, in fact, to identify the specific principles and rights of the digital world, underlining not only their peculiarities but also the way in which they generally contribute to redefining the entire sphere of rights.

The key words – besides the most well-known ones concerning the protection of personal data and the right to informational self-determination – include access, neutrality, integrity and inviolability of IT systems and domains, mass surveillance, development of digital identity, rights and guarantees of people on Internet platforms, anonymity and the right to be forgotten, interoperability, the right to knowledge and education, and control over Internet governance. The importance of the needs linked to security and to the market is obviously taken into consideration, but the balancing of these interests with fundamental rights and freedoms cannot take place on equal terms: full respect for rights and freedoms must be ensured first and foremost, according to the clear provisions of the Charter of Fundamental Rights and to European case law.

In particular, security needs shall not determine the establishment of a society of surveillance, control and social sorting. Economic needs are taken into consideration in the framework of the neutrality principle that, by guaranteeing the generative nature of the Internet, keeps the possibilities for innovation unchanged, and prevents strong subjects from creating conditions of exclusion of possible competitors. Furthermore, whenever Internet platforms provide public services that are essential for the life and the activities of people, it is necessary to guarantee the conditions for a suitable interoperability in compliance with the principle of competition and equal treatment of people.

Since not all the issues can be analysed in this document, it is worth recalling the need to consider access to the Internet as a fundamental right of individuals (Tim Berners-Lee compared it to access to water), as an essential guarantee not only against any form of censorship, but also against indirect limitations, such as taxation, as is presently happening in Hungary. The set of rights recognised does not guarantee a general freedom on the Internet, but specifically aims at preventing people’s dependency on outside powers and the expropriation of the right to freely develop one’s personality and identity, as may happen with the wide and increasing use of algorithms and probabilistic techniques. Autonomy in the management of personal data, therefore, must also encompass new rights such as the right not to be tracked and the right to silence the chip. This perspective requires a particular in-depth analysis, since a deeply interconnected society is developing, with a passage to the Internet of Things in forms that have led some to speak of an Internet of Everything, entailing a digitalisation of day-to-day life capable of transforming any person and their body.

People cannot be reduced to objects of external powers; they must recover sovereignty over their digital persona. Identity is a key issue. The free development of one’s personality must be safeguarded.

Starting from this set of references, it is necessary to thoroughly examine the issue of the transformation of copyright, whose analysis was postponed to the end of the consultation, since knowledge on the Internet appears as a shared asset that can be considered as a common global resource.

A broader perspective is therefore opened by the Italian draft Declaration, given the large number of topics to be tackled and the debate between different points of view; and such a Declaration is significantly in line with European Union policy, which particularly emphasises the Charter of Fundamental Rights. The unquestionable point is the need to fine-tune a constitutional policy for the Internet, whose users – presently amounting to three billion people – cannot rely on a freedom guaranteed by the absence of rules, as is still sometimes claimed.

The reality is very different, showing an interconnected network heavily regulated by private subjects that cannot be controlled and that have no democratic legitimation, as happens – beyond any dispute – with the “Over the Top” operators on the Internet. Internet rights are denied by totalitarian regimes and, unfortunately, by democratic regimes as well. The perspective of a Declaration of Internet Rights aims at developing – through procedures different from those of the past – the constitutional rules that are fundamental to allowing the Internet to keep its main feature as a place of freedom and democracy, as the widest space in the history of mankind.

NOTE

(*) Intervention at the Friedrich-Ebert-Stiftung / FREE Group experts meeting on:
Internet: only a “single digital market” or also a space to promote fundamental rights – Towards a European “Marco Civil”? (November 12, 2014). The main idea of this experts’ conference was to take a first look at the impact of the EU Digital Agenda on fundamental rights as framed by the Treaties, the EU Charter and recent CJEU jurisprudence (Data Retention, the Google case). As stated by the Charter, the individual should be at the centre of all EU policies, and this objective underpins the recent proposal for an Internet Bill of Rights of the Italian Chamber of Deputies as well as other national examples (the Brazilian “Marco Civil” and recent US initiatives at government, congress and civil society level).
Bearing in mind that the EU is competent on most aspects dealing with the Internet, the question arises how to preserve and promote individual rights, notably in the pending negotiations on legislative proposals on Data Protection, Net Neutrality and Network Security (NIS). Moreover, what future initiatives impacting on the Internet should be developed under the new Commission’s legislative programme? How could the future EU single digital market preserve the principles of non-discrimination and of informational self-determination by strengthening access to the Internet as a public common good?
Together with Stefano Rodotà, the following also took part in the seminar:
Claude Moraes, Chairman of the European Parliament Civil Liberties Committee (which adopted a first Internet Bill of Rights resolution in 2009)
Jan Philipp Albrecht, EP Rapporteur for the Data Protection Regulation and for the transatlantic “umbrella” Agreement
Paul Nemitz, Director at the European Commission
Giovanni Buttarelli, Assistant European Data Protection Supervisor
Marc ROTENBERG, Professor at Georgetown University and Director of EPIC, and Marie GEORGES, expert at the Council of Europe
as well as Joe Mc Namee, Executive Director, European Digital Rights (EDRi).

(**) Article 8 Protection of personal data
1. Everyone has the right to the protection of personal data concerning him or her.
2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
3. Compliance with these rules shall be subject to control by an independent authority.

THE PROPOSED GENERAL DATA PROTECTION REGULATION: SUGGESTED AMENDMENTS TO THE DEFINITION OF PERSONAL DATA

by Douwe Korff, Professor of International Law

(and FREE Group Member)

  1. Background

In a recent judgment (discussed previously on this blog) the Third Chamber of the CJEU ruled that the concept of “personal data” in the 1995 data protection (DP) Directive is limited to data directly relating to a person, and does not include legal analyses in the file on the person, on which the state (the Netherlands) relied in taking its decisions in relation to that person (Joined Cases C-141/12 and C-372/12). I believe the Court’s restriction of the concept is wrong and contrary to the intended purpose of data protection, and should be corrected in the new General Data Protection Regulation.

First of all, the Court based itself on the, in my opinion erroneous, view that the 1995 EC DP Directive was solely aimed at protecting privacy. In particular, it felt that data subjects’ right of access to their personal data should not extend to a legal analysis of their case, contained in a file on them, because (in the Court’s view) such an analysis “is not in itself liable to be the subject of a check of its accuracy by [a data subject]”, and data subjects should not be able to use data protection to seek rectification of such an analysis (cf. para. 44 of the judgment).

Secondly, the Court also relied on the fact that the data at issue in the joined cases were administrative data held by a public authority and, drawing a parallel with EU regulations on privacy and access to documents, held that access to the legal analysis should be addressed under the latter rules rather than the former. This fails to take into account the fact that the EU rules referred to apply only to public (i.e., EU) bodies, whereas the 1995 DP Directive applies also, and indeed especially, to private-sector bodies (in particular companies) that are not subject to public-sector rules on access to administrative data.

The Court’s judgment, in sum, seriously limits the concept of personal data and the right of access to one’s personal data, and thus the application of the entire EU data protection regime. It leaves individuals with considerably fewer rights in respect of data on them (or relating to them, or used to take decisions on them, or that affect them) than was previously thought.

Specifically, the judgment runs directly counter to the authoritative 2007 Article 29 Working Party (WP) Opinion on the concept of personal data (Opinion 4/2007, WP136, of 20 June 2007). This first of all noted that the purpose of data protection is not limited to a narrow concept of privacy – as is indeed also clear from the fact that data protection is guaranteed in the Charter of Fundamental Rights (CFR) as a separate right, sui generis, distinct from the right to private life/privacy (data protection is guaranteed in Article 8 CFR; privacy in Article 7 CFR). Astonishingly, given that the WP29 is expressly charged with providing guidance on the interpretation and application of the 1995 DP Directive, the Court did not even mention the Working Party or this specific opinion.

In the opinion, the Working Party discussed four elements of the definition, from which it deduced the appropriate criteria for determining whether data should be regarded as personal data within the meaning of the Directive. They can be paraphrased as follows:

The first element: “any information”:

The WP concludes that these words indicate that the concept of personal data should be interpreted broadly, and not limited to matters relating to a person’s private and family life stricto sensu (as has wrongly been done in the UK under the Durant decision, and as appears also to underpin the Court’s judgment). It also covers information in any form, including documents, photographs, videos, audio and biometric data, body tissues and DNA.

The second element: “relating to”:

In general terms, information can be considered to “relate” to an individual when it is about that individual. However, data about “things” can also be personal data, if the object in question is closely associated with a specific individual (e.g., mobile phone location data). This is of increasing importance in the era of the Internet of Things. Important in relation to the CJEU judgment, the WP29 adds the following consideration, with reference to an earlier opinion, on radio frequency identification (RFID) tags, WP105 of 19 January 2005 (original italics and bold; underlining added):

In the context of discussions on the data protection issues raised by RFID tags, the Working Party noted that “data relates to an individual if it refers to the identity, characteristics or behaviour of an individual or if such information is used to determine or influence the way in which that person is treated or evaluated.“…

[I]n order to consider that the data “relate” to an individual, a “content” element OR a “purpose” element OR a “result” element should be present.

The “content” element is present in those cases where – corresponding to the most obvious and common understanding in a society of the word “relate” – information is given about a particular person, regardless of any purpose on the side of the data controller or of a third party, or the impact of that information on the data subject. (…)

Also a “purpose” element can be responsible for the fact that information “relates” to a certain person. That “purpose” element can be considered to exist when the data are used or are likely to be used, taking into account all the circumstances surrounding the precise case, with the purpose to evaluate, treat in a certain way or influence the status or behaviour of an individual. (…)

A third kind of ‘relating’ to specific persons arises when a “result” element is present. Despite the absence of a “content” or “purpose” element, data can be considered to “relate” to an individual because their use is likely to have an impact on a certain person’s rights and interests, taking into account all the circumstances surrounding the precise case. It should be noted that it is not necessary that the potential result be a major impact. It is sufficient if the individual may be treated differently from other persons as a result of the processing of such data.

These three elements (content, purpose, result) must be considered as alternative conditions, and not as cumulative ones. In particular, where the content element is present, there is no need for the other elements to be present to consider that the information relates to the individual. A corollary of this is that the same piece of information may relate to different individuals at the same time, depending on what element is present with regard to each one. The same information may relate to individual Titius because of the “content” element (the data is clearly about Titius), AND to Gaius because of the “purpose” element (it will be used in order to treat Gaius in a certain way) AND to Sempronius because of the “result” element (it is likely to have an impact on the rights and interests of Sempronius). This means also that it is not necessary that the data “focuses” on someone in order to consider that it relates to him. …

The “legal analyses” that the CJEU ruled were not personal data are clearly covered by the above: they are the very basis on which the data subjects in question (asylum seekers) were “treated” and “evaluated”. To apply the reasoning of the Working Party: they determine whether Titius should be treated the same way as Gaius or not; and they may also have an impact on the rights and interests of Sempronius.
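The Working Party's alternative-conditions logic can be caricatured in a small sketch. This is purely illustrative: the record fields and the function are invented for the example, and no code can of course capture the legal assessment itself; the point is only that the three elements combine with OR, not AND, so one record may relate to several people at once.

```python
# Illustrative sketch only: the WP29 "relating to" test treats content,
# purpose and result as ALTERNATIVE conditions -- any one suffices.
# All field names here are invented for the example.

def relates_to(record: dict, person: str) -> bool:
    """True if `record` relates to `person` under any of the three elements."""
    content = person in record.get("about", [])          # information is about the person
    purpose = person in record.get("used_to_treat", [])  # used to evaluate/treat the person
    result = person in record.get("impacts", [])         # likely to affect the person's rights
    return content or purpose or result                  # alternative, not cumulative

# The Working Party's own example: one record, three data subjects.
record = {
    "about": ["Titius"],         # content element
    "used_to_treat": ["Gaius"],  # purpose element
    "impacts": ["Sempronius"],   # result element
}
```

On this reading, a legal analysis or a profile "relates to" the person it is used to evaluate (the purpose element) even where its content is abstract, which is precisely what the judgment overlooks.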

This is also crucially important in relation to “profiles”. Under the judgment, states and companies could argue that individuals should also not have a right to challenge the accuracy of a profile, any more than the accuracy of a legal analysis; and that, indeed, they are not entitled to be provided on demand with the elements used in the creation of a profile. After all, a profile, by definition, is also based on an abstract analysis of facts and assumptions not specifically related to the data subject – although both are of course used in relation to the data subject, and determine the way he or she is treated.

In my opinion, the above is the most dangerous limitation flowing from the Court’s judgment.

The third element: “identified or identifiable”:

Although this issue did not arise in the CJEU cases, it is still crucial, in particular in relation to the ever-increasing and ever-more-widely-available massive sets of “Big Data”. In the opinion of the WP, the core issue is whether a person is, or can be, singled out from the data, whether by name or not. A name sometimes suffices for this, but often not, while a photograph or an identity number often does allow such singling out even if no other details of the person are known. In relation to pseudonymised or supposedly anonymised data, the WP concluded (with reference to the recitals in the 1995 directive) that the central issue is whether the person can be identified (singled out), whether by the data controller or by any other person, “taking account of all the means likely reasonably to be used either by the controller or by any other person to identify that individual”.

The fourth element: “natural person”:

In principle, personal data are data relating to identified or identifiable living individuals. There are some issues relating to data on deceased persons and unborn children: these can often still (also) relate to living individuals, in the way discussed above, and would then still be personal data in relation to those latter individuals. Data on legal entities can sometimes also, similarly, relate to living individuals associated with those entities. Also, in some contexts some data protection rights are expressly extended to legal persons (companies etc.) per se, in particular under the so-called “e-Privacy Directive”. But that is a special case. This too, however, was not an issue relevant to the CJEU judgment.

Until the CJEU judgment, it could be assumed that as long as the General Data Protection Regulation used the same definition of personal data as the 1995 DP Directive, the above elements and criteria could simply be read into the new instrument.

However, the judgment could result in the definition in the GDPR being read in accordance with the Court’s restricted views, rather than in line with the WP29 guidance.

In my opinion, if the EU wishes to retain a strong European data protection framework, as is often asserted, it is essential that the GDPR expressly (if of course briefly) endorses the WP29 view of the issue, rather than the CJEU’s.

Below, I suggest amendments to the definition of the concept of personal data in the GDPR that would achieve that (some further amendments should be made to the recitals).

  1. Proposed amendments to the GDPR

As can be seen from the Annexes, which set out the different definitions of “personal data” and “data subject” in the Commission text of the GDPR and in the amended version of the Regulation adopted by the EP (together with the corresponding definitions in the current 1995 DP Directive), the definitions all say in essence that:

‘personal data’ means any information relating to a data subject (with ‘data subject’ then defined as “an identified or identifiable natural person”), or:

‘personal data’ means any information relating to an identified or identifiable natural person, which comes to the same thing (and is in accordance with the current directive).

The EP text adds clarification on when a person can be regarded as “identifiable”, on the lines of the views of the Article 29 Working Party (drawing on a recital in the current directive); and more specific provisions on “pseudonymous data” and “encrypted data”.

However, neither text adds clarification on the question of when data can be said to “relate” to a (natural, living) person – which is the issue so badly dealt with in the CJEU judgment.

I propose that the definition of “personal data” in the GDPR be expanded to expressly clarify the question of when data can be said to “relate” to a person, by drawing on the guidance of the Article 29 Working Party set out above; and by also expressly clarifying that “profiles” always “relate” to any person to whom they may be applied. Specifically, I propose that an additional paragraph be added to Article 4(2), spelling out that:

“data relate to a person if they are about that person, or about an object linked to that person; or if the data are used or are likely to be used for the purpose of evaluating that person, or to treat that person in a certain way or influence the status or behaviour of that person; or if the use of the data is likely to have an impact on that person’s rights and interests. Profiles resulting from ‘profiling’ as defined in [Article 20 in the Commission text/Article 4(3a) of the EP text] by their nature relate to any person to whom they may be applied.”

The Annexes indicate more specifically how such an amendment could be incorporated into the current (Commission and EP) texts of the Regulation.

Annex I

PROPOSED AMENDMENTS TO ARTICLE 4 OF THE GENERAL DATA PROTECTION REGULATION:

(Added or amended text in bold)

The proposed amendments if applied to the Commission text:

(1)        ‘data subject’ means an identified natural person or a natural person who can be identified, directly or indirectly, by means reasonably likely to be used by the controller or by any other natural or legal person, in particular by reference to an identification number, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person;

(2)        ‘personal data’ means any information relating to a data subject;

(2a)      data relate to a person if they are about that person, or about an object linked to that person; or if the data are used or are likely to be used for the purpose of evaluating that person, or to treat that person in a certain way or influence the status or behaviour of that person; or if the use of the data is likely to have an impact on that person’s rights and interests. Profiles resulting from ‘profiling’ as defined in Article 20 by their nature relate to any person to whom they may be applied.

The proposed amendments if applied to the EP text:

(2)        ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’);

(2a)      an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, unique identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social or gender identity of that person;

(2b)     data relate to a person if they are about that person, or about an object linked to that person; or if the data are used or are likely to be used for the purpose of evaluating that person, or to treat that person in a certain way or influence the status or behaviour of that person; or if the use of the data is likely to have an impact on that person’s rights and interests. Profiles resulting from ‘profiling’ as defined in paragraph (3a) by their nature relate to any person to whom they may be applied.

(2c) ‘pseudonymous data’ means personal data that cannot be attributed to a specific data subject without the use of additional information, as long as such additional information is kept separately and subject to technical and organisational measures to ensure non-attribution;

(2d) ‘encrypted data’ means personal data, which through technological protection measures is rendered unintelligible to any person who is not authorised to access it;

NB: The actual Commission and EP texts are set out in Annex II.

Annex II

The definition of “personal data” in the original Commission text of the GDPR and in the amended version of the Regulation adopted by the European Parliament:

Text proposed by the Commission:

Definitions

For the purposes of this Regulation:

(1) ‘data subject’ means an identified natural person or a natural person who can be identified, directly or indirectly, by means reasonably likely to be used by the controller or by any other natural or legal person, in particular by reference to an identification number, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person;

(2) ‘personal data’ means any information relating to a data subject;

Amendment adopted by the European Parliament:

Definitions

For the purposes of this Regulation:

(2) ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, unique identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social or gender identity of that person;

(2a) ‘pseudonymous data’ means personal data that cannot be attributed to a specific data subject without the use of additional information, as long as such additional information is kept separately and subject to technical and organisational measures to ensure non-attribution;

(2b) ‘encrypted data’ means personal data, which through technological protection measures is rendered unintelligible to any person who is not authorised to access it;

Cf. the following definition in the current 1995 DP Directive:

(a) ‘personal data’ shall mean any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;

WARNING: THE EU COUNCIL IS TRYING TO UNDERMINE PRIVACY SEALS (and through this, the General Data Protection Regulation)

by Douwe KORFF (*)

(*) Professor Douwe Korff is an Associate of the Oxford Martin School of the University of Oxford and a Visiting Fellow at Yale University (Information Society Project). He helped to establish the European Privacy Seal (EuroPriSe) scheme discussed in the text.

  1. Introduction

Some people, including myself, believe that good privacy seals, managed by the right bodies, can make a serious contribution to high-level data protection – while bad seals, issued by bodies that are more interested in providing fig-leaves and making money, can seriously harm data protection. The arrangements for data protection certification in the new General Data Protection Regulation (hereafter: “the regulation”) are therefore important. The original draft of the regulation, issued by the Commission in January 2012, merely said that certification schemes should be “encouraged” (although it provided for some EU-level harmonisation of the frameworks).

The European Parliament’s amended text is much more ambitious in this regard and, if adopted, would make certification schemes both more integrated with the general data protection regime and stronger, also in terms of ensuring that no seals could be issued in one Member State that would undermine data protection in other Member States.

However, the text set out in an EU Council document dated 26 September 2014, which has just been leaked, shows that the Member States are trying to undermine the good proposals of Parliament.

At II, I first briefly set out the problems with European privacy seal schemes under the current rules. Next, at III, I analyse the relevant provisions in the different versions of the regulation, adopted by the Commission, Parliament and the Council. Finally, at IV, I conclude that if the Council text were to be adopted, the provisions on seals could become a Trojan Horse that could seriously undermine the in principle strong data protection regime in the regulation (pace other watering-down attempts by the Council). This note thus seeks to sound a warning to those involved in the upcoming trilateral negotiations on the regulation text, not to allow such a dangerous scheme (or rather, an ill-defined miscellany of schemes) to slip in.

  2. Data protection seals and the 1995 Data Protection Directive

There is no explicit provision on data protection or privacy seals or certification schemes in the main EC data protection directive (Directive 95/46/EC, hereafter “the directive”), although other self-regulatory mechanisms, such as codes of conduct and contractual arrangements, are encouraged under it (see Art. 27 re codes; Art. 26(2) re “appropriate contractual clauses”). Nevertheless, the European Commission has in practice encouraged the establishment of seals, in particular by supporting the establishment of the “European Privacy Seal” (EuroPriSe) scheme under an “e-TEN” programme; this was until recently operated by the data protection authority of the German Land of Schleswig-Holstein, the Independent Centre for Privacy Protection (or ULD after its German initials), but has recently been passed on to a private German company, 2B.[1] The French data protection authority, CNIL, has also established a certification scheme, under which controllers can certify that they meet certain CNIL-specified criteria (but so far only in relation to privacy training, data protection audit, and one product: cloud computing).[2]


Future of EU migration, home and justice policies. Some questions to the new candidate Commissioners.

by Steve PEERS, Henri LABAYLE and Emilio DE CAPITANI

The would-be Commissioners for immigration and home affairs and for justice will shortly be questioned by Members of the European Parliament (MEPs) in hearings, to determine whether the EP should vote to confirm them in office. MEPs have already asked some written questions and the would-be Commissioners have replied. Since most of the written questions were not very searching (except for a couple of questions on data protection issues), the Commissioners did not reply in much detail. However, the hearings are an opportunity for MEPs to ascertain the Commissioners’ plans, and to secure important political commitments, in these fields. To that end, we have suggested a number of oral questions which MEPs should ask in the hearings.

Immigration and asylum

The Commission considers that migration policy should be framed by the (non-binding) objectives of the Global Approach to Migration and Mobility (GAMM), and that relations with third countries should be dealt with through “Mobility Partnerships”, which are more diplomatic declarations than binding acts. Would you propose a binding legal basis for treaties with the countries concerned, grounded on Articles 77, 78 and 79 of the TFEU?

What actions will the Commission take to ensure that EU legislation in this field is fully and correctly implemented by the Member States?

Will the Commission propose an immediate amendment to the EU visa code, to confirm that Member States are obliged to give humanitarian visas to those who need them and who apply at Member States’ consulates in third countries?

When will the Commission propose EU legislation to guarantee mutual recognition of Member States’ decisions regarding international protection, including the transfer of protection?

When will the Commission make proposals for a framework for sharing responsibility for asylum-seekers and persons who have been granted international protection, starting with those who have applied outside the territory of the Member States?

Will the Commission propose an immigration code, and what will its main contents be?

The Court of Justice has recognised that search and rescue obligations are interlinked with external border surveillance (Case C-355/10). The EU has adopted rules in this field which govern only border controls coordinated by Frontex. Do you intend to propose that such rules should apply to all Member States’ border controls as a general rule, by formally amending the Schengen Borders Code?

What immediate and longer-term steps will the Commission take to address the death toll of migrants crossing the Mediterranean?

Will the Commission propose to amend the EU legislation on facilitation of unauthorised entry to confirm that anyone who saves migrants from death or injury during a border crossing, or who otherwise acts from humanitarian motives, is exempt from prosecution?

Internal Security and Police cooperation

PEERS: Data protection rights and administrative proceedings

ORIGINAL PUBLISHED ON EU LAW ANALYSIS
Thursday, 17 July 2014

Steve Peers

What rights do asylum-seekers have as regards data protection law? This issue was clarified in today’s CJEU judgment in YS and M and S, which could also have broader relevance for any case which involves access to documents in the context of administrative procedures.

Both cases involved asylum-seekers in the Netherlands, who sought access to file notes concerning their case. However, they did not rely on the EU’s asylum procedures Directive, which states that asylum-seekers must be given the reasons for negative decisions and are entitled to access reports about the interviews held with them, but does not make mention of access to any other document. The second-phase procedures Directive, applicable to applications made after 20 July 2015, adds a right of access to country-of-origin information and expert advice which was used in making a decision on the asylum-seeker’s case, but still does not extend to a right to the entire file.

So they invoked the data protection Directive instead. The first question in this respect was whether the legal analysis in the file concerning their case was ‘personal data’ within the meaning of the Directive. According to the CJEU, it was not, for although that analysis ‘may contain personal data, it does not in itself constitute such data within the meaning of’ that Directive. That analysis ‘is not information relating to the applicant for a residence permit, but’ rather ‘information about the assessment and application by the competent authority of that law to the applicant’s situation’, based on the personal data available to the authorities.

The Court further opined that this was consistent with the purpose of the Directive, which was to ensure the right to privacy, including the check on the accuracy of the data and the correction of inaccurate data. A different approach would amount to ‘the right of access to administrative documents’, which was not the point of the Directive. It justified its analysis by analogy with the Bavarian Lager judgment, in which it had ruled that the Directive did not have the purpose of opening up the transparency of EU decision-making.

The second point was the extent of access to the personal data (as defined by the Court) which was being processed. On this point, the CJEU rejected the argument that the entire file had to be made available, and instead stated that it was sufficient to give data subjects an intelligible summary of the personal data being processed.

Finally, the national court had asked about the possible application of Article 41 of the Charter, which sets out the right to good administration. The CJEU distinguished its prior case law, and asserted that this Charter right applied only to EU bodies, not to national administrations. But the right to good administration could still be invoked against national authorities as a general principle, as distinct from a Charter right.

Comments

The Court’s analysis of the main data protection issues here is not very convincing. There is nothing in the text of either the data protection Directive or the asylum procedures Directive that would suggest a distinction between administrative documents which contain personal data, and other types of collection of personal data. Quite clearly, asylum-seekers do have an interest in knowing how their personal data is being processed in respect of an analysis of their application, and of correcting that personal data if it is incorrect.

To argue that the data protection Directive does not give access to administrative documents is a straw man argument. The question is not whether it aims to give access to all administrative documents, but only whether it gives access to those which contain personal data. The comparison with the Court’s Bavarian Lager judgment makes no sense either, for in that case data protection formed an express exception to the EU legislation on access to documents, and the two rights were in conflict.

The Court’s judgment on the second point is more convincing, in light of the wording of the data protection Directive, which only requires an intelligible summary of the personal data being processed to be made available.

Finally, the Court’s analysis of Article 41 of the Charter is a brave attempt to clear up the prior inconsistencies and confusion on this point, for instance in its recent judgment on procedural rights as regards subsidiary protection applications. Undeniably the Charter provision does only apply to EU bodies, not to Member States, but the Court nevertheless guarantees that the right to good administration can be claimed against the latter by clarifying that the right to good administration is nonetheless a general principle of EU law.

This is, apparently, the first time that the Court has confirmed that some rights are not in the Charter, but are protected as general principles of EU law. This raises important questions as to which other rights might be protected in that way, what the difference between the parallel rights to good administration might be, and whether the general principles have a different legal effect than Charter rights. But in the specific context of asylum proceedings, and more generally in many other areas of EU law, it is useful that the Court confirmed that applicants can still enforce (by a different means) the right to good administration against national authorities.

Europe v Facebook: the beginning of the end for NSA spying on EU citizens?

Original published on EU LAW ANALYSIS
Wednesday, 18 June 2014

by Steve Peers

Since the revelations about the extent of spying by the American National Security Agency (NSA) revealed by Edward Snowden, doubts have increased about the adequacy of the data protection regime in the United States, in particular as regards its impact on EU citizens, who are subject to the more favourable regime established by the Data Protection Directive. One aspect of these doubts concerns the ability of the NSA to examine the content of communications processed by social media companies based in the USA, such as Facebook.

Today’s decision by the Irish High Court to send questions in the ‘Europe v Facebook’ case to the CJEU raises the possibility that the NSA’s access to EU citizens’ personal data might soon come to an end. But it’s not clear if the CJEU will address the most essential issues directly, because the case raises a number of complex legal issues that need to be examined in more detail.

As a starting point, the basic legal regime governing transfers to Facebook is the ‘Safe Harbour’ system, which takes the form of a Commission Decision finding that all American companies certifying their participation in a system for complying with basic data protection principles maintain an ‘adequate’ level of data protection. This is one of the ‘adequacy decisions’ that the Commission can make pursuant to the rules in the data protection Directive on transfers of personal data outside the EU (see further my recent blog post on the planned reforms to this system). Despite the doubts arising from the Snowden revelations, the Commission’s most recent report on the Safe Harbour system did not suggest that the system should be abandoned.

Not everyone accepts these assertions, however. An Austrian citizen, Mr. Schrems, complained about the transfer of his personal data as a Facebook user pursuant to the Safe Harbour rules to the Irish data protection authority, which was competent in this matter because Facebook has a subsidiary in Ireland. The national authority argued that it could not take a decision on this complaint, because it was bound by the Commission’s decision. Moreover, it argued that the complaint was ‘frivolous’.

Mr. Schrems then challenged the authority’s decision before the Irish High Court. In its ruling today, the national judge therefore decided to send a question to the CJEU. Essentially, the question is whether the national data protection authority is bound by the Commission’s Decision, and whether that authority can conduct its own examination.

The first obvious question in this case is whether the American system infringes EU data protection law. Basing itself on the recent Digital Rights judgment of the CJEU, in which that Court ruled that the EU’s data retention Directive was invalid, the national court clearly believes that it does. While acknowledging the important anti-terrorist objectives of the law, the judge, when examining national constitutional law, states that it is ‘very difficult’ to see how such mass surveillance ‘could pass any proportionality test or survive any constitutional scrutiny’. Indeed, such surveillance has ‘gloomy echoes’ of the mass surveillance carried out in ‘totalitarian states such as the [East Germany] of Ulbricht and Honecker’.

The judge equally believes that the US system is a violation of EU law, with no adequate or accessible safeguards available to EU citizens, and no consideration of EU law issues built into the review process that does exist.

Is this analysis correct? There are two fundamental issues here which the national court doesn’t consider: the scope of the data protection directive, and the derogations from that Directive. On the question of scope, the CJEU previously found in its Passenger Name Records (PNR) judgment that the EU/US agreement which provided for the transfer of data from airlines to the US authorities was outside the scope of the data protection Directive, because it regulated essentially only the activities of law enforcement authorities, and the Directive does not apply to the ‘processing of personal data…in the course of an activity which falls outside the scope’ of EU law, ‘such as…public security, defence, State security…and…criminal law’. On the other hand, the CJEU ruled that the data retention directive was correctly based on the EU’s internal market powers, since it essentially regulated the activity of private industry, albeit for public security objectives. While in this case, it might be argued that the American law in question falls within the first type of law, the Safe Harbour agreement clearly falls within the second. So it is a sort of hybrid question, but on balance the issue falls within the scope of the Directive, since the measure at issue is essentially the Safe Harbour agreement.

Secondly, the external transfer rules in the EU Directive do not refer expressly to the issue of derogations from data protection rights on public security grounds. Yet presumably some such derogations can exist, given that the Directive itself provides for public security derogations as regards the standard EU rules. Surely the security exceptions applied by third countries don’t have to be exactly the same as those applied by the Directive. But some form of minimum standard must apply. For the reasons set out by the national judge, however, there is a strong argument that the US rules fall below the standard of anything which the EU can accept as ‘adequate’.

Because the national judge takes these two issues for granted, there is no question sent to the CJEU on whether the American regime is either within the scope of the Directive, or violates the minimum standards of adequacy which the EU can accept as regards third states. But both these issues are absolutely essential in the debate over the post-Snowden relationship between the US and EU. It would therefore be desirable if the CJEU addressed them nonetheless.

Next, another problematic issue here is which set of EU data protection rules should apply: the external transfer rules, or the more stringent standard rules? The national court, along with the data protection authority, applies the external transfer rules, given Facebook’s certification under the Safe Harbour system. However, it is doubtful whether this is correct.

As is well known, in the recent Google Spain judgment, the CJEU ruled that the standard rules applied to Google’s search engine function, given that it had an ‘establishment’ in Spain, according to the Court’s interpretation of the rules. As I then argued on this blog, it probably follows from that judgment that the standard rules apply at least to some social networks like Facebook. In any event, the issue will arise again when the revised jurisdiction and external transfer rules, mentioned above, apply. However, the complainant and the national court assume that the external transfer rules apply. Perhaps the CJEU should also examine this issue of its own motion.

Another problematic issue is the question of how to challenge the inadequacy of data protection in practice in the US, which is the subject of the only question sent to the CJEU. The Safe Harbour agreement addresses this point directly, since it allows national data protection authorities to suspend data transfers as regards an individual company, in accordance with existing national law, if either the US government or the US enforcement system has found a violation of that agreement, or if:

there is a substantial likelihood that the Principles are being violated; there is a reasonable basis for believing that the enforcement mechanism concerned is not taking or will not take adequate and timely steps to settle the case at issue; the continuing transfer would create an imminent risk of grave harm to data subjects; and the competent authorities in the Member State have made reasonable efforts under the circumstances to provide the organisation with notice and an opportunity to respond.

However, Irish national law does not provide for such a system, but simply sets out an irrebuttable presumption that the Commission’s adequacy decision is sufficient. This rule may well have played a part in convincing Facebook and the subsidiaries of other US companies to set up in Ireland in the first place.

The challenge argued that the national data protection authority nevertheless had to exercise such powers, and so the national judge asked only whether this was possible. Logically, there can be only one answer, by extension from the NS judgment: Member States cannot create an irrebuttable presumption that prevents the exercise of Charter rights, so the national data protection authority must have the powers in question.

In the alternative, or arguably additionally, it must be possible to challenge the validity of the Commission’s adequacy decision in the national courts, which would then have an obligation, if they thought that challenge was well-founded, to send questions on that point to the CJEU. (See the Foto-Frost judgment).

The next problematic issue is the role of the national constitutional protection for human rights. Clearly the national judge believes that the American system breaches the protection for the right to privacy guaranteed in the Irish constitution. Nevertheless, the national court proceeds to examine the issue primarily from the perspective of EU law. So if the CJEU rules against the challenge to the American law on the merits, or does not address those merits for procedural reasons, should the national court proceed to apply Irish law?

In principle, national constitutional law cannot apply here, since EU law, as the national court recognises, has extensively harmonised this issue. This means that, according to the Melloni judgment of the CJEU, only the EU’s human rights standards, in the form of the Charter, can apply. National constitutional standards cannot. But national courts in Ireland (and elsewhere) might be unwilling to accept that outcome.

National law would only apply if the CJEU rules that this issue falls entirely outside the scope of the Directive, as discussed above. If, on the other hand, the processing falls within a public security derogation from the Directive, the EU Charter would apply, by analogy with the CJEU’s recent judgment in Pfleger (discussed here), in which it ruled that the Charter applies to national derogations from EU free movement law. This parallels the argument (discussed here) that national data retention law falls within the scope of EU law, following the Digital Rights judgment, because it is a derogation from the EU’s e-privacy Directive.

Finally, the consequences of any future finding by the national data protection authority that transfers under the Safe Harbour decision must be suspended as regards Facebook must be considered. Assuming that the US had not changed its law in the meantime, Facebook would have a dilemma: should it comply with its US legal obligations, or face the suspension of transfers of data from Europe? Possibly it could avoid this dilemma by ensuring that it only processed EU residents’ data within the EU, potentially avoiding the scope of US law. But this might be expensive, and in any event the US might seek to extend the scope of its law to cover such cases. These issues would inevitably arise for other major US companies as well.

Any real prospect that Facebook transfers from the EU might be blocked would cause a major earthquake in EU/US relations, making the concerns about the recent Google Spain judgment look like a minor tremor. It may be that the only solution is for the US to take more seriously its ongoing discussions with the EU on data protection issues, with a view to reaching a solution that reconciles its security concerns with the basic principles of privacy protection.

Europe v Facebook: the beginning of the end for NSA spying on EU citizens?

Originally published on EU LAW ANALYSIS
Wednesday, 18 June 2014

by Steve Peers

Since Edward Snowden's revelations about the extent of spying by the American National Security Agency (NSA), doubts have increased about the adequacy of the data protection regime in the United States, in particular as regards its impact on EU citizens, who are subject to the more favourable regime established by the Data Protection Directive. One aspect of these doubts concerns the ability of the NSA to examine the content of communications processed by social media companies based in the USA, such as Facebook.

Today’s decision by the Irish High Court to send questions in the ‘Europe v Facebook’ case to the CJEU raises the possibility that the NSA’s access to EU citizens’ personal data might soon come to an end. But it’s not clear if the CJEU will address the most essential issues directly, because the case raises a number of complex legal issues that need to be examined in more detail.

As a starting point, the basic legal regime governing transfers to Facebook is the ‘Safe Harbour’ system, which takes the form of a Commission Decision finding that all American companies certifying their participation in a system for complying with basic data protection principles maintain an ‘adequate’ level of data protection. This is one of the ‘adequacy decisions’ that the Commission can make pursuant to the rules in the data protection Directive on transfers of personal data outside the EU (see further my recent blog post on the planned reforms to this system). Despite the doubts arising from the Snowden revelations, the Commission’s most recent report on the Safe Harbour system did not suggest that the system should be abandoned.

Not everyone accepts these assertions, however. An Austrian citizen, Mr. Schrems, complained about the transfer of his personal data as a Facebook user pursuant to the Safe Harbour rules to the Irish data protection authority, which was competent in this matter because Facebook has a subsidiary in Ireland. The national authority argued that it could not take a decision on this complaint, because it was bound by the Commission’s decision. Moreover, it argued that the complaint was ‘frivolous’.

Mr. Schrems then challenged the authority’s decision before the Irish High Court. In its ruling today, the national judge therefore decided to send a question to the CJEU. Essentially, the question is whether the national data protection authority is bound by the Commission’s Decision, and whether that authority can conduct its own examination.

The first obvious question in this case is whether the American system infringes EU data protection law. Basing itself on the recent Digital Rights judgment of the CJEU, in which that Court ruled that the EU’s data retention Directive was invalid, the national court clearly believes that it does. While acknowledging the important anti-terrorist objectives of the law, the judge, when examining national constitutional law, states that it is ‘very difficult’ to see how such mass surveillance ‘could pass any proportionality test or survive any constitutional scrutiny’. Indeed, such surveillance has ‘gloomy echoes’ of the mass surveillance carried out in ‘totalitarian states such as the [East Germany] of Ulbricht and Honecker’.

The judge equally believes that the US system is a violation of EU law, with no adequate or accessible safeguards available to EU citizens, and no consideration of EU law issues built into the review process that does exist.

Is this analysis correct? There are two fundamental issues here which the national court doesn’t consider: the scope of the data protection Directive, and the derogations from that Directive. On the question of scope, the CJEU previously found in its Passenger Name Records (PNR) judgment that the EU/US agreement which provided for the transfer of data from airlines to the US authorities was outside the scope of the data protection Directive, because it regulated essentially only the activities of law enforcement authorities, and the Directive does not apply to the ‘processing of personal data … in the course of an activity which falls outside the scope’ of EU law, ‘such as … public security, defence, State security … and … criminal law’. On the other hand, the CJEU ruled that the data retention Directive was correctly based on the EU’s internal market powers, since it essentially regulated the activity of private industry, albeit for public security objectives. While in this case it might be argued that the American law in question falls within the first type of law, the Safe Harbour agreement clearly falls within the second. So it is a sort of hybrid question, but on balance the issue falls within the scope of the Directive, since the measure at issue is essentially the Safe Harbour agreement.

Secondly, the external transfer rules in the EU Directive do not refer expressly to the issue of derogations from data protection rights on public security grounds. Yet presumably some such derogations can exist, given that the Directive itself provides for public security derogations as regards the standard EU rules. Surely the security exceptions applied by third countries don’t have to be exactly the same as those applied by the Directive. But some form of minimum standard must apply. For the reasons set out by the national judge, however, there is a strong argument that the US rules fall below the standard of anything which the EU can accept as ‘adequate’.

Because the national judge takes these two issues for granted, there is no question sent to the CJEU on whether the American regime is either within the scope of the Directive, or violates the minimum standards of adequacy which the EU can accept as regards third states. But both these issues are absolutely essential in the debate over the post-Snowden relationship between the US and EU. It would therefore be desirable if the CJEU addressed them nonetheless.

Next, another problematic issue here is which set of EU data protection rules should apply: the external transfer rules, or the more stringent standard rules? The national court, along with the data protection authority, applies the external transfer rules, given Facebook’s certification under the Safe Harbour system. However, it is doubtful whether this is correct.

As is well known, in the recent Google Spain judgment, the CJEU ruled that the standard rules applied to Google’s search engine function, given that it had an ‘establishment’ in Spain, according to the Court’s interpretation of the rules. As I then argued on this blog, it probably follows from that judgment that the standard rules apply at least to some social networks like Facebook. In any event, the issue will arise again when the revised jurisdiction and external transfer rules, mentioned above, apply. However, the complainant and the national court assume that the external transfer rules apply. Perhaps the CJEU should also examine this issue of its own motion.

Another problematic issue is the question of how to challenge the inadequacy of data protection in practice in the US, which is the subject of the only question sent to the CJEU. The Safe Harbour agreement addresses this point directly, since it allows national data protection authorities to suspend data transfers as regards an individual company, in accordance with existing national law, if either the US government or the US enforcement system has found a violation of that agreement, or if:

there is a substantial likelihood that the Principles are being violated; there is a reasonable basis for believing that the enforcement mechanism concerned is not taking or will not take adequate and timely steps to settle the case at issue; the continuing transfer would create an imminent risk of grave harm to data subjects; and the competent authorities in the Member State have made reasonable efforts under the circumstances to provide the organisation with notice and an opportunity to respond.

However, Irish national law does not provide for such a system, but simply sets out an irrebuttable presumption that the Commission’s adequacy decision is sufficient. This rule may well have played a part in convincing Facebook and the subsidiaries of other US companies to set up in Ireland in the first place.
The challenger argued that the national data protection authority nevertheless had to exercise such powers, and so the national judge asked only whether this was possible. Logically, there can be only one answer, by extension from the NS judgment: Member States cannot create an irrebuttable presumption that prevents the exercise of Charter rights, so the national data protection authority must have the powers in question.

In the alternative, or arguably additionally, it must be possible to challenge the validity of the Commission’s adequacy decision in the national courts, which would then have an obligation, if they thought that challenge was well-founded, to send questions on that point to the CJEU. (See the Foto-Frost judgment).

The next problematic issue is the role of national constitutional protection of human rights. Clearly the national judge believes that the American system breaches the protection of the right to privacy guaranteed in the Irish constitution. Nevertheless, the national court proceeds to examine the issue primarily from the perspective of EU law. So if the CJEU rules against the challenge to the American law on the merits, or does not address those merits for procedural reasons, should the national court proceed to apply Irish law?

In principle, national constitutional law cannot apply here, since EU law, as the national court recognises, has extensively harmonised this issue. This means that, according to the Melloni judgment of the CJEU, only the EU’s human rights standards, in the form of the Charter, can apply. National constitutional standards cannot. But national courts in Ireland (and elsewhere) might be unwilling to accept that outcome.

National law would only apply if the CJEU rules that this issue falls entirely outside the scope of the Directive, as discussed above. If, on the other hand, the processing falls within a public security derogation from the Directive, the EU Charter would apply, by analogy with the CJEU’s recent judgment in Pfleger (discussed here), in which it ruled that the Charter applies to national derogations from EU free movement law. This parallels the argument (discussed here) that national data retention law falls within the scope of EU law, following the Digital Rights judgment, because it is a derogation from the EU’s e-privacy Directive.

Finally, consider the consequences of any future finding by the national data protection authority that transfers to Facebook under the Safe Harbour decision must be suspended. Assuming that the US had not changed its law in the meantime, Facebook would have a dilemma: should it comply with its US legal obligations, or face the suspension of transfers of data from Europe? Possibly it could avoid this dilemma by ensuring that it only processed EU residents’ data within the EU, potentially avoiding the scope of US law. But this might be expensive, and in any event the US might seek to extend the scope of its law to cover such cases. These issues would inevitably arise for other major US companies as well.

Any real prospect that Facebook transfers from the EU might be blocked would cause a major earthquake in EU/US relations, making the concerns about the recent Google Spain judgment look like a minor tremor. It may be that the only solution is for the US to take more seriously its ongoing discussions with the EU on data protection issues, with a view to reaching a solution that reconciles its security concerns with the basic principles of privacy protection.

Data Protection after Lisbon and the Charter: with the “Google” ruling the CJEU deals with possible abuses by private companies…

ORIGINALLY PUBLISHED ON EU LAW ANALYSIS

ORIGINAL TITLE: The CJEU’s Google Spain judgment: failing to balance privacy and freedom of expression
By Steve Peers

The EU’s data protection Directive was adopted in 1995, when the Internet was in its infancy, and most or all Internet household names did not exist. In particular, the first version of the code for Google search engines was first written the following year, and the company was officially founded in September 1998 – shortly before Member States’ deadline to implement the Directive.
Yet, pending the completion of negotiations for a controversial revision of the Directive proposed by the Commission, this legislation remains applicable to the Internet as it has developed since. Many years of controversy as to whether (and if so, how) the Directive applies to key elements of the Web, such as social networks, search engines and cookies, have culminated today in the CJEU’s judgment in Google Spain, which concerns search engines.

The background to the case, as further explained by Lorna Woods, concerns a Spanish citizen who no longer wanted an old newspaper report on his financial history (concerning social security debts) to be available via Google. Of course, the mere fact that he has brought this legal challenge likely means that the details of his financial history will become known even more widely – much as many thousands of EU law students have memorised the name of Mr. Stauder, who similarly brought a legal challenge with a view to keeping his financial difficulties private, resulting in the first CJEU judgment on the role of human rights in EU law.

The Court’s judgment