The Meijers Committee on the inter-parliamentary scrutiny of Europol

ORIGINALLY PUBLISHED ON THE MEIJERS COMMITTEE PAGE HERE

  1. Introduction

Article 88 TFEU provides for a unique form of scrutiny of the functioning of Europol. It lays down that the [regulations on Europol] shall also lay down the procedures for scrutiny of Europol’s activities by the European Parliament, together with national Parliaments.

Such a procedure is now laid down in Article 51 of the Europol Regulation (Regulation (EU) 2016/794), which provides for the establishment of a “specialized Joint Parliamentary Scrutiny Group (JPSG)” that will play the central role in ensuring this scrutiny. The Europol Regulation applies from 1 May 2017.

Article 51 of the Europol Regulation also closely relates to Protocol No 1 to the Lisbon Treaty on the role of national parliaments in the EU. Article 9 of that protocol provides: “The European Parliament and national Parliaments shall together determine the organization and promotion of effective and regular inter-parliamentary cooperation within the Union.”

Article 51(2) not only lays down the basis for the political monitoring of Europol’s activities (the democratic perspective), but also stipulates that, “in fulfilling its mission”, the JPSG should pay attention to the impact of the activities of Europol on the fundamental rights and freedoms of natural persons (the perspective of the rule of law).

The Meijers Committee takes the view that improving the inter-parliamentary scrutiny of Europol, with appropriate involvement of both the national and the European levels, will in itself enhance the attention paid by Europol to the perspectives of democracy and the rule of law, and in particular to fundamental rights protection. It will raise Europol’s alertness to these perspectives.

Moreover, the scrutiny mechanism could pay specific attention to the fundamental rights protection within Europol. This is particularly important in view of the large amounts of – often sensitive – personal data processed by Europol and exchanged with national police authorities of Member States and also with authorities of third countries.

The practical implementation of Article 51 is currently being debated, e.g. in the inter-parliamentary committee of the European Parliament and national parliaments.1 As specified in Article 51(1) of the Europol Regulation, the organization and the rules of procedure of the JPSG are to be determined.

The Meijers Committee wishes to engage in this debate and, in this note, makes recommendations on the organization and rules of procedure.

  2. Context


TELE2 SVERIGE AB AND WATSON ET AL: CONTINUITY AND RADICAL CHANGE

ORIGINALLY PUBLISHED ON EUROPEAN LAW BLOG (JANUARY 12, 2017)
By Orla Lynskey

 

Introduction

The CJEU delivered its judgment in Tele2 Sverige AB and Watson on 21 December 2016. The Court had been asked by a Swedish and a British court respectively to consider the scope and effect of its previous judgment in Digital Rights Ireland (discussed here). The judgment reflects continuity in so far as it follows in the line of this and earlier judgments taking a strong stance on data protection and privacy. Yet, the degree of protection it offers these rights over competing interests, notably security, is radical. In particular, the Court unequivocally states that legislation providing for general and indiscriminate data retention is incompatible with the E-Privacy Directive, as read in light of the relevant EU Charter rights. While the judgment was delivered in the context of the E-Privacy Directive, the Court’s reasoning could equally apply to other EU secondary legislation or programmes interpreted in light of the Charter. This judgment will be a game-changer for state surveillance in Europe and while it offered an early Christmas gift to privacy campaigners, it is likely to receive a very mixed reaction from EU Member States. While national data retention legislation has been annulled across multiple Member States (Bulgaria, Czech Republic, Cyprus, Germany and Romania), this annulment has been based on an assessment of the proportionality of the relevant measures rather than on a finding that blanket retention is per se unlawful. For those familiar with the facts and findings, skip straight to the comment below.

Facts

The preliminary ruling stems from two Article 267 TFEU references regarding the interpretation of the Court’s judgment in Digital Rights Ireland (henceforth DRI). The first, Tele2 Sverige AB, was a Swedish reference resulting from the refusal by Tele2 Sverige (a Swedish electronic communications provider) to continue to retain electronic communications data following the finding in DRI that the Data Retention Directive was invalid. A dispute regarding the interpretation of DRI ensued and the Swedish Justice Minister commissioned a report to assess the compatibility of Swedish law with EU law and the ECHR. This report concluded that DRI could not be interpreted as prohibiting general and indiscriminate data retention as a matter of principle, or as establishing criteria – all of which must be fulfilled – in order for legislation to be deemed proportionate. Rather, it held that it was necessary to conduct an assessment of all the circumstances in order to determine the compatibility of Swedish legislation with EU law. Tele2 Sverige maintained that the report was based on a misinterpretation of DRI. Given these differing perspectives, the referring court asked the Court to give ‘an unequivocal ruling on whether…the general and indiscriminate retention of electronic communications data is per se incompatible with Articles 7 and 8 and 52(1) of the Charter’ [50].

The second preliminary reference (Watson) arose before the Court of Appeal in the context of applications for judicial review of the UK’s Data Retention and Investigatory Powers Act (DRIPA) on the grounds that this Act was incompatible with the EU Charter and the ECHR. It was disputed before the national court whether DRI laid down ‘mandatory requirements of EU law’ that national legislation for communications data retention and access must respect. The domestic referring court suggested that it was appropriate to distinguish between legislation governing retention, and legislation governing access. DRI was confined to an assessment of the former as it assessed the validity of the Data Retention Directive, which excluded provisions relating to data access. The latter, provisions on data access, must be subject to a distinct validity assessment in light of their differing context and objectives, according to the referring court. The Court of Appeal did not however deem the answer to this question obvious, given that six courts in other EU Member States had declared national legislation to be invalid on the basis of DRI. It therefore asked the Court to consider whether, firstly, DRI lays down mandatory requirements of EU law that would apply to the regime governing access to retained data at national level. It also asked whether DRI expands the scope of the Charter rights to data protection and privacy beyond the scope of Article 8 ECHR. The Watson reference was dealt with pursuant to the expedited procedure provided for in Article 105(1) of the Court’s Rules of Procedure and joined to the Tele2 Sverige reference for oral arguments and judgment.

Findings of the Court

The Scope of the E-Privacy Directive

The Court examined, as a preliminary point, whether national legislation on retention and access to data fell within the scope of the E-Privacy Directive. Article 15(1) of that Directive provides for restrictions to certain rights it provides for when necessary for purposes such as national security and the prevention, investigation, detection and prosecution of criminal offences. Article 15(1) also allows for the adoption of data retention legislation by Member States. However, Article 1(3) of that Directive states that the Directive will not apply to, amongst others, ‘activities concerning public security, defence, State security (…) and the activities of the State in areas of criminal law’. There is thus an apparent internal inconsistency within the Directive.

To guide its findings, the Court had regard to the general structure of the Directive. While the Court acknowledged that the objectives pursued by Articles 1(3) and 15(1) overlap substantially, it held that Article 15(1) of the Directive would be deprived of any purpose if the legislative measures it permits were excluded from the scope of the Directive on the basis of Article 1(3) [73]. Indeed, it held that Article 15(1) ‘necessarily presupposes’ that the national measures referred to therein fall within the scope of that directive ‘since it expressly authorizes the Member States to adopt them only if the conditions laid down in the directive are met’. [73]. In order to support this finding, the Court suggests that the legislative measures provided for in Article 15(1) apply to providers of electronic communications services [74] and extend to measures requiring data retention [75] and access to retained data by national authorities [76]. It justifies this final claim – that the E-Privacy Directive includes data access legislation – on the (weak) grounds that recital 21 of the directive stipulates that the directive’s aim is to protect confidentiality by preventing unauthorised access to communications, including ‘any data related to such communications’ [77]. The Court emphasises that provisions on data access must fall within the scope of the Directive as data is only retained for the purpose of access to it by competent national authorities and thus national data retention legislation ‘necessarily entails, in principle, the existence of provisions relating to access by the competent national authorities to the data retained’ [79]. The Court also noted that the Directive requires providers to establish internal procedures for responding to requests for access based on the relevant provisions of national law [80].

The compatibility of ‘general and indiscriminate’ data retention with EU law

The Court then moved on to consider the most important substantive point in the judgment: the compatibility of ‘general and indiscriminate’ data retention with the relevant provisions of EU law. It began by recalling that the E-Privacy Directive’s overarching aim is to offer users of electronic communications services protection against the risks to fundamental rights brought about by technological advances [83]. It emphasised, in particular, the general principle of confidentiality of communications in Article 5(1) of the Directive and the related safeguards for traffic data and location data (in Articles 6 and 9 respectively), [85-87]. While the Court acknowledged that Article 15(1) of the Directive allows for exceptions to these principles by restricting their scope, it held that this provision must be interpreted strictly. It clearly stated that Article 15(1) cannot permit the exception to the Directive’s confidentiality obligation to become the rule, as this would render the confidentiality obligation meaningless [89].

The Court also emphasised that according to Article 15(1)’s wording it must be interpreted in light of general principles of EU law, thus including the fundamental rights in the EU Charter [91]. The Court noted, with reference to its previous case-law, the importance of the fundamental rights engaged in the current context, namely the right to privacy (Article 7), the right to data protection (Article 8) and the right to freedom of expression (Article 11) ([92]-[93]). The limitations on the exercise of these Charter rights are echoed in the E-Privacy Directive, recital 11 of which states that measures derogating from its principles must be ‘strictly’ proportionate to the intended purpose, while Article 15(1) itself specifies that data retention should be ‘justified’ by reference to one of the  objectives stated in Article 15(1) and be for a ‘limited period’ [95]. In considering whether national legislation complies with these requirements of strict necessity, the Court observed that ‘the legislation provides for a general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication’ and that the retention obligation on providers is ‘to retain the data systematically and continuously, with no exceptions’ [97].

Having established the scope of the retention obligation, the Court emphasised the revealing nature of this data and recalled its finding in DRI that the data ‘taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained’ [98]. The Court also stated that the data provides the means of profiling the individual concerned and – importantly – that the information is ‘no less sensitive having regard to the right to privacy, than the actual content of the communications’ [99]. The Court held that general and indiscriminate data retention legislation entailed a particularly serious interference with the rights to privacy and data protection and that the user concerned is, as a result, likely to feel that their private lives are the subject of constant surveillance [100]. It could also, according to the Court, affect the use of means of electronic communication and thus the exercise by users of their freedom of expression [101]. The Court therefore held that only the objective of fighting serious crime could justify national data retention legislation [102].

While the Court acknowledged that the fight against serious crime may depend on modern investigative techniques for its effectiveness, this objective cannot in itself justify the finding that general and indiscriminate data retention legislation is necessary for this fight against crime [103]. It noted in particular that such legislation applies to persons for whom ‘there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences’ and that no exception is made for those whose communications are subject to professional secrecy [106]. As a result of these failings, the Court held that the national legislation exceeds the limits of what is strictly necessary and cannot be considered justified under Article 15(1), read in light of the Charter [107].

The Court did not go so far as to deem all data retention unlawful, however. It highlighted that Article 15(1) does not prevent a Member State from introducing legislation that would facilitate targeted retention of traffic and location data for the preventive purpose of fighting serious crime. Such legislation must however be limited to what is strictly necessary in terms of the categories of data retained, the means of communication affected, the persons concerned and the period of retention [108]. In particular, such legislation should indicate ‘in what circumstances and under which conditions’ a data retention measure could be adopted as a preventive measure [109]. The Court also emphasised that while the precise contours may vary, data retention should meet objective criteria that establish a connection between the data to be retained and the objective pursued [110]. The national legislation must therefore be evidence-based: this objective evidence should make it possible to ‘identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences’ [111].

Mandatory Requirements of DRI?

Having established the incompatibility of generalised data retention legislation with EU law, the Court then went on to consider whether EU law precludes national data retention and access legislation if that legislation:

  • does not restrict access solely to the objective of fighting serious crime;
  • does not require access to be subject to prior review by a court or an independent body; and
  • does not require that the data be retained within the EU [114].

The Court reiterated an earlier finding that access to retained data must be for one of the exhaustive objectives identified in Article 15(1) of the E-Privacy Directive, and that only the objective of fighting serious crime would justify access to retained data [115]. Such legislation must also set out clear and precise rules indicating when and how competent national authorities should be granted access to such data [117]. The Court also held that national legislation must set out the substantive and procedural conditions governing access based on objective criteria [118-119]. Such access can, ‘as a general rule’ be granted only ‘to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime’ [119]. Access to the data of others might exceptionally be granted where, for instance, vital national interests are threatened by terrorist activities, if there is objective evidence that access to such data could make an effective contribution to combating such activities [119]. As a result, access to retained data should, with the exception of cases of validly established urgency, be subject to a prior review by a court or an independent administrative authority at the request of the competent national authorities [120]. These competent national authorities must also notify the persons affected by the data access, under the applicable national procedures, as soon as such notification no longer jeopardises the investigations. The Court highlighted that such notice is necessary to enable these individuals to exercise their right to a legal remedy pursuant to the Directive and EU data protection law [121].

On the issue of data security, the Court held that Article 15(1) does not allow Member States to derogate from the Directive’s data security provisions, which require providers to take appropriate technical and organisational measures to ensure the effective protection of retained data. The Court held that a particularly high level of data security was appropriate given the quantity and nature of the data retained and the riskiness of this operation. It therefore held that the national legislation must provide for the data to be retained within the EU, and for the irreversible destruction of the data at the end of the data retention period [122]. Member States must also ensure that an independent authority reviews compliance with EU law, as such independent control of data protection compliance is an essential element of the right to data protection set out in Article 8(3) Charter. The Court emphasised the link between such independent supervision and the availability of a legal remedy for data subjects [123]. The Court therefore concluded that national legislation that did not comply with these conditions would be precluded pursuant to Article 15(1) as read in light of the Charter [125]. However, it was for the relevant national courts to examine whether such conditions were satisfied in the present case [124].

Finally, in relation to the UK Court of Appeal’s query regarding the relationship between the EU Charter rights to data protection and privacy and Article 8 ECHR, the Court held that the answer to this question would not affect the interpretation of the E-Privacy Directive and thus did not matter in these proceedings [131]. It recalled its settled case-law that the preliminary reference procedure serves the purpose of effectively resolving EU law disputes rather than providing advisory opinions or answering hypothetical questions [130]. This did not however prevent it from offering a sneak preview of its thinking on this matter. It emphasised that, as the EU has not acceded to the ECHR, the ECHR does not constitute a formally incorporated element of EU law. It did however note that Article 52(3) of the Charter seeks to ensure consistency between the Charter and the ECHR without adversely affecting the autonomy of EU law. EU law is not therefore precluded from providing more extensive protection than the ECHR. The Court added that Article 8 of the Charter concerns a fundamental right which is distinct from that enshrined in Article 7 and which has no equivalent in the ECHR. Therefore, while the Court did not answer the question of which offered a wider scope of protection, it did confirm the distinctiveness of these two rights.

Comment

The Tele2 judgment represents a rupture with the past in one very significant way: the Court, for the first time, unequivocally states that blanket data retention measures are incompatible with EU law, read in light of the Charter. This radical finding is likely to receive a mixed reaction. For instance, in the UK some will lament that this judgment comes too late to have influenced the passage into law of the UK’s new data retention legislation, the Investigatory Powers Act 2016. This legislation – which allows for bulk interception and hacking, amongst other things – should now be found to be incompatible with EU law, with all of the post-Brexit implications for ‘adequacy’ this may entail (also here). Others, such as the UK’s Independent Reviewer of Terrorism Legislation – David Anderson QC – have expressed regret. Anderson QC suggests that:

‘Precisely because suspects are often not known in advance, data retention which is not universal in its scope is bound to be less effective as a crime reduction measure.  In addition, a person whose data has not been retained cannot be exonerated by use of that data (e.g. by using location data to show that the person was elsewhere).’

The Advocate General (here; and commentary here) had similarly noted that data retention could help competent authorities ‘examine the past’ [AG, 178]. He had refused to declare general retention measures per se unlawful, preferring instead to assess the compatibility of data retention legislation against strict proportionality requirements [AG, 116]. His approach could therefore be said to be more nuanced and systematic than that of the Court. While examining proportionality stricto sensu he concluded that it would be for national courts to weigh the benefit of ‘examining the past’ with the potential it would provide for authorities to abuse this power by using metadata to catalogue entire populations, noting that evidence of abuses had been put before the Court [AG, 259-260]. This evidence before the Court might help to refute the critique that the Court should have focused on the actual harm of communications metadata retention ‘and sought to avoid assertions based on theory or informal predictions of popular feeling’.

Blanket retention was not the only important point on which the Court and the Advocate General departed. The Advocate General explicitly claimed that DRI set out mandatory requirements [AG, 221] while the Court did not. The Advocate General was also more stringent than the Court by requiring that data be retained in the relevant Member State [AG, 241], while the Court opted for the marginally more realistic requirement that data be retained in the EU. The Advocate General did not, however, consider Article 15(1) a derogation from the E-Privacy Directive (and therefore not a provision requiring strict interpretation); the Court did not engage with his elaborate reasoning on this point [AG, 106-115]. The Court did however confirm that competent national authorities must notify persons affected by data access as soon as such notification no longer jeopardises the investigation [121]. This significant procedural right is likely to play an important role in acting as a check on abusive access requests.

Perhaps the only fly in the ointment for the digital rights groups that intervened before the Court is the Court’s seemingly uncritical endorsement of geographic and group profiling. It does this when it emphasises that there should be a relationship between the data retained and the threat, for instance when the data pertains to a ‘geographic area’ [108]. The ethical and social issues such profiling may entail would require further consideration. The Court appears to recognise this by suggesting that such profiling would need to be strictly evidence-based ([111]). Should generalised retention measures be replaced by ad hoc location-based retention measures, the legality of the latter would itself be the subject of much controversy.

Threat to Human Rights? The new e-Privacy Regulation and some thoughts on Tele2 and Watson

ORIGINALLY PUBLISHED ON EU LAW ANALYSIS

by Matthew White, PhD candidate, Sheffield Hallam University

Introduction

In a follow-up to last Christmas’s post, on 10 January 2017 the European Commission released the official version of the proposed Regulation on Privacy and Electronic Communications (e-Privacy Regs). Just as the last post focused on the particular aspect of data retention, so will this one.

Just as in the earlier leaked version, the proposal does not include any specific provisions in the field of data retention (para 1.3). That paragraph continues that Member States are free to keep or create national data retention laws, provided that they are ‘targeted’ and that they comply with European Union (EU) law, taking into account the case-law of the Court of Justice of the European Union (CJEU) and its interpretation of the e-Privacy Directive and the Charter of Fundamental Rights (CFR). Regarding the CJEU’s interpretation, the proposal specifically refers to Joined Cases C-293/12 and C-594/12 Digital Rights Ireland and Seitlinger and Others, and Joined Cases C-203/15 and C-698/15 Tele2 Sverige AB and Secretary of State for the Home Department. Aspects of the latter case are the focus of this post; the case itself has been thoroughly discussed by Professor Lorna Woods.

So, when is the essence of the right adversely affected?

Before discussing certain aspects of Tele2 and Watson, it is first important to draw attention to the provision which enables data retention in the new e-Privacy Regs. Article 11 allows the EU or its Member States to restrict the rights contained in Articles 5-8 (confidentiality of communications, permissions on processing, storage and erasure of electronic communications data, and protection of information stored in and related to end-users’ terminal equipment). From Article 11, it is clear that this can include data retention obligations, so long as they respect the essence of the rights and are necessary, appropriate and proportionate. In Tele2 and Watson the CJEU noted that any limitation of rights recognised by the CFR must respect the essence of said rights [94]. The CJEU accepted the Advocate General (AG)’s view that data retention creates an interference as serious as interception, and that the risks associated with access to communications data may be greater than those associated with access to the content of communications [99]. Yet the CJEU were reluctant to hold that data retention (and access to retained data) adversely affects the essence of those rights [101]. This appears to highlight a problem in the CJEU’s reasoning: if the CJEU, like the AG, accept that retention of and access to communications data is at least on a par with access to the content, it makes little sense to then be reluctant to hold that data retention adversely affects the essence of those rights. The CJEU does so without drawing any distinction or giving any reasoning for this differential treatment, which serves to highlight that perhaps the CJEU themselves do not fully respect the essence of those rights in the context of data retention.

The CJEU’s answer seems limited only to catch-all powers

The thrust of the CJEU’s judgment in Tele2 and Watson was that general and indiscriminate data retention obligations are prohibited at an EU level. But as I have highlighted previously, the CJEU’s answer was only in response to a very broad question from Sweden, which asked:

[A] general obligation to retain traffic data covering all persons, all means of electronic communication and all traffic data without any distinctions, limitations or exceptions for the purpose of combating crime…compatible with [EU law]?

Therefore, provided that national laws do not provide for the capturing of all data of all subscribers and users for all services in one fell swoop, they may be argued to be compatible with EU law. Both the e-Privacy Regs and the CJEU refer to ‘targeted’ retention [108, 113]. The CJEU gave an example of geographical criteria for retention, in relation to which David Anderson Q.C. asks whether the CJEU meant that ‘it could be acceptable to perform “general and indiscriminate retention” of data generated by persons living in a particular town, or housing estate, whereas it would not be acceptable to retain the data of persons living elsewhere?’ This is entirely possible given the reference from Sweden and the answer from the CJEU. In essence the CJEU have permitted discriminatory general and indiscriminate data retention which would in any event respect the essence of those rights.

Data retention is our cake, and only we can eat it

A final point on Tele2 and Watson is that the CJEU held that national laws on data retention are within the scope of EU law [81]. This by itself may not raise any concerns about protecting fundamental rights, but it is what the CJEU rules later on in the judgment that may be of concern. The CJEU held that the interpretation of the e-Privacy Directive (and therefore national Member State data retention laws) “must be undertaken solely in the light of the fundamental rights guaranteed by the Charter” [128]. The CJEU has seemingly given itself exclusive competence to determine how rights are best protected in the field of data retention. It is clear from the subsequent paragraph that the CJEU seeks to protect the autonomy of EU law above anything else, even fundamental rights [129]. This is despite the fact that the ECHR forms part of the general principles of EU law and is mentioned in Article 15(1) (which refers to Article 6(3) of the Treaty on European Union (TEU), itself specifically referring to the ECHR). Article 11 of the e-Privacy Regs refers to restrictions respecting the ‘essence of fundamental rights and freedoms’, and only time will tell whether the CJEU would interpret this as referring only to the CFR. Recital 27 of the e-Privacy Regs, just like Recitals 10 and 30 of the e-Privacy Directive, refers to compliance with the ECHR, but as highlighted previously, Recitals are not legally binding.

Is the CJEU assuming too much?

A further concern is that, had the European Commission added general principles of EU law into Article 11, the CJEU might simply have ignored them, just as it has done in Tele2 and Watson. The problem with the CJEU’s approach is that it assumes that this judgment offers an adequate protection of human rights in this context. The ECHR has always been the minimum floor, but it appears the CJEU wants the CFR to be the ceiling, whether for national human rights protection or for protection guaranteed by the ECHR. What if that ceiling is lower than the floor? The AG in Tele2 and Watson stressed that the CFR must never offer protection inferior to the ECHR [141]. But as I have argued before, the EU jurisprudence on data retention is just that: it offers protection inferior to the ECHR, and the qualification by the CJEU in Tele2 and Watson does not alter this. This position is strengthened by Judge Pinto De Albuquerque in his concurring opinion in the European Court of Human Rights judgment in Szabó. He believed that:

[M]andatory third-party data retention, whereby Governments require telephone companies and Internet service providers to store metadata about their customers’ communications and location for subsequent law-enforcement and intelligence agency access, appeared neither necessary nor proportionate [6].

Of course, Judge Pinto De Albuquerque could have been referring to the type of third party data retention which requires Internet Service Providers (ISPs) to intercept data from Over The Top (OTT) services, but his description is more in line with data retention of services’ own users and subscribers.

Conclusions

Although the CJEU has prohibited general and indiscriminate data retention, it does not seem to have prevented targeted indiscriminate data retention. If the European Court of Human Rights (ECtHR) were ever to rule on data retention and follow its jurisprudence and the opinion of Judge Pinto De Albuquerque, this may put EU law in violation of the ECHR. This would ultimately put Member States in a damned-if-they-do, damned-if-they-don’t situation: comply with the ECHR and violate the autonomy of EU law, or comply with EU law and violate the ECHR. When the minimum standards of human rights protection in this context are not adhered to because of EU law, the ECHR should prevail. Anything less is a threat to human rights, which means that the (even if well intentioned) CJEU can be one too.

New ECJ ruling on data retention: Preservation of civil rights even in difficult times!

Originally published here on 22 December 2016

by Peter Schaar

Translation – for the German version see here.

The European Court of Justice has made a Christmas present to more than 500 million EU citizens. With its new judgment on data retention (C-203/15 of 21 December 2016), the highest court of the European Union stresses the importance of fundamental rights. All Member States are required to respect the rights enshrined in the European Charter of Fundamental Rights in their national legislation. The ECJ has issued a signal whose importance can hardly be overstated, given the current political discussions on internal and external threats and the strengthening of authoritarian political currents that provide the public with simplistic answers to difficult questions.

The ECJ remains true to itself

The ruling of the European Court of Justice is in line with its judgment of 8 April 2014, by which the Court annulled Directive 2006/24/EC on the retention of data. The general obligation to retain traffic and location data imposed by that Directive was not limited to what is strictly necessary and was therefore a disproportionate interference with the fundamental rights to respect for private life and protection of personal data (Articles 7 and 8 of the European Charter of Fundamental Rights).

Despite the annulment of the Data Retention Directive by the ECJ, several Member States have continued or even broadened their practice of data retention. The latter took place in Great Britain, where shortly after the ECJ ruling – in July 2014 – a new legal basis for data retention was passed, which even went beyond the annulled EC directive. With the so-called “Investigatory Powers Act”, the British Parliament intends to extend the existing obligations of compulsory data storage and the supervisory powers of the security authorities in the short term, to include web services, in particular transactions on social networks. On November 29, 2016, the upper and lower house agreed on a corresponding legal text, which is to enter into force soon after its formal approval by the Queen. In other Member States, too, there are legal requirements of varying scope which oblige providers of telecommunications and internet services to retain traffic and location data whose retention is not necessary for the provision or the billing of the respective service.

European Charter of Fundamental Rights binding on national legislatures

A Swedish and a British court had asked the ECJ to clarify whether the respective national regulations on the retention of data corresponded to the European legal requirements.

In its new ruling the ECJ answered this question by stating that national regulations which provide for the general and indiscriminate storage of data are not in line with EU law. A national regulation providing for the storage of traffic and location data is to be regarded as a serious interference with fundamental rights. Member States must not maintain or re-adopt rules which are based on, or even go beyond, an EU act which has been annulled on grounds of its fundamental illegality.

The provisions of EU law bind the national legislature. EU Directive 2002/58/EC on data protection in electronic communications (the ePrivacy Directive) has to be interpreted in the light of the Charter of Fundamental Rights. Exceptions to the protection of personal data must be limited to what is strictly necessary. This applies not only to the rules on data retention, but also to the access of authorities to the stored data. A national provision providing for general and indiscriminate data retention, which does not require any link between the data to be stored and a threat to public security, and in particular does not limit retention to a particular period and/or geographical area and/or group of persons likely to be involved in a serious criminal act, exceeds the limits of what is strictly necessary and cannot be regarded as justified in a democratic society. Laws of Member States that do not meet these requirements must be repealed or amended accordingly.

With regard to the contested British and Swedish laws, the competent national courts which had appealed to the ECJ are now required to enforce the ECJ ruling in substance. However, the parliaments and governments of the Member States are also responsible for reviewing and, where appropriate, correcting the relevant provisions of national law.

What happens to German data retention?

The implications of the ECJ ruling for the recently reintroduced German data retention regime must also be urgently examined. The retention obligations of the new German Data Retention Act are less extensive than those of the predecessor legislation, which was struck down by the Federal Constitutional Court in 2010. However, it is highly doubtful whether the requirements set by the ECJ are fulfilled by the new data retention act, since it obliges the telecommunications providers to store the data without any material restriction to a specific area or a particular risk situation.

That the Federal Government or the parliamentary groups backing it will carry out this examination in an objective manner appears highly unlikely in the light of the additional powers which they have recently decided to hand over to the security authorities. In the end, the Federal Constitutional Court will probably have to provide clarity once again.

Peter Schaar (21 December 2016)

Data retention and national law: the ECJ ruling in Joined Cases C-203/15 and C-698/15 Tele2 and Watson (Grand Chamber)

ORIGINALLY PUBLISHED ON EU LAW ANALYSIS

Lorna Woods, Professor of Internet Law, University of Essex

Introduction

Today’s judgment in these important cases concerns the acceptability, from a human rights perspective, of national data retention legislation maintained even after the striking down of the Data Retention Directive in Digital Rights Ireland (Joined Cases C-293/12 and C-594/12) (“DRI”) for being a disproportionate interference with the rights contained in Articles 7 and 8 of the EU Charter of Fundamental Rights (EUCFR). While situated in the context of the Privacy and Electronic Communications Directive (Directive 2002/58), the judgment sets down principles regarding the interpretation of Articles 7 and 8 EUCFR which will be applicable generally within the scope of EU law. It also has possible implications for the UK’s post-Brexit relationship with the EU.

Background and Facts

The Privacy and Electronic Communications Directive requires the confidentiality of communications, including the data about communications to be ensured through national law. As an exception it permits, under Article 15, Member States to take measures for certain public interest objectives such as the fight against terrorism and crime, which include requiring public electronic communications service providers to retain data about communications activity. Member States took very different approaches, which led to the enactment of the Data Retention Directive (Directive 2006/24) within the space for Member State action envisaged by Article 15.  With that directive struck down, Article 15 remained the governing provision for exceptions to communications confidentiality within the field harmonised by the Privacy and Electronic Communications Directive.  This left questions as to what action in respect of requiring the retention of data could be permissible under Article 15, as understood in the light of the EUCFR.

The cases in today’s judgment derive from two separate national regimes. The first, concerning Tele2, arose when – following the DRI judgment – Tele2 proposed to stop retaining the data specified under Swedish implementing legislation in relation to the Data Retention Directive. The second arose from a challenge to the Data Retention and Investigatory Powers Act 2014 (DRIPA) which had been enacted to provide a legal basis in the UK for data retention when the domestic regime implementing the Data Retention Directive fell as a consequence of the invalidity of that directive.  Both sets of questions referred essentially asked about the impact of the DRI reasoning on national regimes, and whether Articles 7 and 8 EUCFR constrained the States’ regimes.

The Advocate General handed down an opinion in July (noted here) in which he opined that while mass retention of data may be possible, it would only be so when adequate safeguards were in place.  In both instances, the conditions – in particular those identified in DRI – were not satisfied.

Judgment

Scope of EU Law

A preliminary question is whether the data retention, or the access to such data by police and security authorities, falls within EU law. While the Privacy and Electronic Communications Directive regulated the behaviour of communications providers generally, Article 1(3) of that Directive specifies that matters covered by Titles V and VI of the TEU at that time (e.g. public security, defence, State security) fall outside the scope of the directive, which the Court described as relating to “activities of the State”. Further, Article 15(1) permits the State to take some measures resulting in the infringement of the principle of confidentiality found in Article 5(1), which again “concern activities characteristic of States or State authorities, and are unrelated to fields in which individuals are active” [para 72]. While there seems to be overlap between Article 1(3) and Article 15(1), this does not mean that matters permitted on the basis of Article 15(1) fall outside the scope of the directive, as “otherwise that provision would be deprived of any purpose” [para 73].

In the course of submissions to the Court, a distinction was made between the retention of data (by the communications providers) and access to the data (by police and security services). Accepting this distinction would allow a line to be drawn between the two, with retention, as an activity of the commercial operator, falling within the scope of the Privacy and Electronic Communications Directive, and access, as an activity of the State, lying outside it. The Court rejected this analysis and held that both retention and access lay within the field of the Privacy and Electronic Communications Directive [para 76]. It argued that Article 5(1) guarantees the confidentiality of communications against the activities of third parties, whether they be private actors or state authorities. Moreover, the effect of the national legislation is to require the communications providers to give access to the state authorities, which in itself is an act of processing regulated by the Privacy and Electronic Communications Directive [para 78]. The Court also noted that the sole purpose of the retention is to be able to give such access.

Interpretation of Article 15(1)

The Court noted that the aim of the Privacy and Electronic Communications Directive is to ensure a high level of protection for data protection and privacy. Article 5(1) established the principle of confidentiality and that “as a general rule, any person other than the user is prohibited from storing, without the consent of the users concerned, the traffic data”, subject only to technical necessity and the terms of Article 15(1) (citing Promusicae) [para 85]. This requirement of confidentiality is backed up by the obligations in Articles 6 and 9 specifically dealing with restrictions on the use of traffic and location data. Moreover, Recital 30 points to the need for data minimisation in this regard [para 87]. So, while Article 15(1) permits exceptions, they must be interpreted strictly so that the exception does not displace the rule; otherwise the rule would be “rendered largely meaningless” [para 89].

As a result of this general orientation, the Court held that Member States may only adopt measures for the purposes listed in the first sentence of Article 15(1) and those measures must comply with the requirements of the EUCFR.  The Court, citing DRI (at paras 25 and 70), noted that in addition to Articles 7 and 8 EUCFR, Article 11 EUCFR – protecting freedom of expression – was also in issue. The Court noted the need for such measures to be necessary and proportionate and highlighted that Article 15 provided further detail in the context of communications whilst Recital 11 to the Privacy and Electronic Communications Directive requires measures to be “strictly proportionate” [para 95].

The Court then considered these principles in the light of the reference in Tele2 at paras 97 et seq of its judgment. Approving expressly the approach of the Advocate General on this point, it underlined that communications “data, taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained” and that such data is no less sensitive than content [para 99]. The interference in the view of the Court was serious and far-reaching in relation to Articles 7, 8 and 11. While Article 15 identifies combatting crime as a legitimate objective, the Court – citing DRI – limited this so that only the fight against serious crime could be capable of justifying such intrusion. Even the fight against terrorism “cannot in itself justify that national legislation providing for the general and indiscriminate retention of all traffic and location data should be considered necessary” [para 103]. The Court stressed that the regime provides for “no differentiation, limitation or exception according to objectives pursued” [para 105]. The Court did confirm that some measures would be permissible:

… Article 15(1) of Directive 2002/58, read in the light of Articles 7, 8 and 11 and Article 52(1) of the Charter, does not prevent a Member State from adopting legislation permitting, as a preventive measure, the targeted retention of traffic and location data, for the purpose of fighting serious crime, provided that the retention of data is limited, with respect to the categories of data to be retained, the means of communication affected, the persons concerned and the retention period adopted, to what is strictly necessary. [para 108]

It then set down some relevant conditions:

Clear and precise rules “governing the scope and application of such a data retention measure and imposing minimum safeguards, so that the persons whose data has been retained have sufficient guarantees of the effective protection of their personal data against the risk of misuse” [para 109].

while “conditions may vary according to the nature of the measures taken for the purposes of prevention, investigation, detection and prosecution of serious crime, the retention of data must continue nonetheless to meet objective criteria, that establish a connection between the data to be retained and the objective pursued” [110].

The Court then emphasised that there should be objective evidence making it possible to identify a public whose data is likely to reveal a link, even an indirect one, with serious criminal offences, and thereby contribute in one way or another to fighting serious crime or to preventing a serious risk to public security. The Court accepted that geographical factors could be one such ground, on the basis “that there exists, in one or more geographical areas, a high risk of preparation for or commission of such offences” [para 111].

Conversely,

…Article 15(1) of Directive 2002/58, read in the light of Articles 7, 8 and 11 and Article 52(1) of the Charter, must be interpreted as precluding national legislation which, for the purpose of fighting crime, provides for the general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication [para 112].

Acceptability of legislation where (1) the measure is not limited to serious crime; (2) there is no prior review; and (3) there is no requirement that the data stays in the EU.

This next section deals with the first question referred in the Watson case, as well as the Tele2 reference.

As regards the first point, the answer following the Court’s approach at paragraphs 90 and 102 is clear: only measures justified by reference to serious crime would be justifiable. As regards the second element, the Court noted that it is for national law to lay down the conditions of access so as to ensure that the measure does not exceed what is strictly necessary. The conditions must be clear and legally binding. The Court argued that since general access could not be considered strictly necessary, national legislation must set out, by reference to objective criteria, the circumstances in which access would be permissible. Referring to the European Court of Human Rights (ECtHR) judgment in Zakharov, the Court specified:

access can, as a general rule, be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime [para 119].

It then distinguished the general fight against crime from the fight against terrorism to suggest that in the latter case:

access to the data of other persons might also be granted where there is objective evidence from which it can be deduced that that data might, in a specific case, make an effective contribution to combating such activities [para 119].

The conditions set down must be respected. The Court therefore held that, save in cases of genuine emergency, prior review by an independent body must be carried out on the basis of a reasoned request by the investigating bodies. In making this point, the Court referred to the ECtHR judgment in Szabó and Vissy v. Hungary, as well as its own previous ruling in DRI. Furthermore, once doing so no longer posed a danger to the investigation, the individuals affected should be notified, so as to give them the possibility to exercise their right to a remedy as specified in Article 15(2) read with Article 22 of the Data Protection Directive (Directive 95/46).

Article 15(1) permits derogation only in relation to specified provisions in the directive; it does not permit derogation with regard to the security obligations contained in Articles 4(1) and 4(1a). The Court noted the quantity of data as well as its sensitivity to suggest that a high level of security measures would be required on the part of the electronic communications providers. Following this, the Court then stated:

…, the national legislation must make provision for the data to be retained within the European Union and for the irreversible destruction of the data at the end of the data retention period (see, by analogy, in relation to Directive 2006/24, the Digital Rights judgment, paragraphs 66 to 68) [para 122].

The Court noted, as an obligation separate from the approval of access to data, that States should ensure that review of compliance with the required regulatory framework is carried out by an independent body. In the view of the Court, this followed from Article 8(3) EUCFR. This is an essential element of individuals’ ability to make claims in respect of infringements of their data protection rights, as noted previously in DRI and Schrems.

The Court then summarised the outcome of this reasoning, that Article 15 and the EUCFR:

must be interpreted as precluding national legislation governing the protection and security of traffic and location data and, in particular, access of the competent national authorities to the retained data, where the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime, where access is not subject to prior review by a court or an independent administrative authority, and where there is no requirement that the data concerned should be retained within the European Union. [para 125]

Relationship between the EUCFR, EU law and the ECHR

The English Court of Appeal had referred a question about the impact of the ECHR on the scope of the EUCFR in the light of Article 52 EUCFR. While the Court declared the question inadmissible, it – like the Advocate General – took the time to point out that the ECHR is not part of EU law, so the key issue is the scope of the EUCFR; and in any event Article 52(3) does not preclude Union law from providing protection that is more extensive than the ECHR. As a further point, the Court added that Article 8 EUCFR, which provides a separate right to data protection, does not have an exact equivalent in the ECHR and that there is therefore a difference between the two regimes.

Comment

Given the trend of recent case law, the outcome in this case is not surprising.  There are some points that are worth emphasising.

The first relates to the scope of EU law, which is a threshold barrier to any claim based on the EUCFR.  The Advocate General seemed prepared to accept a distinction between the retention of data and the access thereto (although conditions relating to the latter could bear on the proportionality of the former).  The Court took a different approach and held that the access also fell within the scope of the Directive/EU law, because the national regime imposed an obligation on the communications service provider to provide access to the relevant authorities. Given this was an obligation on the service provider, it fell within the regulatory schema.  This approach thus avoids the slightly unconvincing reasoning which the Advocate General adopted.  It also possibly enlarges the scope of EU law.

In general terms, the Court’s reasoning looks at certain provisions of the Privacy and Electronic Communications Directive.  While the reasoning is set in that context, it does not mean that the Court’s interpretation of the requirements deriving from the EUCFR is limited only to this set of surveillance measures.  The rules of interpretation of particularly Articles 7 and 8 could apply more generally – perhaps to PNR data (another form of mass surveillance) – and beyond.  It is also worth noting that according to a leaked Commission document, it is proposed to extend the scope of the Privacy and Electronic Communications Directive to other communications service providers not currently regulated by the directive, but who may be subject to some data retention requirements already.

Whilst the Court makes the point that Articles 7 and 8 EUCFR are separate and different, and that data retention also implicates Article 11 EUCFR, in its analysis of the impact of national measures providing for retention it does not deal with Articles 7 and 8 separately (contrast DRI, where a limited consideration was given to this). Having flagged Article 11 EUCFR, it takes that analysis no further. This leaves questions as to the scope of the rights, and particularly how Article 11 issues play out.

Note that the Court does not state that data retention itself is impermissible; indeed, it specifies circumstances when data retention would be acceptable. It challenges the compatibility of mass data retention with Articles 7 and 8 EUCFR, however, even in the context of the fight against terrorism. In this, it is arguable that the Court has taken a tougher stance than its Advocate General on this point of principle. In this we see a mirror of the approach in DRI, when the Court took a different approach to its Advocate General. In that case too, the Advocate General focussed on safeguards and the quality of law, as has the Advocate General here. For the Court here, differentiation – between people and between types of offences and threats – based on objective, evidenced grounds is central to showing that national measures are proportionate and no more than – in the terms of the directive – strictly necessary. This seems to come close to disagreeing with the Opinion of the Advocate General that in DRI, the Court ‘did not, however, hold that that absence of differentiation meant that such obligations, in themselves, went beyond what was strictly necessary’ (Opinion, para 199). The Advocate General used this point to argue that DRI did not suggest that mass surveillance was per se unlawful (see Opinion, para 205). Certainly, in neither case did the Court expressly hold that mass surveillance was per se unlawful, so the question still remains. What is clear, however, is that the Court supports the retention of data following justified suspicion – even perhaps generalised suspicion – rather than using the analysis of retained data to justify suspicion.

In its reasoning, the Court did not – unlike the Advocate General – specifically rule on whether the safeguards set down in DRI, paras 60-68, should be seen as mandatory, in effect creating a six-point checklist. Nonetheless, it repeatedly cited DRI approvingly. Within this framework, it highlighted specific aspects, such as the need for prior approval; the need for security and control over data; a prohibition on transferring data outside the EU; and the need for subjects to be able to exercise their right to a remedy. Some of these points will be difficult to reconcile with the current regime in the United Kingdom regarding communications data.

It did not, however, touch on acceptable periods for retention (even though it – like its Advocate General – referred to Zakharov). More generally, the Court’s analysis was less detailed and structured than that of the Advocate General, particularly on the meaning of necessity and proportionality. It did not directly address the points the Advocate General made about lawfulness, with specific reference to reliance on codes (an essential feature of the UK arrangements); it did note in passing that the conditions for access to data should be binding within the domestic legal system. Is this implicit agreement with the Advocate General on this point? It certainly agreed with him that the seriousness of the interference meant that the retention of communications data should be restricted to ‘serious crime’ and not just any crime.

One final issue relates to the judicial relationship between Strasbourg and Luxembourg. Despite emphasising that the ECHR is not part of EU law, the Court relies on two recent cases from the ECtHR, perhaps seeking to emphasise the consistency between the two courts in this area – or perhaps seeking to put pressure on Strasbourg to hold the line as it faces a number of state surveillance cases on its own docket, many against the UK. The position of Strasbourg is significant for the UK. While many assume that the UK will maintain the GDPR after Brexit in the interests of ensuring equivalence, it may be that the EUCFR will no longer be applicable in the UK post-Brexit. For UK citizens, the ECHR is then the only route to challenge state intrusion into privacy. For those in the EU, data transfers to the UK post-Brexit could be challenged on the basis that the UK’s law does not provide protection adequate by EU standards. Today’s ruling – and the UK’s response to it, if any – could be a significant element in arguing that issue.

‘I Travel, therefore I Am a Suspect’: an overview of the EU PNR Directive

ORIGINAL PUBLISHED ON EU Immigration and Asylum Law and Policy BLOG

By Niovi Vavoula, Queen Mary University of London

According to the PNR (Passenger Name Record) Directive 2016/681 of 27 April 2016, a series of everyday data of all air passengers (third-country nationals but also EU citizens, including those on intra-Schengen flights) will soon be transferred to specialised units and analysed in order to identify persons of interest in relation to terrorist offences and other serious crimes. This new instrument once again raises the fundamental rights challenges posed by its future operation, particularly in relation to privacy and citizenship rights. The story of the PNR Directive, as described below, is therefore probably not finished, as such concerns open up the possibility of future involvement of the Court of Justice.

1. The story behind the EU PNR System

In the aftermath of 9/11 and under the direct influence of how the terrorist attacks took place, the US legislature established inextricable links between the movement of passengers, ‘border security’ and the effective fight against international terrorism. Strong emphasis was placed on prevention through pre-screening of passengers, cross-checking against national databases and identification of suspicious behaviours through dubious profiling techniques. At the heart of this pre-emptive logic has been the adoption of legislation obliging airlines flying into the US to provide their domestic authorities with a wide range of everyday data on their passengers. These so-called PNR data constitute records of each passenger’s travel arrangements and contain the information necessary for air carriers to manage flight reservations and check-in systems. Under this umbrella definition, a broad array of data may be included: from information on name, passport, means of payment, travel arrangements and contact details to dietary requirements and requests for special assistance. Amidst concerns regarding the compliance of such mechanisms with EU privacy and data protection standards, this model was internalised at EU level through the conclusion of three PNR Agreements with the US – one in 2004, which was struck down by the CJEU in 2006, and others in 2007 and 2012. In addition, PNR Agreements with Canada (currently pending before the CJEU) and Australia have also been adopted.

The idea of developing a similar system to process EU air travel data had been on the agenda for almost a decade, particularly since the EU-US PNR Agreements contain reciprocity clauses referring to the possibility of the EU developing such systems. The first proposal for a Framework Decision dates back to 2007. However, no agreement was reached until the entry into force of the Lisbon Treaty. A revised proposal was released in 2011, essentially mimicking the EU-US PNR model, at least as regards the types of data to be processed and the focus on assessing the risks attached to passengers as a means of preventing terrorist attacks or other serious crimes. In comparison with the proposed Framework Decision it constituted an improvement (for instance, it provided for a reduced retention period and prohibited the processing of sensitive data), yet it was met with great scepticism by a number of EU actors, including the European Data Protection Supervisor, the Fundamental Rights Agency and the Article 29 Working Party, which argued that it failed to respect the principles of necessity and proportionality. Eventually, the proposal was rejected on fundamental rights grounds, but the plenary vote was postponed and the proposal was referred back to the LIBE Committee.

The EU PNR project was brought back to life after the Charlie Hebdo events in January 2015. In the extraordinary JHA Council meeting of 20 November, immediately after the Paris terrorist attacks, the Council reiterated ‘the urgency and priority to finalise an ambitious EU PNR before the end of 2015’. Indeed, on 4 December 2015 a compromise text was agreed. A few days later, the Council confirmed the agreement, but the Parliament did not give its blessing until April 2016, presumably in the light of the negotiations on the Data Protection legislative reforms, which were running in parallel. The fact that the legality of the EU-Canada PNR Agreement was disputed did not affect the course of the negotiations.

2. The EU PNR Directive in a nutshell

The EU PNR Directive places a duty on airline carriers operating international flights between the EU and third countries to forward the PNR data of all passengers (as set out in Annex I) to the Passenger Information Unit (PIU) established at domestic level for this purpose (Article 4). According to Article 2 of the Directive, Member States are given the discretion to extend the regime set out in the Directive to intra-EU flights, or to a selection of them (for a discussion see Council Documents 8016/11 and 9103/11, partly accessible). Perhaps unsurprisingly, all participating Member States have declared their intention to make use of this discretion.

Once transmitted, the data will be stored and analysed by the PIU. The purpose is to ‘identify persons who were previously unsuspected of involvement in terrorism or serious crime’ and who require further examination by the competent authorities in relation to the offences listed in Annex II of the Directive. Contrary to the Commission’s assertions that PNR data will be used in different ways – reactively, pro-actively and in real time – the focus on prevention is central. The analysis entails a risk assessment of all passengers prior to their travel on the basis of predetermined criteria to be decided by the respective PIU, possibly involving cross-checking with existing blacklists (Article 6(3)); a minimal sketch of how such pre-screening might look is given below.
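To make the mechanics of such rule-based pre-screening more concrete, here is a minimal, purely illustrative sketch in Python. The record fields, criteria and watchlist are hypothetical; the Directive does not prescribe any particular implementation, and an automated hit would in any event only flag a passenger for further human examination.

```python
# Illustrative sketch only: hypothetical PIU-style pre-screening of PNR records.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PnrRecord:
    name: str
    payment_method: str        # e.g. "cash" or "card" (hypothetical field)
    booking_lead_days: int     # days between booking and departure (hypothetical)
    itinerary: List[str] = field(default_factory=list)  # airport codes

WATCHLIST = {"J. DOE"}         # hypothetical blacklist supplied by a competent authority

def criteria_hits(rec: PnrRecord) -> List[str]:
    """Return the hypothetical predetermined criteria that the record matches."""
    hits = []
    if rec.payment_method == "cash" and rec.booking_lead_days <= 1:
        hits.append("cash payment on a last-minute booking")
    if len(rec.itinerary) >= 4:
        hits.append("unusually complex routing")
    return hits

def pre_screen(rec: PnrRecord) -> dict:
    """Cross-check against the watchlist and the predetermined criteria.

    A positive result only flags the passenger for individual, non-automated review.
    """
    return {
        "watchlist_hit": rec.name.upper() in WATCHLIST,
        "criteria_hits": criteria_hits(rec),
    }

if __name__ == "__main__":
    rec = PnrRecord("A. Traveller", "cash", 0, ["AMS", "IST", "DXB", "MNL"])
    print(pre_screen(rec))
```

The profiling concerns discussed below stem precisely from the fact that the Directive leaves the choice and weighting of such criteria largely to each PIU.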

Furthermore, the PIUs will respond to requests by national authorities to access the data on a case-by-case basis and subject to sufficient indication (Article 6(2)(b)). Nevertheless, processing should not take place on the basis of sensitive data revealing race, ethnic origin, religion or belief, political or any other opinion, trade union membership, health or sexual orientation (Recital 20). According to Article 12, the initial retention period is six months, after which the PNR data will be depersonalised, meaning that the PIU is entrusted with the task of masking out the names, address and contact information, payment information, frequent flyer information, general remarks and all API data. This process should not be confused with anonymisation, as the data could be re-identified and may still be used for criminal law purposes under ‘very strict and limited conditions’ (Recital 25). Upon expiry of the six-month retention period, disclosure of the full PNR data is therefore permitted only if approved by a judicial authority or another national authority competent to review whether the conditions have been met, and subject to information of, and ex post review by, the Data Protection Officer of the PIU (Articles 12(3) and 5). A sketch of what this masking step might look like follows.
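To illustrate the difference between this masking step and genuine anonymisation, here is a minimal, purely illustrative sketch, again with hypothetical field names: after the six-month period the identifying elements are masked, but the record keeps an internal key and can therefore still be re-personalised under the strict conditions the Directive foresees.

```python
# Illustrative sketch only: hypothetical masking ("depersonalisation") of a stored PNR record.
from datetime import datetime, timedelta

MASKED_FIELDS = [
    "name", "address", "contact_info", "payment_info",
    "frequent_flyer_info", "general_remarks", "api_data",
]

def depersonalise(pnr: dict, received_at: datetime, now: datetime) -> dict:
    """Mask identifying fields once six months have elapsed since receipt.

    The record stays linkable through its internal identifier, so this is
    pseudonymisation rather than anonymisation: re-identification remains possible.
    """
    if now - received_at < timedelta(days=182):   # roughly six months
        return pnr                                # still within the initial period
    masked = dict(pnr)
    for key in MASKED_FIELDS:
        if key in masked:
            masked[key] = "***MASKED***"
    return masked

record = {
    "record_id": "X1",                # internal key, never masked in this sketch
    "name": "A. Traveller",
    "payment_info": "card ending 1234",
    "itinerary": ["AMS", "YUL"],
}
print(depersonalise(record, datetime(2017, 1, 1), datetime(2017, 9, 1)))
```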

3. Privacy and surveillance of movement

The challenges that the development of the EU PNR system poses to the protection of privacy and data protection rights are acute. In essence, as with the PNR Agreements, the Directive allows the systematic, blanket and indiscriminate transfer, storage and further processing of a wide range of personal data of millions of travellers from and to the EU. Drawing from Digital Rights Ireland and the recent Opinion of AG Mengozzi on the EU-Canada PNR Agreement, the interference with the rights to privacy (Article 7 EUCFR and Article 8 ECHR) and data protection (Article 8 EUCFR) is particularly serious. On the basis of the data collected, which include biographic information, credit card details and contact information, law enforcement authorities will be able to compile a rather complete profile of travellers’ private lives.

The involvement of the private sector in the fight against terrorism and serious crime is considerably extended, particularly if one takes into account that the obligations on air carriers are extended to non-carrier economic operators (e.g. travel agencies). In addition, the inclusion of intra-EU flights within the scope of the Directive significantly expands the reach of surveillance. Indeed, back in 2011 it was noted that intra-EU flights represent the largest share of EU flights (42%), followed by international flights (36%), with only 22% of flights operating within a single Member State (Council Document 8016/11). In this framework, the movement of the vast majority of travellers, including EU citizens, is placed under constant monitoring, irrespective of the fact that they are a priori innocent and not suspected of any criminal offence. In fact, the operation of the PNR scheme signifies a reversal of the presumption of innocence, whereby everyone is deemed a potential security risk, thus necessitating their examination in order to confirm or rebut this presumption. Besides, there is no differentiation between flights transporting persons at risk and other flights.

Furthermore, the risk assessment will take place in a highly obscure manner, particularly since the Directive fails to prescribe comprehensively and in detail how the data will be analysed. The underlying rationale is the profiling of all passengers and the identification of behavioural patterns in a probabilistic logic, but nowhere in the Directive is it indicated that this is indeed the case. This lack of clarity raises concerns, considering that the recently adopted Data Protection Directive includes a definition of profiling (Article 3(4)). Moreover, it is stated that ‘relevant databases’ may be consulted; however, it is not clear which databases these are. For instance, routine examination of the databases storing asylum seekers’ fingerprints or visa applicants’ data (Eurodac and VIS respectively) would frustrate their legal frameworks, resulting in a domino effect of multiple function creeps. It may even whet Member States’ appetite for the systematic processing of EU nationals’ personal data in centralised databases in the name of a more ‘efficient’ fight against terrorism.

This ambiguous modus operandi of PIUs may even call into question the extent to which the interference with privacy is ‘in accordance with law’ (Article 8(2) ECHR) or in EU terms ‘provided for by law’ (Article 52(1) EU Charter). According to settled case law of the ECtHR, every piece of legislation should meet the requirements of accessibility and foreseeability as to its effects (Rotaru v Romania). The lack of clear rules as to how the processing of data will take place may suggest that travellers cannot foresee the full extent of the legislation.

Another contested issue is the ambiguous definition of terrorism and serious crimes at EU level. The offences falling under the remit of terrorism are currently being revised, which has led to criticism for lack of clarity, whereas the definition of serious offences (acts punishable by a custodial sentence or detention order of a maximum period of three years or longer) constitutes a relatively low threshold, particularly in those Member States where domestic criminal law allows for potentially long custodial sentences for minor crimes. In addition, as regards the conditions of access by national competent authorities, the requirement that the request must be based on ‘sufficient indication’ seems to fall short of the criteria established in Digital Rights Ireland. The threshold is particularly low and may lead to generalised consultation by law enforcement authorities, and it is uncertain who will check that there is indeed sufficient indication. As for the offences covered by the scope of the Directive, although Annex II sets out a list in this regard, PNR data could still be used for other offences, including minor ones, when these are detected in the course of enforcement action further to the initial processing.

Moreover, in relation to the period for which the data will be retained, it appears that the EU institutions by no means have a clear understanding of what constitutes a proportionate retention period. For instance, the 2007 proposal envisaged an extensive retention period of five years, after which time the data would be depersonalised and kept for another eight years, whereas the 2011 proposal prescribed a significantly reduced initial retention period of 30 days, after which the data would be anonymised and kept for a further period of five years. In its General Approach (Council Document 14740/15), the Council called for an extension of the initial retention period to two years, followed by another three years of storage for depersonalised data. A more privacy-friendly approach can be found in an Opinion of the Council Legal Service dating from 2011, according to which the data of passengers on risky flights would initially be retained for 30 days and then held for an overall period of six months (Council Document 8850/11, in German). Some Member States supported a retention period of less than 30 days (Council Document 11392/11). It is welcome that there are two sets of deadlines and, more importantly, that re-personalisation may take place only under limited circumstances. However, there is no indication of why the chosen retention periods are proportionate. Furthermore, an approach differentiating between flights at risk and flights not at risk, with different retention periods, would seem more balanced.

4. Free movement and citizenship concerns

In addition to the privacy challenges highlighted above, another point of concern is whether the processing of PNR data, including on intra-EU flights, could infringe the free movement rights enjoyed by EU citizens. In this respect, the Commission Legal Service found that the EU PNR does not obstruct free movement (see Council Document 8230/11, which is partially available to the public, although the outcome of the opinion is attested in Council Document 8016/11). Nonetheless, the Parliament managed to include a reference (in Article 4) that any assessments on the basis of PNR data shall not jeopardise the right of entry to the territory of the Member States concerned. The extent to which this reference is sufficient is doubtful.

According to Article 21 of the Schengen Borders Code, police controls performed in the territory of a Member State are allowed insofar as they do not have an effect equivalent to border control. Such an effect is precluded when, inter alia, the checks are carried out on the basis of spot-checks. In Melki, the CJEU found that ‘controls on board an international train or on a toll motorway’ limited in their application to the border region ‘might (…) constitute evidence of the existence of such an equivalent effect’ (para 72). By analogy, controls carried out not as spot-checks but in the systematic manner set out in the Directive could have an effect equivalent to a border check. The lack of any differentiation between flights at risk and flights not at risk (an approach that was also favoured by the Council Legal Service, Council Document 8850/11), and the fact that Member States are left entirely free to determine the extent to which they monitor flights to and from other Member States, could increase the risk of the checks falling into the category of controls with an effect equivalent to border control.

5. Conclusion

The EU PNR Directive is yet another example of how counter-terrorism rhetoric outweighs serious fundamental rights concerns in the name of ensuring security. The storyline is well known: after a terrorist attack, numerous ideas – either incorporated in legislative proposals that had stalled or ultimately too ambitious and controversial to be presented in the first place – feature on the EU agenda. The EU PNR initiative was buried due to privacy concerns and was brought back from the dead when the circumstances matured. Soon national law enforcement authorities will put their hands into the passengers’ data jar and deploy their surveillance techniques on an unprecedented and unpredictable scale.

By internalising US standards, the EU puts the privacy of individuals under threat. The new instrument no longer targets third-country nationals only, but also EU citizens, thus marking the end of an era in which such instruments were used ‘solely’ on foreigners. Undoubtedly, there is a strong ‘momentum’ for justifying mass surveillance practices. While waiting for the ruling on the EU-Canada PNR Agreement, as well as the ruling in Tele2 Sverige (following up on Digital Rights Ireland), one can only hope that the CJEU will uphold its inspiring reasoning and reiterate the important limits placed on deploying surveillance practices, by giving proper weight to the fundamental right to the protection of personal data.

OPINION 1/15: AG MENGOZZI LOOKING FOR A NEW BALANCE IN DATA PROTECTION

ORIGINAL PUBLISHED ON EUROPEAN LAW BLOG (OCTOBER 18, 2016)
By Maxime Lassalle
On 8 September 2016, Advocate General (AG) Mengozzi delivered his much-awaited Opinion on the agreement between Canada and the European Union on the transfer and processing of Passenger Name Record (PNR) data. It follows the European Parliament’s resolution seeking an Opinion from the Court of Justice of the European Union (CJEU) on the compatibility of the agreement with the Treaties. Even though the Opinion concludes that the agreement has many loopholes, it could disappoint those who were expecting a strong condemnation of PNR schemes as such.

This blogpost intends to present the context of this procedure and the main elements of the AG’s opinion before analysing them. The question of the appropriate legal basis for the agreement, also raised by the Parliament, will not be addressed. However, before turning to the AG’s opinion, we need to briefly sketch the background of the proposed agreement.

The context

Today, in the absence of a PNR agreement with the EU, the Canadian authorities apply their own PNR system unilaterally to air carriers established in the European Union (EU) which provide flights to Canada. This means that air carriers have to transfer PNR data (para. 7 of the AG’s opinion) to the extent that they are collected and contained in their automated reservation systems and departure control systems (para. 19). According to the Commission, the adoption of PNR systems is necessary to balance the legitimate requests for PNR data in the fight against terrorism with the need to protect the personal data of EU citizens from abusive access. As a result of the Lisbon Treaty, the adoption of PNR agreements now also requires the consent of the European Parliament (EP) (Article 218(6)(a)(v) of the Treaty on the Functioning of the European Union (TFEU)), and it is no secret that the EP is quite reluctant to adopt data retention schemes.

For a long time the EP has been requesting the Commission to provide evidence that PNR schemes are necessary and, in particular, that the processing of Advance Passenger Information (API) would not be sufficient to reach the same objective of fighting terrorism and serious crime (for example here and here). API data are one of the 19 categories of PNR data and are limited to the identification of travellers (name, date of birth, gender, citizenship, and travel document data), while PNR data encompass a much broader range of information (food habits, seating information etc.).

Nevertheless, the Commission ignored this request for evidence and proposed in 2013 a Council decision on the conclusion of a PNR agreement with Canada. This proposal was seriously criticised by the European Data Protection Supervisor (EDPS), who also questioned the necessity of PNR schemes. Even though in the past the Parliament had, albeit reluctantly, given its consent to similar PNR agreements (see the EU-US Agreement and the EU-Australia Agreement), this time it persisted and on 25 November 2014 decided to refer the proposed agreement with Canada to the CJEU for it to assess its compatibility with the provisions of the TFEU and the Charter. Clearly, this move by the Parliament was inspired by the activism of the CJEU, which had proved to be extremely demanding on the protection of personal data in the framework of the fight against terrorism in its famous Digital Rights Ireland case (DRI, commented on this blog).

The AG’s general considerations on PNR schemes

Let us now have a closer look at the (lengthy) opinion of the AG. Before analysing the agreement, the AG assesses the intrusiveness of PNR schemes as such, in relation to the right to data protection and the right to privacy. PNR data consist of 19 categories of personal data, including data which ‘might provide information concerning, in particular, the health of one or more passengers, their ethnic origin or their religious beliefs’ (para. 169). The processing of these data therefore constitutes an interference which is of a ‘considerable size’ and ‘a not insignificant gravity’ (para. 176). This system is ‘capable of giving the unfortunate impression that all the passengers concerned are transformed into potential suspects’ (para. 176). However, the interference does not reach a level at which the essence of the fundamental rights is harmed, because the PNR data do not permit precise conclusions to be drawn concerning ‘the essence of the private life of the persons concerned’ (para. 186). To justify the interference caused by the processing of PNR data, PNR schemes should be properly provided for by law, such as an EU agreement adopted by the Council and approved by the EP (paras. 191-192), and meet an objective of general interest, namely the objective of combating terrorism and serious transnational crime (para. 194).

The AG’s general considerations on the standard to be applied to this unprecedented case

Following a classical line of reasoning on the assessment of the proportionality of the interference (see for example Schwarz, C‑291/12, para. 53), the AG explains that the proposed agreement ‘must also consist of the measures least harmful […] while making an effective contribution to the public security objective pursued by the agreement envisaged’. Provided that there are alternative measures which would be less intrusive, ‘those alternative measures must also be sufficiently effective’ in order to be considered serious alternatives (para. 208). However, the definition of what is ‘sufficiently effective’ is not given by the previous case law, whether that of the European Court of Human Rights (ECtHR) or that of the CJEU. For the AG, the effectiveness of these alternative measures must ‘be comparable […] in order to attain the public security objective pursued by that agreement’ (para. 208). This standard of comparability is set by the AG himself. This was not self-evident, as he could also have considered that less effective measures are still sufficiently effective. Requiring comparable effectiveness is a first. Usually it is easy to decide whether alternative measures are sufficiently effective or not (see for example Saint-Paul Luxembourg S.A. v. Luxembourg, para. 44); for measures of secret surveillance, it seems more difficult. The comparability criterion may be a way of not addressing a difficult question.

The AG acknowledges the ability of the interference to achieve the public security objective on the basis of statistics communicated by the United Kingdom Government and the Commission concerning the Canadian authorities’ best practices (para. 205). Between April 2014 and March 2015, thanks to PNR data, 9,500 targets were identified; among them, 1,765 persons were subjected to more thorough checks and 178 were arrested for a serious transnational criminal offence, connected in particular with drug trafficking (para. 262). However, the AG does not take into account that the statistics presented to the Court do not indicate the amount of data that was necessary to identify these targets. Moreover, one could note that, according to the statistics, no terrorist was identified, which is quite surprising for a scheme whose main purpose is precisely to identify people related to terrorism. The AG was evidently satisfied with the fact that PNR schemes are effective against organised crime.

The AG goes on to address the specificity of PNR schemes, namely that it is in their very nature to be based on profiling methods, through a comparison of the PNR data with scenarios or predetermined assessment criteria, and that PNR data processing can lead to ‘false positive “targets” being identified’ (para. 255). This specificity of PNR schemes, which had never been assessed by the CJEU, made it necessary for the AG to detail the conditions under which PNR schemes could be considered proportionate. In order to do so, he suggests adapting a standard used by the ECtHR in Zakharov v. Russia, namely the standard of ‘reasonable suspicion’. For the AG, these procedures should manage to target ‘individuals who might be under a “reasonable suspicion” of participating in terrorism or serious transnational crime’ (para. 256). The application of this standard is ambitious. Indeed, Judge Pinto de Albuquerque, in his separate opinion in Szabó and Vissy v. Hungary, had feared that this standard would be replaced by an ‘individual suspicion’, a lower standard, for surveillance measures whose purpose is to fight terrorism. However, this standard is used to limit access to personal data by law enforcement authorities (an idea also present in the DRI case, paras. 60-62). And yet the purpose of PNR schemes is not to create a pool of information available under strict conditions to law enforcement authorities, but to allow the Canadian competent authority, namely the Canada Border Services Agency, to use data mining procedures in order to discover new persons who were not previously suspected. Hence, the application of the ‘reasonable suspicion’ standard seems impossible as such: limiting access to the data is not compatible with the idea, accepted by the AG, that PNR schemes need to process all the data that are available. The AG nevertheless tries to adapt the standard by proposing three principles.

The first principle is that the assessment criteria used to analyse the PNR data should not ‘be based on an individual’s racial or ethnic origin, his political opinions, his religion or philosophical beliefs, his membership of a trade union, his health or his sexual orientation’ (para. 258). The AG evidently fears discriminatory measures based on the processing of PNR data. The second principle, which is in line with the new principles laid down in Directive 2016/680 (i.e. the new Directive on data protection for the police and criminal justice sector), is that the result of the automatic processing of data must be examined by non-automatic means (para. 259). The third principle is that the functioning of the automatic means should be checked regularly by an independent public authority (para. 259).

The AG’s proportionality test

After these general considerations, the AG starts his proportionality test. In the opinion nine points are considered separately (para. 210). From this analysis, three main elements deserve to be emphasized.

The first important point is that the AG accepts PNR schemes as a matter of principle. He considers that, sensitive data excluded, all categories of PNR data are relevant for the purpose of the envisaged agreement. Sensitive data are defined in Article 2(e) of the envisaged agreement as ‘information that reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, or information about a person’s health or sex life’. The processing of sensitive data is allowed by the envisaged agreement but, for the AG, this is not acceptable as it creates a risk of stigmatisation (para. 222). What is more, the fact that these data are excluded from the PNR agreement with Australia shows that the transfer of sensitive data is not necessary to pursue the objective of the scheme (para. 222). This assessment by the AG is a direct consequence of the first of the three principles he established.

Still on the categories of data, the opinion brushes aside the criticism of both the EP and the Article 29 Data Protection Working Party, which requested evidence that the transfer of less data, for example only API data, would not be sufficient to meet the objective of the proposed agreement. According to the AG, ‘data of that type does not reveal information about the booking methods, payment methods used and travel habits, the cross-checking of which can be useful for the purposes of combating terrorism and other serious transnational criminal activities. Independently of the methods used to process that data, the API data […] are therefore not sufficient to attain with comparable effectiveness the public security objective pursued by the agreement envisaged’ (para. 214).

Even though all these data are transferred to the Canadian authority irrespective of any indication that the persons concerned may have a connection with terrorism or serious transnational crime (para. 215), the purpose of PNR schemes is to identify persons ‘not known to the law enforcement services who may nonetheless present an “interest” or a risk to public security’ (para. 216). For the AG, bulk transfers of data are therefore necessary. However, he considers the definition of certain categories of data to be too vague. For example, heading 17 of the annex, on ‘general remarks’, covers all ‘supplementary information apart from that listed elsewhere in the annex to the agreement envisaged’ (para. 217). Consequently, it is likely that air carriers will transfer all the data that they hold, and not only the data that are necessary for the Canadian authorities (para. 220).

In addition, the AG’s opinion considers that the scope ratione personae of the agreement envisaged is not too broad and that the massive and indiscriminate transfer of personal data is necessary. While, in theory, it might be possible to imagine a PNR data transfer system which distinguishes passengers according to specific criteria, such a system would never be as effective as PNR data schemes in combating terrorism and serious transnational crime (para. 243). The AG also underlines that consumers of commercial flights voluntarily use a mode of transportation ‘which is itself, repeatedly, unfortunately, a vehicle or a victim of terrorism or serious transnational crime, which requires the adoption of measures ensuring a high level of security for all passengers’ (para. 242).

These first considerations are very important as they show that in principle, for the AG, massive transfer and processing of PNR data is not disproportionate as such. If the undifferentiated and general nature of the retention of the data of any person using electronic communications in the Union was one of the main reasons why Directive 2006/24/EC was considered as going beyond what was strictly necessary (para. 59 of the DRI case), such data retention schemes are possible as long as they respect strict conditions (see the opinion of AG Saugmandsgaard Øe on the joined cases Tele2 Sverige AB and Secretary of State for the Home Department, commented on this blog). The fact that AG Mengozzi accepts the principle of large scale transfer of PNR data is thus not so surprising.

Once this step was taken, and given the specificity of the case, he needed to set out the specific conditions under which PNR schemes are proportionate. In addition to the loopholes already explained, these conditions are further elaborated in the two remaining important points of the opinion.

The second important point is that the agreement envisaged should justify the duration of data retention. The AG regrets that the agreement envisaged ‘does not indicate the objective reasons that led the contracting parties to increase the PNR data retention period to a maximum of five years’ (para. 279). He adds that such a long period of retention of the data exceeds what is necessary, particularly because all the data are retained for the same duration (para. 284) and because the masking procedure is incomplete and does not fully ensure the depersonalization of the data (para. 287).

This point is significant, as it is the only element in the AG’s opinion which is very critical of PNR schemes in general and which puts the PNR Directive at risk. This question was also a key issue in the DRI case. In Directive 2006/24/EC, the data retention period of a maximum of two years, without distinguishing categories of data on the basis of their usefulness, was not based on objective criteria and was therefore excessive (para. 64 of the DRI case). This threatens the validity of the PNR Directive. Indeed, Article 12(1) of this Directive provides for a duration of five years, without distinguishing categories of data or explaining the reasons for such a long retention period. Notably, its depersonalisation procedure seems more in line with the assessment of the AG, particularly because more data elements are masked (Article 12(2) of the Directive, para. 287 of the AG opinion).

The last important point relates to the serious doubt of the AG concerning the level of protection granted by Canada. The opinion is indeed the most critical when it comes to the international nature of the agreement. This is not that surprising given that the Court recently adopted a very demanding position on bulk transfers of data to third countries (in the case Schrems, commented on this blog here). The AG acknowledges that the Court ‘cannot express a view on the legislation or the practice of a third country’ (para. 163). However, the terms of the agreement themselves should have been formulated in such a way that no discretion would be left to Canadian authorities as for the applicable level of protection (para. 164).

For the AG, the access to and use of the transferred data by the Canadian authorities is not sufficiently regulated in the envisaged agreement. It leaves Canada entire discretion to determine which officials and which competent authorities are allowed to access the data (paras. 250 and 267). Similarly, the envisaged agreement does not adhere to a strict principle of purpose limitation, as the processing of PNR data is not strictly limited to the fight against terrorism and serious crime (paras. 236-237). This is aggravated by the fact that the offences which belong to the categories of terrorism and serious crime are not exhaustively listed (para. 235). Concerning the use of the data, the AG considers that the possibilities of disclosure and subsequent transfer of the PNR data are not sufficiently framed. Indeed, Articles 18 and 19 of the agreement envisaged allow the disclosure and subsequent transfer of the PNR data to other government authorities in Canada and could be used to circumvent the level of protection afforded in the EU (para. 296). As a matter of fact, no independent authority or judge would check the Canadian competent authority’s assessment that the authority to which the data are transferred can afford an equivalent level of protection (para. 300). The AG concludes that all these points need to be set out in more detail in the agreement in order to make sure that the level of protection of data ensured in Canada is equivalent to the level of protection ensured in the European Union. Following the previous case law of the Court, particularly the DRI case, the level of protection ensured in the EU is quite demanding, and respect for the same level of protection has to be ensured before personal data are transferred to third countries (see in particular para. 96 of Schrems).

Finally, the AG points out that the mechanism for the detection and review of any violations of the rules of the agreement envisaged affording protection of passengers’ privacy and personal data is not effective, because it is not entrusted to a fully independent and impartial supervisory authority (para. 315). This last point reminds the Commission that the mechanisms of control in the third country must be ensured by a sufficiently independent body. The reminder is interesting, as the new ‘privacy shield’ replacing the safe harbour is criticised for providing a right to review only through an ombudsman whose independence and powers are questionable.

Some comments

In his reasoning, the AG addresses issues linked to the very nature of PNR schemes, and the solutions he proposes do not threaten the principle of PNR schemes. Even though this opinion may at first seem disappointing for those who were expecting the AG to condemn PNR schemes, it appears that this ‘implicit acceptance’ of PNR schemes follows the general principles developed by the Court, while addressing the new issues that had not yet been dealt with and giving more consideration to the need to provide effective tools to fight terrorism and serious crime.

Even though many questions had to be addressed by the AG, one is of paramount importance. Ever since its DRI case, the Court has developed a strong focus on the guarantees governing access to personal data by law enforcement authorities, and the AG had to adapt the requirements of the Court to PNR schemes. The AG’s attempt to adapt the ‘reasonable suspicion’ standard shows that the applicability of such guarantees to law enforcement authorities’ access to data from different data retention schemes is a question that deserves more attention. Generally speaking, the ECtHR considers that, to assess the existence of a reasonable suspicion, it is necessary to check ‘whether there are factual indications for suspecting that person of planning, committing or having committed criminal acts or other acts that may give rise to secret surveillance measures, such as, for example, acts endangering national security’ (para. 260 of Zakharov v. Russia). The problem with PNR schemes is that the suspicion is not prior to the collection and processing of PNR data, but is discovered as a result of that collection and processing.

This question differs from those the Court has previously addressed in its case law, in particular in the DRI case. However, such an issue also exists in other areas. For instance, under the European system for the prevention of money laundering and terrorist financing, financial institutions have to monitor the transactions of all their clients and have the duty to report suspicious transactions. The monitoring of suspicious transactions by these financial institutions also relies on mechanisms of data mining. The processing of personal data is carried out by private parties, namely financial institutions. Law enforcement authorities can in theory only obtain these data once financial institutions have reported a suspicion (this is, however, something that the Commission would like to change in order to facilitate access to the data for the Financial Intelligence Units, see its proposal). Consequently, only the financial institutions, which in any event collect these data for the purposes of their economic activities and are subject to the data protection framework provided for by Directive 95/46/EC, can access these data. This appears to be a safeguard against abusive access by law enforcement authorities. As a matter of fact, when law enforcement authorities access the personal data, after a report from a financial institution, there is already a degree of suspicion. This is probably more in line with the standard of ‘reasonable suspicion’. However, in this field too, there is a massive collection of personal data which are analysed mainly through data mining procedures in order to discover suspicious transactions.

For PNR data, under the agreement with Canada as well as under the new PNR Directive, air carriers do not have to analyse the data themselves, but have to transfer all the data to the Canada Border Services Agency or to the new ‘Passenger Information Units’ respectively, which will analyse all these data through data mining procedures. From this data processing, suspicions will then emerge which will be further analysed by law enforcement authorities.

These two examples show that personal data are not only used a posteriori, once criminal investigations are opened and a suspicion already exists, but are also used in data mining processes with the purpose of discovering new suspicions. It might be that there is a difference based on whether private parties or public authorities are in charge of the data mining procedures. However, in both cases there is no prior ‘reasonable suspicion’; suspicions emerge following a massive monitoring of personal data.

At the end of the day, once the principle of massive surveillance schemes based on data mining mechanisms is considered acceptable as such, the ‘reasonable suspicion’ standard is overtaken and has to be replaced by principles and other guarantees preventing any abuse, provided that this is possible. Are the three principles proposed by the AG sufficient? Hopefully the Court will address this key issue in a clear and detailed way.
