AUTHORS: Dr Gloria González Fuster (Research Professor at the Vrije Universiteit Brussel, VUB) and Dr Amandine Scherrer (European Studies Coordinator and Associate Researcher at the Centre d’Etudes sur les Conflits, Liberté et Sécurité, CCLS)
EXECUTIVE SUMMARY
EU citizens and residents and, more generally, all individuals entitled to protection as ‘data subjects’ under EU law are directly affected by EU strategies in the field of Big Data. Indeed, the data-driven economy poses significant challenges to the EU Charter of Fundamental Rights, notably in the fields of privacy and personal data protection.
Big Data refers to the exponential growth in both the availability and the automated use of information. Big Data comes from gigantic digital datasets held by corporations, governments and other large organisations, which are extensively analysed through computer algorithms (hence the name ‘data analytics’). Big Data has numerous applications in various sectors, including healthcare, mobile communications, smart grids, traffic management, fraud detection, and marketing and retail (both on- and offline). The notion, primarily driven by economic concerns, has been largely promoted through market-led strategies and policies. Presented as an enabler of powerful analytical and predictive tools, the concept of Big Data has also attracted numerous criticisms emphasising risks such as biased information, spurious correlations (associations that appear statistically strong but arise purely by chance) and statistical discrimination. Moreover, the promotion of Big Data as an economic driver raises significant challenges for privacy and digital rights in general. These challenges are even greater in a digital ecosystem marked by a proliferation of cheap sensors, numerous apps on mobile devices and an increasingly connected world that sometimes does not even require human intervention (as shown by the growing development of the Internet of Things [IoT]). The flows of information on- and offline, shared and multiplied across computers, mobile devices, watches, smart bands, glasses, etc., have dramatically increased the availability, storage, extraction and processing of data on a large scale. It has become increasingly difficult to track what is done with our data, a situation further complicated by the wide variety of actors engaged in data collection and processing.
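To illustrate the spurious correlation risk mentioned above, the short sketch below (not part of the study; the variable counts, names and use of Python/NumPy are illustrative assumptions) shows how analysing many unrelated variables together makes some pairs appear strongly correlated purely by chance.

```python
# Minimal sketch: with enough unrelated variables, some pairs will look
# strongly correlated purely by chance -- the "spurious correlation" risk
# noted above. All quantities here are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=0)
n_observations = 50    # e.g. 50 weekly data points
n_variables = 200      # e.g. 200 unrelated behavioural indicators

# Independent random data: no real relationship exists between any columns.
data = rng.normal(size=(n_observations, n_variables))

# Pairwise Pearson correlations between all variables (columns).
corr = np.corrcoef(data, rowvar=False)

# Count variable pairs whose correlation exceeds 0.4 in absolute value.
upper = np.triu_indices(n_variables, k=1)
strong = np.abs(corr[upper]) > 0.4
print(f"{strong.sum()} of {strong.size} unrelated pairs look 'strongly' correlated")
```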
The numerous debates triggered by the increased collection and processing of personal data for various – and often unaccountable – purposes are particularly lively at the EU level. Two interlinked, and to some extent conflicting, initiatives are relevant here: the development of EU strategies promoting a data-driven economy and the ongoing reform of the EU personal data protection legal framework, in the context of the adoption of a General Data Protection Regulation (GDPR).
In order to address the issues at stake, the present Study provides an overview of Big Data and smart devices, outlining their technical components and uses (Section 2). This section shows that many contemporary data processing activities are characterised by a high degree of opacity. This opacity directly affects the ability of individuals to know how data collected about them are used; it also hinders their capacity to assess and trust the manner in which choices are (automatically) made – whether, in other words, these choices are appropriate or fair. As regards smart devices, cheap sensors and the IoT, the pervasiveness of sensors and the extensive, routine production of data might not be fully understood by individuals, who may be unaware of the presence of sensors, of the full spectrum of data they produce, and of the data processing operations applied to this diverse data. While Big Data, smart devices and the IoT are often promoted as key enablers of market predictions and economic and social dynamics, data processing raises the question of who controls one’s data.
Against this background, Section 3 presents the different EU approaches to the digital economy and the questions they raise in terms of privacy and personal data protection. This section argues that, in the current context of the development of a Digital Single Market for Europe (DSM), the European Commission’s perspective is very much commercially and economically driven, with little attention to the key legal and social challenges regarding privacy and personal data protection. Even though the European Commission points out some of the key challenges of processing data for economic and market purposes (i.e. anonymisation, compatibility, minimisation), the complexity of these challenges is somewhat underestimated. These challenges can be grouped around the following questions, which any digital citizen may ask her/himself under EU law: which data about me are collected, and for what purposes? Are these data protected from unauthorised access, and to what extent is control exercised over the processing of my personal data?
Section 4 then considers these questions in the specific context of the Data Protection Reform package. Arguing that the digital citizen’s rights should be the main focus of the current debates around the GDPR, this Section underlines that Big Data, smart devices and the IoT reveal a series of potential gaps in the EU legal framework, in the following areas in particular: transparency and information obligations of data controllers; consent (including consent in case of repurposing); the need to balance public interest and the interests of data subjects for legitimising personal data processing; the regulation of profiling; and proper safeguarding of digital rights in case of data transfers to third parties and third countries.
In light of these findings, the Study concludes with key recommendations for the European Parliament and, in particular, the LIBE Committee, which is responsible for the protection of natural persons with regard to the processing of personal data. These recommendations aim to ensure that negotiations around the GDPR promote a strong and sustainable framework of transparency and responsibility in which the data subject’s rights are central.
In particular, any exploitation of personal data should be guided by the requirement to guarantee respect for the Fundamental Rights (privacy and personal data protection) laid down in EU primary and secondary law (recommendations 1 & 2).
The role of data controllers is central in this perspective, as they are legally required to observe a number of principles when they process personal data, and compliance with these principles must be reinforced. The degree of information and awareness of data subjects must be of prime concern whenever personal data processing takes place, and the responsibility for protecting Fundamental Rights should be promoted along the data production chain and shared among the various stakeholders involved. Furthermore, the GDPR should ensure that individuals are granted complete and effective protection in the face of current and upcoming technological developments in Big Data and smart devices (recommendation 3).
The GDPR currently under discussion should in any case offer no less protection and no fewer guarantees than the 1995 Data Protection Directive, and users should remain in complete control of their personal data throughout the data lifecycle.
Finally, effective protection of individuals cannot be guaranteed solely by the adoption of a sound GDPR. It will also require a consistent review of the e-Privacy Directive (recommendation 4), an instrument that not only safeguards the right to personal data protection but also, more generally, aims to ensure the right to respect for private life.