The European legal framework on hate speech, blasphemy and its interaction with freedom of expression

Nota Bene: At the request of the European Parliament LIBE committee, this study provides an overview of the legal framework applicable to hate speech and hate crime on the one hand, and to blasphemy and religious insult on the other. Link to the full study (446 pages).


Hate speech and hate crime incidents, including those committed online, are on the rise in Europe1, despite the existence of a robust legal framework. This study provides an overview of the legal framework applicable to hate speech and hate crime, as well as to blasphemy and religious insult. It also evaluates the effectiveness of existing legislation in selected Member States and explores opportunities to strengthen the current EU legal framework, whilst fully respecting the fundamental rights of freedom of expression and freedom of thought, conscience and religion. The study also provides the European Parliament with guidelines on dealing with hate speech within the EU institutions.

Legal framework on hate speech and hate crime

At the EU level the legal framework includes inter alia: Council Framework Decision 2008/913/JHA (CFD)2 (requiring Member States to penalise the most severe forms of hate speech and hate crime); and the Audiovisual Media Services (AMSD)3 and Electronic Commerce Directives (ECD)4 (controlling racist and xenophobic behaviours in the media and over the internet). It is important to view the EU measures aimed at addressing racism and xenophobia in the context of the broader EU legislative framework. Instruments aimed at supporting victims of crime and antidiscrimination measures are of particular relevance in this respect. These include Directive 2012/29/EU5 (Victims’ Support Directive) and the EU’s equality and anti-discrimination legislation (e.g. Directive 2000/43/EC6 (the Racial Equality Directive)). The Racial Equality Directive is complemented by other antidiscrimination legislative instruments such as Directive 2000/78/EC7 (the Employment Equality Directive) and Directives 2004/113/EC and 2006/54/EC8 (the Equal Treatment Directives). The EU also provides its support in practice by financing projects aimed inter alia at fighting hate speech and hate crime (for example under the Europe for Citizens Programme 2014-20209 or the Rights, Equality and Citizenship Programme 2014-202010).

The current study, developed on the basis of information gathered through seven national studies (Belgium, Germany, Greece, France, Hungary, the Netherlands and Sweden), has revealed some major drawbacks of the current legal framework applicable to hate speech and hate crime:

Shortcomings related to the transposition of the CFD include its incomplete transposition. Gaps in transposition mainly arise in connection with Articles 1(1)(c) and 1(1)(d) of the CFD, which require the penalisation of the condoning, denial or gross trivialisation of genocide, crimes against humanity and war crimes, and of Nazi crimes, respectively. To ensure effective protection against the most severe forms of hate speech and hate crime, it is recommended that the European Commission (EC) initiate infringement proceedings against Member States failing to transpose the CFD. Another issue derives from the transposition of the protected characteristics (grounds upon which hate speech and hate crime are prohibited) set out in the CFD, the AMSD and the ECD. As a general rule, Member States’ legislation refers to characteristics beyond those required by the CFD, the AMSD and the ECD. Member States have not taken a harmonised approach in this respect, so the list of protected characteristics varies from Member State to Member State. An ambitious review of existing EU law might therefore be necessary.

The use in practice of the CFD, the AMSD and the ECD is hindered by similar factors. Member States fail to collect sufficient reliable data on hate speech and hate crime incidents, which hinders the monitoring and assessment of the scale of the problem. This mainly results from the fact that data-collection competences are often divided among several authorities whose efforts are not harmonised. To overcome the existing data gap, Member States with less developed or less harmonised data collection methods could be encouraged to learn from Member States with good practices in place. The underreporting of hate speech and hate crime incidents by victims also hinders the understanding of the scale of the problem. Member States could be encouraged to raise awareness of the means of reporting incidents, or to facilitate reporting through alternative channels, such as anonymously, via the internet or through victim support organisations.

The absence of a shared understanding among practitioners of the applicable legal provisions seems to be an issue across the globe. The provision of clear guidance to practitioners, for example through awareness-raising materials or training programmes, is therefore needed. These tools should provide practitioners with the skills necessary to duly investigate, prosecute and adjudicate hate speech and hate crime incidents.

In addition, applicable rules often fail to cover the liability of operators for the publication of hate content by bloggers or users of social media sites. The liability of bloggers and website users is often regulated; however, these individuals are sometimes difficult to trace, and it is often difficult to prove their motivation. This is a matter of concern given that the internet remains a critical tool for the distribution of racist and hateful propaganda. To overcome the potential impunity of offenders, it is recommended to regulate the liability of operators, thereby encouraging them to better control the content of blogs and social media websites. Alternatively, Member States could reinforce their efforts to monitor the content of websites. This, however, should be done in a manner ensuring sufficient respect for freedom of expression.

In most Member States, no concerns have arisen regarding the unnecessary limitation of freedom of expression by hate speech legislation, or vice versa. France constitutes an exception in this respect: debates over the borderline between the protection of human dignity and freedom of expression were recently reignited when the French Government announced its new campaign against online hate speech. Some considered the French measures too restrictive of freedom of expression11. Guidance on where the borderline between the two fundamental rights stands is found in the case law of the European Court of Human Rights (ECtHR). The ECtHR has ruled that in a democratic society, which is based on pluralism, tolerance and broadmindedness, freedom of expression should be seen as a right extending also to information and ideas that might offend, shock or disturb others. Any limitation of freedom of expression must be proportionate to the legitimate aim pursued12. Member States could also be encouraged to sign and ratify the Council of Europe’s (CoE) Additional Protocol to the Convention on Cybercrime13, which gives due consideration to freedom of expression while requiring the criminalisation of racist and xenophobic acts committed online.

Finally, the absence of one comprehensive policy dealing with hate speech and hate crime is itself a matter that should be addressed, for instance through the adoption of a comprehensive strategy for fighting hate speech and hate crime. The Strategy could define concrete policy goals for the Member States, targeting the most severe forms of hate speech and hate crime, including online crime. These policy goals could be set in light of the most important factors hindering the application of hate speech and hate crime legislation in practice. These factors, as explained in detail above, include inter alia the insufficient transposition of applicable rules, practitioners’ inadequate knowledge of the rules applicable to hate speech and hate crime, insufficient data collection mechanisms and severe underreporting. The Strategy should ensure sufficient respect for freedom of expression and acknowledge that hate speech and hate crime are present in all areas of life (e.g. politics, media, employment).

Legal framework on blasphemy and religious insult

EP Study: Big Data and smart devices and their impact on privacy

AUTHORS: Dr Gloria González Fuster (Research Professor at the Vrije Universiteit Brussel (VUB)) and Dr Amandine Scherrer (European Studies Coordinator and Associate Researcher at the Centre d’Etudes sur les Conflits, Liberté et Sécurité (CCLS))


EU citizens and residents and, more generally, all individuals protected as ‘data subjects’ under EU law are directly affected by EU strategies in the field of Big Data. Indeed, the data-driven economy poses significant challenges to the EU Charter of Fundamental Rights, notably in the fields of privacy and personal data protection.

Big Data refers to the exponential growth both in the availability and in the automated use of information. Big Data comes from gigantic digital datasets held by corporations, governments and other large organisations, which are extensively analysed (hence the name ‘data analytics’) through computer algorithms. There are numerous applications of Big Data in various sectors, including healthcare, mobile communications, smart grids, traffic management, fraud detection, and marketing and retail (both online and offline). The notion, primarily driven by economic concerns, has been largely promoted through market-led strategies and policies. Presented as an enabler of powerful analytical and predictive tools, the concept of Big Data has also attracted numerous criticisms emphasising such risks as biased information, spurious correlations (associations that are statistically robust but happen only by chance), and statistical discrimination. Moreover, the promotion of Big Data as an economic driver raises significant challenges for privacy and digital rights in general. These challenges are even greater in a digital ecosystem marked by a proliferation of cheap sensors, numerous apps on mobile devices and an increasingly connected world that sometimes does not even require human intervention (as illustrated by the growing development of the Internet of Things (IoT)). The flows of information online and offline, shared and multiplied across computers, mobile devices, watches, SmartBands, glasses, etc., have dramatically increased the availability, storage, extraction and processing of data on a large scale. It has become increasingly difficult to track what is done with our data. This situation is complicated further by the wide variety of actors engaged in data collection and processing.

The numerous debates triggered by the increased collection and processing of personal data for various – and often unaccountable – purposes are particularly vivid at the EU level. Two interlinked, and to some extent conflicting, initiatives are relevant here: the development of EU strategies promoting a data-driven economy and the current reform of the EU personal data protection legal framework, in the context of the adoption of a General Data Protection Regulation (GDPR).

In order to address the issues at stake, the present Study provides an overview of Big Data and smart devices, outlining their technical components and uses (section 2). This section shows that many contemporary data processing activities are characterised by a high degree of opacity. This opacity directly affects the ability of individuals to know how data collected about them is used; it also hinders their capacity to assess and trust the manner in which choices are (automatically) made – whether, in other words, these choices are appropriate or fair. As regards smart devices, cheap sensors and the IoT, the pervasiveness of sensors and the extensive routine production of data might not be fully understood by individuals, who may be unaware of the presence of sensors, of the full spectrum of data they produce, and of the data processing operations treating this diverse data. While Big Data, smart devices and the IoT are often promoted as key enablers of market predictions and economic/social dynamics, data processing raises the question of who controls one’s data.

From this perspective, Section 3 presents the different EU approaches to the digital economy and the questions they raise in terms of privacy and personal data protection. This section argues that in the current context of the development of a Digital Single Market for Europe (DSM), the European Commission’s perspective is very much commercially and economically driven, with little attention to the key legal and social challenges regarding privacy and personal data protection. Even though the European Commission points out some of the key challenges of processing data for economic and market purposes (i.e., anonymisation, compatibility, minimisation), the complexity of these challenges is somewhat underestimated. These challenges can be grouped around the following questions that any digital citizen may ask her/himself under EU law: which data about me are collected and for what purposes? Are these data protected from unauthorised access, and to what extent is control exercised over the processing of my personal data?

Section 4 then considers these questions in the specific context of the Data Protection Reform package. Arguing that digital citizens’ rights should be the main focus of the current debates around the GDPR, this Section underlines that Big Data, smart devices and the IoT reveal a series of potential gaps in the EU legal framework, in the following areas in particular: transparency and information obligations of data controllers; consent (including consent in case of repurposing); the need to balance public interest and the interests of data subjects when legitimising personal data processing; the regulation of profiling; and the proper safeguarding of digital rights in case of data transfers to third parties and third countries.

In light of these findings, the Study concludes with key recommendations for the European Parliament and, in particular, the LIBE Committee, responsible for the protection of natural persons with regard to the processing of personal data. These recommendations aim at ensuring that negotiations around the GDPR promote a strong and sustainable framework of transparency and responsibility in which the data subject’s rights are central.

In particular, any exploitation of personal data should be guided by the requirement of guaranteeing respect for the fundamental rights (privacy and personal data protection) laid down in EU primary and secondary law (recommendations 1 & 2).

The role of data controllers is central in this perspective, as they are legally required to observe a number of principles when they process personal data, compliance with which must be reinforced. The degree of information and awareness of data subjects must be of prime concern whenever personal data processing takes place, and the responsibility for protecting fundamental rights should be promoted along the data production chain and involve various stakeholders. Furthermore, the GDPR should ensure that individuals are granted complete and effective protection in the face of current and upcoming technological developments in Big Data and smart devices (recommendation 3).

The GDPR currently under discussion should in any case not offer less protection and guarantees than the 1995 Data Protection Directive, and users should remain in complete control of their personal data throughout the data lifecycle.

Finally, effective protection of individuals cannot be guaranteed solely by the adoption of a sound GDPR. It will also require a consistent review of the e-Privacy Directive (recommendation 4), an instrument that not only pursues the safeguarding of personal data protection but, more generally, aims to ensure this right and the right to respect for private life.

How the EU “legislative triangle” is becoming a “Bermuda triangle”…

by Emilio De Capitani

According to several scholars, the Lisbon Treaty has strengthened the implementation of the democratic principle in the EU, as well as the framework for participative democracy. In theory, with the entry into force of the Charter, the EU has become more accountable to its citizens, and there has been a clear improvement of the legal framework for EU legislative and non-legislative activity. Even if not perfectly sound, there is now a clear definition of what should be considered of a “legislative” nature, and a clear obligation (at primary law level) to debate legislation publicly in both the Council and the European Parliament.

Needless to say, the latter has for years been the champion of legislative and administrative transparency, not only in the citizens’ interest but also with a view to defining its own margin of manoeuvre in negotiations with the Council. This attitude of the EP was not particularly appreciated by the Council and the Commission when, in 2001, before Lisbon, the three institutions negotiated the first EU legislation in this domain (Regulation 1049/2001). At the time, however, it was easy to argue that time was needed to promote open debates and votes in the Council and in the Commission, because this would have required a change of culture in an institution mainly structured as a bureaucratic machinery (the Commission) and in another shaped by a diplomatic approach (the Council).

Five years after Lisbon, is such a change of culture under way in the Council and the Commission, or is it rather the other way round for the EP?

Have a look at the exchange of messages below and form your own opinion. The issue is still pending but may well have some interesting developments…



by Lorna Woods, Professor of Media Law, University of Essex

When can freedom of expression online be curtailed? The recent judgment of the Grand Chamber of the European Court of Human Rights in Delfi v. Estonia has addressed this issue, in the particular context of comments made upon a news article. This ruling raises interesting questions of both human rights and EU law, and I will examine both in turn.

The Facts

Delfi is one of the largest news portals in Estonia. Readers may comment on the news story, although Delfi has a policy to limit unlawful content, and operates a filter as well as a notice and take down system. Delfi ran a story concerning ice bridges, accepted as well-balanced, which generated an above average number of responses. Some of these contained offensive material, including threats directed against an individual known as L.

Some weeks later L requested that some 20 comments be deleted and damages be paid. Delfi removed the offending comments the same day, but refused to pay damages. The matter then went to court and eventually L was awarded damages, though of a substantially smaller amount than L originally claimed. Delfi’s claim to be a neutral intermediary, and therefore immune from liability under the EU’s e-Commerce Directive regime, was rejected. The news organisation brought the matter to the European Court of Human Rights and lost the case in a unanimous chamber decision. It then brought the matter before the Grand Chamber.

The Grand Chamber Decision

Europe and “Whistleblowers”: still a bumpy road…

by Claire Perinaud (FREE Group trainee)

On 9 and 10 April, the University Paris X Nanterre la Défense, in collaboration with the University Paris I Sorbonne, organised in Paris a conference on “whistleblowers and fundamental rights”[1], which echoed a rising debate on the figure of whistleblowers after the numerous revelations of scandals and corruption in recent years, some of them directly linked to EU institutions. In the following lines I will try to sketch (a) the general framework and then (b) the main issues raised during the conference.

A) The general framework 

The term “whistle-blower” was coined by Ralph Nader in 1970 in the context of the need to defend citizens from lobbies. He defined “whistle-blowing” as “an act of a man or woman who, believing that the public interest overrides the interest of the organization he serves, blows the whistle that the organization is involved in corrupt, illegal, fraudulent or harmful activity”[2]. The interest of scholars and lawyers in the figure of whistle-blowers in the United States dates back to the adoption by Congress in 1863 of the False Claims Act, which is deemed to be the first legislation related to the right of alert[3].
The system which developed afterwards is notably based on the idea that whistle-blowing is a strong mechanism to fight corruption and has to be encouraged by means of financial incentives[4]. While this mechanism is of utmost importance in the United States, the protection of whistle-blowers is only slowly being introduced in Europe[5].
With numerous scandals related to systemic violations of human rights, the subject is progressively being addressed in the European Union (EU) and in the Council of Europe. Nevertheless, in both organisations, the protection of whistleblowers remains at the stage of proposals or mere recommendations to the states.

The Council of Europe…

Do Facebook and the USA violate EU data protection law? The CJEU hearing in Schrems

Sunday, 29 March 2015
by Simon McGarr, solicitor at McGarr solicitors (*)

Last week, the CJEU held a hearing in the important case of Schrems v Data Protection Commissioner, which concerns a legal challenge brought by an Austrian law student to the transfers of his personal data to the USA by Facebook, on the grounds that his data would be subject to mass surveillance under US law, as revealed by Edward Snowden. His legal challenge was actually brought against the Irish data protection commissioner, who regulates such transfers pursuant to an agreement between the EU and the US known as the ‘Safe Harbour’ agreement. This agreement takes the form of a Decision of the European Commission made pursuant to the EU’s data protection Directive, which permits personal data to be transferred to the USA under certain conditions. He argued that the data protection authority has the obligation to suspend transfers due to breaches of data protection standards occurring in the USA. (For more detail on the background to the case, see the discussion of the original Irish judgment here).

The following summarises the arguments made at the hearing by the parties, including the intervening NGO Digital Rights Ireland, as well as several Member States, the European Parliament, the Commission and the European Data Protection Supervisor. It then sets out the question-and-answer session between the CJEU judges (and Advocate-General) and the parties. The next step in this important litigation will be the opinion of the Advocate-General, due June 24th.

Please note: these notes are presented for information purposes only. They are not an official record or a verbatim account of the hearing. They are based on rough contemporaneous notes and the arguments made at the hearing are paraphrased or compressed. Nothing here should be relied on for any legal or judicial purpose, and all the following is liable to transcription error.

Schrems v Data Protection Commissioner
Case C-362/14
M. V. Skouris (President); M. K. Lenaerts (Vice-President); M. A. Tizzano; Mme R. Silva de Lapuerta; M. T. von Danwitz (Judge Rapporteur); M. S. Rodin; Mme K. Jürimäe; M. A. Rosas; M. E. Juhász; M. A. Borg Barthet; M. J. Malenovský; M. D. Šváby; Mme M. Berger; M. F. Biltgen; M. C. Lycourgos
M. Y. Bot (Advocate General)

Max Schrems

Noel Travers SC for Mr. Schrems told the court that personal data in the US is subject to mass and indiscriminate surveillance. The DRI v Ireland case struck down the EU data retention directive, establishing a principle which applies a fortiori to this case. However, the court held that data retention did not affect the essence of the right under Article 8, as it concerned only metadata. The surveillance carried out in the US accesses the content of data as well as the metadata, and without judicial oversight. This interference is so serious that it does violate the essence of Article 8 rights, unlike the data retention directive. Mr. Travers argued that the Safe Harbour decision is contrary to the Data Protection directive’s own stated purpose, and that it was accordingly invalid.
Answering the Court’s question as to whether the decision precludes an investigation by a Data Protection Authority (DPA) such as the Irish Data Protection Commissioner, he submitted that compliance with fundamental rights must be part of the implementation of any Directive. Accordingly, national authorities, when called upon in a complaint to investigate breaches must have the power to do so.
Article 25.6 of the data protection Directive allows for findings on adequacy regarding a third country “by reason of its domestic law or of the international commitments it has entered into”. The Safe Harbour Principles (SHPs) and FAQs are not a law or an international agreement under the meaning of the Vienna Convention. And the SHPs do not apply to US public bodies. The Safe Harbour Principles are set out in an annex to a Commission Decision, but that annex is subject to US courts for interpretation and for compliance. Where there is a requirement for compliance with law, it is with US law, not EU law.

Irish Data Protection Commissioner

For the Data Protection Commissioner, Mr. Paul Anthony McDermott said that with power must come limitations. All national regulators are firstly bound by domestic law. The Data Protection Commissioner is also bound by the Irish Constitutional division of powers. She cannot strike down laws, Directives or a Decision.
Mr. Schrems wanted to debate Safe Harbour in a general way; it wasn’t alleged then that Facebook was in breach of Safe Harbour or that his data was in danger. The Irish High Court had a limited judicial review challenge in front of it. Mr. Schrems didn’t challenge Safe Harbour, or the State, or EU law directly, and the Irish High Court declined the application by Digital Rights Ireland to refer the validity of the Safe Harbour Decision to Luxembourg. Mr. McDermott asked the court to respect the parameters of the case.
Europe has decided to deal with the transfer of data to the US at a European level. The purpose of the Safe Harbour agreement is to reach a negotiated compromise. The words “negotiate”, “adapt” and “review” appear in the Decision. It is clear therefore that a degree of compromise is envisaged. Such matters are not to be dealt with in a court but, as they involve both legal and political issues, by diplomacy and realpolitik.
The Data Protection Commissioner can have regard to the EU Charter of Fundamental Rights when she’s balancing matters but it doesn’t trump everything. It doesn’t allow her to ignore domestic law or European law, Mr. McDermott concluded.



By Sabine Jacques

In mid-January, Julia Reda (Pirate Party MEP) communicated a draft of her report on the implementation of the Information Society Directive (‘InfoSoc Directive’) 2001/29/EC (it’s lengthy, but a summary can be found here). Described as ‘the most progressive official EU document on copyright since the first cat picture was published on the web’, but also as ‘surprisingly extreme’ and even ‘unacceptable’, this report attracted widespread interest and statements of support from various digital rights organisations.

While the report rightly urges an ever more ‘internet-friendly copyright law’, it might have gone too far in relation to parodies. Article 5.3(k) of the InfoSoc Directive currently gives EU Member States the possibility to introduce into their national copyright laws an exception to the exclusive right of reproduction for the purposes of parody, pastiche and caricature (an opportunity seized by the UK, which now includes a parody exception in section 30A CDPA). This provision was interpreted by the Court of Justice of the European Union in the Deckmyn case, which guides national courts in applying the exception to particular facts (for comments on this decision see here, and for the AG’s opinion see here).

At point 17 on page 6 of the report, Julia Reda suggests ‘that the exception for caricature, parody and pastiche should apply regardless of the purpose of the parodic use’. Without further explanation, such a broad exception raises concerns.

The parody exception is an exception to the right-holder’s exclusive right of reproduction. As such, international treaties subject it to the application of the three-step test (Berne Convention art. 9(2); TRIPS Agreement arts. 9(1) and 13; and WCT arts. 1(4) and 10). This test requires any exception in national legislation to be limited to ‘certain special cases, provided that such reproduction does not conflict with a normal exploitation of the work and does not unreasonably prejudice the legitimate interests of the author’. The French authorities’ response appropriately expresses the concern that a parody exception applicable outside any purpose of parody is unlikely to meet the first step of ‘certain special cases’. This requirement means that a shapeless provision exempting a broad series of uses is not tolerable; it reflects the need for legislators to reconcile opposing interests.

The exception for the purpose of parody, caricature or pastiche aims to give parodists the possibility to copy copyrighted works in limited circumstances. The current parody exception is the result of a compromise in light of the objectives underlying it. The issue pits the interests of right-holders (who are entitled to be rewarded for their creations) against those of users (who need to reproduce prior works to create new works). Removing its purpose requirement is likely to produce a shapeless exception incompatible with these international obligations.

Yet, La Quadrature du Net interprets Julia Reda’s proposal as: ‘to admit the parody exception for non-humorous creations’. If this is her aim, this could be achieved through the current wording of the exception for the purpose of parody.

The Court of Justice of the European Union has defined ‘parody’ through its requirements in Deckmyn. At para 20, the Court notes that a parody needs: ‘to evoke an existing work while being noticeably different from it, and, secondly, to constitute an expression of humour or mockery’.

The expression of humour or mockery does not exclude the expression of criticism. Even though the parodist is required to have a humorous intent, it is suggested that a broad interpretation should prevail, so as to include playful expression, homage or serious expression (a glimpse at French case law, which has a long history of applying the parody exception, shows evidence of serious expression and the inclusion of satire). The limit is that the expression should not be prejudicial to the person of the author or his work(s). A failure to meet this requirement enables the right-holder to enforce his or her moral rights (especially the integrity right). Additionally, where an individual is defamed, that person can bring an action under defamation law.

Also, the primary justification for the introduction of a parody exception is to facilitate the exercise of one’s freedom of expression. While freedom of expression is already considered in the current InfoSoc Directive (Recital 3 reads: ‘The proposed harmonisation will help to implement the four freedoms of the internal market and relates to compliance with the fundamental principles of law and especially of property, including intellectual property, and freedom of expression and the public interest.’) and in the interpretation of the parody exception in Deckmyn (at para 25), the report (recitals C and D) confirms the importance of the relationship between copyright and related rights and freedom of expression, both protected under the Charter of Fundamental Rights of the European Union (enshrined in articles 17(2) and 11 respectively).

That said, the concerns expressed by Julia Reda regarding the likelihood of achieving harmonisation of the exceptions throughout the EU under the current InfoSoc Directive (at 10) are shared. Additionally, her wish to make copyright exceptions mandatory (at 11) is welcome and would certainly contribute to the desired harmonisation.

To conclude, it must be remembered that this report is merely a draft. It will now be handed over to the Legal Affairs Committee and to the Internal Market and Culture Committees. Overall, the report makes important proposals, but there is still room for improvement. Against this backdrop, care must be taken with the details of each provision, such as the parody exception, to ensure that an exception applicable outside parodic uses does not disrupt the desired balance between the interests of right-holders and parodists.