Article 17 of the Directive on Copyright in the Digital Single Market: A Fundamental Rights Assessment

2020 ◽  
Author(s):  
Julia Reda ◽  
Joschka Selinger ◽  
Michael Servatius
2021 ◽  
Author(s):  
Christophe Geiger ◽  
Bernd Justin Jütte

Abstract The Directive on Copyright in the Digital Single Market (CDSM Directive) introduced a paradigm shift with regard to the liability of certain platforms in the European Union. Under the safe harbour rules of the Directive on electronic commerce (E-Commerce Directive), intermediaries in the EU were shielded from liability for acts committed by their users through their services, provided they had no knowledge of them. Although platform operators could be required to assist in the enforcement of copyright online by taking down infringing content, the E-Commerce Directive also drew a very clear line: intermediaries could not be obliged to monitor all communications of their users or to install general filtering mechanisms for this purpose. The Court of Justice of the European Union confirmed this in a series of cases, not least because such filtering would restrict the fundamental rights of platform operators and of the users of intermediary services. Twenty years later, the regime for online intermediaries in the EU has fundamentally shifted with the adoption of Art. 17 CDSM Directive, the most controversial and hotly debated provision of this piece of legislation. For a specific class of online intermediaries known as ‘online content-sharing service providers’ (OCSSPs), uploads of infringing works by their users now result in direct liability, and OCSSPs are required to undertake ‘best efforts’ to obtain authorization for such uploads. With this new responsibility come further obligations, which require OCSSPs to make best efforts to ensure that works for which they have not obtained authorization are not available on their services. How exactly OCSSPs can comply with this obligation is still unclear. However, it seems unavoidable that compliance will require them to install measures such as automated algorithmic filtering (so-called ‘upload filters’) to prevent users from uploading unlawful content.
Given the scale of the obligation, there is a real danger that measures taken by OCSSPs in fulfilment of their obligation will amount to expressly prohibited general monitoring. What seems certain, however, is that automated filtering, whether general or specific in nature, cannot distinguish appropriately between illegitimate and legitimate uses of content (e.g. uses covered by a copyright limitation). Hence, there is a serious risk of overblocking uses that benefit from strong fundamental rights justifications, such as the freedom of expression and information or the freedom of artistic creativity. This article first outlines the relevant fundamental rights, as guaranteed under the EU Charter of Fundamental Rights and the European Convention on Human Rights, that are affected by an obligation to monitor and filter for copyright-infringing content. Second, it examines the impact on fundamental rights of the obligations OCSSPs incur under Art. 17, which are analysed and tested also with regard to their compatibility with general principles of EU law such as proportionality and legal certainty. These are, on the one hand, obligations to prevent the upload of works for which OCSSPs have not obtained authorization and, on the other, obligations to remove infringing content upon notification and to prevent the renewed upload of the works and protected subject matter concerned (so-called ‘stay-down’ obligations). Third, the article assesses the mechanisms under Art. 17 that safeguard the rights of users of online content-sharing services. The analysis demonstrates that the balance between the different fundamental rights in the normative framework of Art. 17 CDSM Directive is a very difficult one to strike, and that overly strict and broad enforcement mechanisms will most likely constitute an unjustified and disproportionate infringement of the fundamental rights of platform operators as well as of users of such platforms. Moreover, Art. 17 is the result of hard-fought compromises during the elaboration of the Directive, which led to the adoption of a long provision with complicated wording, full of internal contradictions. As a consequence, it neither determines with sufficient precision the balance between the multiple fundamental rights affected, nor provides for effective harmonization. These conclusions are of crucial importance for the development of the regulatory framework for platform liability in the EU, since the CJEU will have to rule on the compatibility of Art. 17 with fundamental rights in the near future as a result of an action for annulment filed by the Polish government. Indeed, if certain features of the article are found incompatible with the constitutional framework of the EU, this should lead to the removal of certain paragraphs and, possibly, even of the entire provision from the text of the CDSM Directive.


2020 ◽  
Vol 12 (1) ◽  
pp. 962
Author(s):  
Gerald Spindler

Abstract: The “upload filter” for online sharing platforms has been one of the hottest issues surrounding the new Directive 2019/790 on copyright and related rights in the Digital Single Market (DSM Directive). The battle continues at the national level over the correct and balanced implementation of Art. 17 DSM-D, in particular regarding the guarantee of freedom of speech. This article explores the liability system of Art. 17 DSM-D and analyses its potential conflict with fundamental EU rights as weighed by the CJEU in SABAM v Netlog and other recent cases, in particular the prohibition of general monitoring duties. Even though one might argue that Art. 17 DSM-D could pass that test, the article develops several implementation options to safeguard user rights and freedom of speech. Keywords: intellectual property, copyright, intermediary platforms, safe harbours, upload filters, digital content, user-generated content, fundamental rights, freedom of speech, liability, e-commerce.


2017 ◽  
Author(s):  
Giancarlo Frosio

As part of its Digital Single Market Strategy, the European Commission would like to introduce vertical regulations, replacing (or rather conflicting with) the well-established eCommerce Directive's horizontal intermediary liability regime. An upcoming revision of the Audiovisual Media Services Directive would require platforms to put in place measures to protect minors from harmful content and to protect everyone from incitement to hatred. Meanwhile, under the assumption of closing a ‘value gap’ between rightholders and online platforms allegedly exploiting protected content, the Draft Directive on Copyright in the Digital Single Market would impose filtering obligations on intermediaries. Finally, the EU Digital Single Market Strategy has also endorsed voluntary measures as a privileged tool to curb illicit and infringing activities online. Each of these actions will erode the eCommerce intermediary liability arrangement by bringing in, in one way or another, proactive monitoring obligations and causing a systemic shift from a negligence-based to a strict liability regime for hosting providers. This shift would apparently occur against public consensus and absent any justification based on empirical evidence. Nonetheless, it will bring about dire consequences by pushing the privatization of enforcement online through algorithmic intelligence, based on murky, privately enforced standards rather than transparent legal obligations. This reform might cause a policy earthquake that will shake and crack EU law's systemic consistency, due process and fundamental rights online.


2017 ◽  
Author(s):  
Giancarlo Frosio

In imposing a strict liability regime for alleged copyright infringement occurring on YouTube, Justice Salomão of the Brazilian Superior Tribunal de Justiça stated that “if Google created an ‘untameable monster,’ it should be the only one charged with any disastrous consequences generated by the lack of control of the users of its websites.” In order to tame the monster, the Brazilian Superior Court imposed monitoring obligations on YouTube. This was not an isolated case. Proactive monitoring and filtering have found their way into the legal system as a privileged enforcement strategy through legislation, judicial decisions and private ordering. In multiple jurisdictions, recent case law has imposed proactive monitoring obligations on intermediaries. These cases uphold proactive monitoring across the entire spectrum of intermediary liability subject matter: intellectual property, privacy, defamation, and hate/dangerous speech. In this context, however, notable exceptions, such as the landmark Belen case in Argentina, also highlight a fragmented international response. Legislative proposals have followed suit. As part of its Digital Single Market Strategy, the European Commission would like to introduce filtering obligations for intermediaries to close a “value gap” between rightholders and online platforms allegedly exploiting protected content. In addition, proactive monitoring and filtering obligations would also feature in an update of European audiovisual media legislation. Meanwhile, online platforms have already set up miscellaneous filtering schemes on a voluntary basis. In this paper, I suggest that we are witnessing the death of “no monitoring obligations,” a well-marked trend in intermediary liability policy. Current Internet policy, especially in Europe, is silently drifting away from a fundamental safeguard for freedom of expression online.
In this respect, this paper contextualizes this trend within the emergence of a broader move towards private enforcement online. The EU Digital Single Market Strategy has apparently endorsed voluntary measures as a privileged tool to curb illicit and infringing activities online. As I have argued elsewhere, the intermediary liability discourse is shifting towards an intermediary responsibility discourse. This process might be pushing an amorphous notion of responsibility that incentivizes intermediaries' self-intervention. In addition, filtering and monitoring will be dealt with almost exclusively by intermediaries through automatic infringement assessment systems. Due process and fundamental guarantees get mauled by algorithmic enforcement, limiting the enjoyment of exceptions and limitations and the use of public domain works, and silencing speech according to the mainstream ethical discourse. The upcoming reform, and the broader move that it portends, might finally slay “no monitoring obligations” and fundamental rights online, together with the untameable monster.


2020 ◽  
Author(s):  
Oliver Gerstenberg

Abstract The EU, in its present configuration, has often been accused of a persistent and deep structural bias in favour of economic integration to the detriment of the democratic and social values of its Member States. In response to that accusation, can the Charter of Fundamental Rights of the EU (CFREU) come to the rescue and be mobilized, ultimately before a judicially activist Court of Justice of the EU (CJEU), as a vehicle of social justice, in an effort to correct that bias and to counterbalance the expansive economic liberties of the European single market? Exploring this question is timely given a clearly discernible new constitutional turn in the jurisprudence of the CJEU's Grand Chamber, especially now under the current presidency of Koen Lenaerts. The ‘Lenaerts Court’, as this article will argue, has embarked on a new EU fundamental-rights jurisprudence, visibly aimed at strengthening the dignitarian-social dimension of EU integration and at adding flesh to the bones of the commitment to a European social market economy in Article 3(3) of the Treaty on European Union (TEU). Yet proposals in support of greater reliance on the substantive, but open-textured, provisions of the CFREU, in pursuit of a ‘fair balance’ between the EU's economic and dignitarian-social dimensions, immediately run into democratic-minded concerns about sovereignty passing from the Member States to the courts, and ultimately to the CJEU itself. The persistent worry is that democratic sovereignty over constitutionally sensitive, but morally and politically divisive, choices is being turned into a ‘sovereignty of law’ in ways that not only risk foreclosing democratic debate over as yet unsettled key societal matters but also give up democratic legitimation as a central element of modern constitutionalism (‘over-constitutionalisation’, Dieter Grimm). Thus, the CJEU is being simultaneously criticized for its alleged economic bias and for its efforts to overcome that bias.
In an effort to address—and disarm—this democratic-minded concern, this article argues that judicial emphasis on the CFREU’s dignitarian-social values need not per se lead to the consequence of over-constitutionalisation. Rather, this article proposes to look at the Grand Chamber’s new fundamental-rights jurisprudence in the single-market context as creating a framework for plural and inclusive democratic deliberation on key societal choices and values. To that end, the article proposes a new reading of the Grand Chamber’s jurisprudence on the efficacy of fundamental rights in the economic sphere and, in particular, on the horizontal direct effect of CFREU rights.


Author(s):  
Oreste Pollicino ◽  
Giovanni De Gregorio

The role of online intermediaries has changed since the adoption of the e-Commerce Directive in 2000. The implementation of artificial intelligence technologies in online content management has challenged the originally passive role of online service providers in relation to third-party content. As a result, the EU strategy has shifted from a liberal approach, meant to ensure the development of new digital services without overwhelming ISPs with monitoring and removal obligations, towards the regulation of online content management activities. The threats to fundamental rights deriving from opaque decision-making processes over online content have overtaken the traditional narrative of ISPs' freedom to conduct their business. The result of this process is a new regulatory phase within the framework of the Digital Single Market strategy.

