Platform Liability under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match
Forthcoming in GRUR International 2021
50 Pages
Date Written: January 30, 2021
The Directive on Copyright in the Digital Single Market (CDSM Directive) introduced a paradigm shift with regard to the liability of certain platforms in the European Union. Under the safe harbour rules of the Directive on electronic commerce (E-Commerce Directive), intermediaries in the EU were shielded from liability for acts committed by their users through their services, provided they had no knowledge of them. Although platform operators could be required to help enforce copyright online by taking down infringing content, the E-Commerce Directive also drew a very clear line: intermediaries could not be obliged to monitor all communications of their users or to install general filtering mechanisms for this purpose. The Court of Justice of the European Union (CJEU) confirmed this in a series of cases, among other reasons because such filtering would restrict the fundamental rights of platform operators and of users of intermediary services. Twenty years later, the regime for online intermediaries in the EU has fundamentally shifted with the adoption of Article 17 CDSM Directive, the most controversial and hotly debated provision of this piece of legislation. For a specific class of online intermediaries called “online content-sharing service providers” (OCSSPs), uploads of infringing works by their users now result in direct liability, and OCSSPs are required to undertake “best efforts” to obtain authorization for such uploads. With this new responsibility come further obligations, which oblige OCSSPs to make best efforts to ensure that works for which they have not obtained authorization are not available on their services. How exactly OCSSPs can comply with this obligation is still unclear. However, it seems unavoidable that compliance will require them to install measures such as automated algorithmic filtering (so-called “upload filters”) to prevent users from uploading unlawful content.
Given the scale of the obligation, there is a real danger that measures taken by OCSSPs in fulfilment of it will amount to expressly prohibited general monitoring. What seems certain, however, is that automated filtering, whether general or specific in nature, cannot distinguish appropriately between illegitimate and legitimate uses of content (e.g. uses covered by a copyright limitation). Hence, there is a serious risk of over-blocking of uses that benefit from strong fundamental rights justifications, such as the freedom of expression and information or the freedom of artistic creativity.
This article first outlines the relevant fundamental rights, as guaranteed under the EU Charter of Fundamental Rights and the European Convention on Human Rights, that are affected by an obligation to monitor and filter for copyright-infringing content. Second, it examines the impact on fundamental rights of the obligations OCSSPs incur under Article 17, which are also analysed and tested with regard to their compatibility with general principles of EU law such as proportionality and legal certainty. These are, on the one hand, the obligation to prevent the upload of works for which they have not obtained authorization and, on the other, the obligation to remove infringing content upon notification and to prevent its renewed upload (so-called “stay-down” obligations). Third, the article assesses the mechanisms under Article 17 to safeguard the rights of users of online content-sharing services. The analysis demonstrates that the balance between the different fundamental rights in the normative framework of Article 17 CDSM Directive is a very difficult one to strike, and that overly strict and broad enforcement mechanisms will most likely constitute an unjustified and disproportionate infringement of the fundamental rights of platform operators as well as of users of such platforms. Moreover, Article 17 is the result of hard-fought compromises during the elaboration of the Directive, which led to the adoption of a long provision with complicated wording that is full of internal contradictions. As a consequence, it neither determines with sufficient precision the balance between the multiple fundamental rights affected nor provides for effective harmonization.
These conclusions are of crucial importance for the development of the regulatory framework for platform liability in the EU, since the CJEU will have to rule on the compatibility of Article 17 with fundamental rights in the near future as a result of an action for annulment introduced by the Polish government. Indeed, if certain features of that article are found to be incompatible with the constitutional framework of the EU, this should lead to the removal of certain paragraphs, and possibly even of the entire provision, from the text of the CDSM Directive.
Keywords: Copyright law, Article 17, DSM Directive, Upload Filters, Automated Content Recognition, General Monitoring Obligations
JEL Classification: K39