I. Introduction
Online platforms provide the main points of access to information and other content in the digital age, whether through “search engines, social networks, micro-blogging sites or video-sharing platforms”.Footnote 1 Although these platforms bring economic and social benefits, they also enable the unprecedented spread of illegal content, including incitement to terrorism, hate speech and copyright infringement.Footnote 2 This is particularly true for so-called Big Tech platforms – such as Facebook and YouTube – that have amassed significant power over online speech and commerce in the past decade.Footnote 3
Against this background, the European Union (EU) and its Member States have proposed and adopted a growing number of laws and policies to regulate online content, with a focus on enhancing the responsibility of hosting platforms for user-uploaded illegal content.Footnote 4 The centrepiece of the European Commission’s (EC) strategy in this respect was published on 15 December 2020: the proposal for a Regulation on a Single Market for Digital Services (Digital Services Act; DSA),Footnote 5 which amends the e-Commerce DirectiveFootnote 6 for certain Internet intermediaries. The DSA carries out a regulatory overhaul of the twenty-one-year-old horizontal rules on intermediary liability in the e-Commerce Directive.
In this article, we use doctrinal legal analysis to look at how the DSA’s rules interplay with sector-specific, lex specialis rules. This question is relevant both for specific EU legislation, such as on copyrightFootnote 7 and terrorist content,Footnote 8 as well as for national sector regulation. The focus of our legal analysis, however, is on online platforms and copyright-protected material.Footnote 9
With regard to copyright-protected material, Article 17 of the Directive on Copyright in the Digital Single Market (CDSM Directive), which preceded the DSA, establishes a new liability regime for online content-sharing service providers (OCSSPs). These rules had to be implemented by EU Member States by 7 June 2021.Footnote 10 Both Article 17 CDSM Directive and multiple provisions of the DSA impose obligations on how online platforms deal with illegal information. Whereas Article 17 CDSM Directive targets copyright-infringing content, the DSA proposal targets illegal content in general, including that which infringes copyright.
This raises the question of how the two frameworks will interact once both enter into force. Besides the different natures of the legal instruments (Regulation versus Directive), this question is of high relevance, first and foremost where the frameworks differ. At first sight, these regimes may not appear to overlap since Article 17 CDSM Directive is lex specialis to the DSA. A closer look, however, reveals a much more complex picture. The proposed DSA regulation is complementary to Article 17 CDSM Directive and imposes a number of additional obligations on online platforms that qualify as OCSSPs. But the extent to which these obligations apply – and in some cases whether they do apply – is unclear. This article examines and maps this underexplored intersection between the CDSM Directive and the DSA.Footnote 11 Our legal analysis illuminates a point that has so far received little attention: the extent to which the DSA provides a new regulatory approach to online platforms through horizontal rules that extend to most corners of EU law, even where that reach appeared precluded or limited by specific legislation to be implemented at the national level. In the case of copyright, the issue is especially complex due to its territorial nature, leading to a multi-layered enforcement problem.Footnote 12 Varying national implementations of Article 17 CDSM Directive, which are outside the scope of this article, would further complicate matters.
This article carries out a doctrinal analysis of this particular legal question. To be sure, there are additional legal and empirical angles from which to address the underlying objective of the legal instruments under analysis. As noted, both the CDSM Directive (in a more targeted, sector-specific manner) and the DSA (in a general horizontal approach) aim to curb the increasing power and “digital dominance” of Big Tech companies, primarily by subjecting them to additional liability and obligations for the illegal (and even the harmful) content they host.Footnote 13 Our analysis only captures a small part of the regulatory and normative complexity involved in this task.
On the one hand, some of the legal solutions to curb platform power are found elsewhere in proposals that attempt to address the anti-competitive practices of Big Tech as so-called “gatekeepers”, such as the Digital Markets Act.Footnote 14 On the other hand, much of the power enjoyed by these platforms results from content-moderation rules, technologies and processes adopted by “platforms” proper (ie a form of private ordering). This type of regulation can fit into two broad categories. First are terms of service and similar documents (community guidelines, etc.) adopted by platforms, referred to by some authors as “platform law”.Footnote 15 In EU law, this would include, for instance, what is covered by the definition of “terms and conditions” in the proposed DSA.Footnote 16 Second, regulation of platforms can be carried out through technological devices or code, such as in the case of algorithmic moderation systems (eg for filtering of illegal content).Footnote 17 Big Tech platforms have long developed complex terms and conditions and content-recognition systems or tools that de facto govern their treatment of the illegal and harmful content they host.Footnote 18 In the particular case of copyright-protected content, these systems are perhaps at their most developed, and they include well-known examples such as YouTube’s suite of copyright management tools – most notably ContentID – and Facebook’s Rights Manager.Footnote 19 This is not surprising, since “[e]mpirically, copyright law accounts for most content removal from online platforms, by an order of magnitude”.Footnote 20 Outside the sphere of copyright, platforms mostly use different content-recognition tools for separate types of illegal or harmful/undesirable content (eg terrorism, violence, “toxic speech”, child abuse, sexual content, spam) that, in simple terms, “match content to known images, text or video in a database and classification tools which can classify new images as part of pre-defined categories”.Footnote 21 Although our analysis makes reference to some of these aspects, it focuses on the legal question above and can therefore only offer a modest contribution to this debate.
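The database-matching approach described above can be illustrated with a minimal sketch. The following example is purely illustrative and not any platform’s actual implementation: all names are our own assumptions, and real systems such as ContentID rely on perceptual or audio fingerprints that also match transformed or partial copies, whereas a cryptographic hash only matches exact copies.

```python
import hashlib
from typing import Optional

# Illustrative reference database mapping fingerprints of known works to
# rights-holder metadata (hypothetical entries for demonstration only).
REFERENCE_DB = {
    hashlib.sha256(b"known copyrighted work").hexdigest(): "Work A (Rightsholder X)",
}

def fingerprint(content: bytes) -> str:
    """Compute a (greatly simplified) fingerprint of uploaded content."""
    return hashlib.sha256(content).hexdigest()

def match_upload(content: bytes) -> Optional[str]:
    """Return the matched reference work, or None if no match is found."""
    return REFERENCE_DB.get(fingerprint(content))

print(match_upload(b"known copyrighted work"))  # matches Work A
print(match_upload(b"original user content"))   # no match: None
```

The sketch captures only the matching half of the quoted description; the classification tools mentioned alongside it would typically be machine-learning classifiers rather than lookups.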
The article proceeds as follows. After this introduction, we provide a snapshot of the complex regime set out in Article 17 CDSM Directive, providing a baseline understanding for the subsequent analysis (Section II). We then move to the heart of our analysis, explaining why and how the DSA liability regime and especially its asymmetric due diligence obligations apply to online platforms that host and provide access to copyright-protected content, despite – and in addition to – the specific rules in Article 17 CDSM Directive, to be implemented into national law in light of Guidance issued by the Commission (Section III).Footnote 22 We conclude with the key findings of our analysis and suggestions for clarifications in the further legislative process (Section IV).
II. OCSSPs and Article 17 CDSM Directive
1. Overview
OCSSPs are a novel concept defined in Article 2(6) CDSM Directive, with further guidance in Recitals 62 and 63. They are providers of an information society service whose main purpose is to store and give the public access to a large amount of protected content uploaded by their users, provided they organise and promote that content for profit-making purposes. The definition also contains a number of exclusions covering services that either do not primarily aim at giving access to copyright-protected content and/or are primarily not for-profit (eg service providers such as Skype, Dropbox, eBay, Wikipedia, ArXiv.org and GitHub).Footnote 23
While this concept is new to the copyright acquis, OCSSPs do not appear to constitute a wholly new category of service providers in a technological or business sense. Rather, this is a new legal category covering a type of provider of hosting services whose activities or functions were previously regulated in different legal instruments, such as the e-Commerce Directive,Footnote 24 the InfoSoc DirectiveFootnote 25 and the Enforcement Directive.Footnote 26 Figure 1 represents this relationship.
Article 17 is an extremely complex legal provision. As Dusollier notes, it is the “monster provision” of the Directive, “both by its size and hazardousness”.Footnote 27 There is perhaps no better testament to this than the wealth of legal scholarship that already exists on Article 17, even before its national implementation deadline.Footnote 28
In simple terms, Article 17 states that OCSSPs carry out acts of communication to the public when they give access to works/subject matter uploaded by their users. As a result, these providers become directly liable for their users’ uploads. They are also expressly excluded in paragraph (3) from the hosting safe harbour for copyright-relevant acts, which was previously available to many of them under Article 14(1) e-Commerce Directive. Arguably, this makes Article 17 lex specialis to the e-Commerce Directive.Footnote 29
The provision then introduces a complex set of rules to regulate OCSSPs, including a liability-exemption mechanism in paragraph (4) and a number of what can be referred to as mitigation measures and safeguards.
The liability-exemption mechanism is composed of best-efforts obligations for preventative measures, including those aimed at filtering content ex ante, at notice and stay-down and at notice and takedown.Footnote 30 In particular, Article 17(4) establishes three cumulative conditions for this liability-exemption mechanism. The first condition is that OCSSPs must demonstrate that they have made best efforts to obtain an authorisation.Footnote 31 If this obligation is met, then OCSSPs are subject to two further cumulative conditions in paragraphs (b) and (c). Namely, they must demonstrate that they have: (1) made best efforts to ensure the unavailability of specific works for which the rights holders have provided them with the relevant and necessary information; and (2) acted expeditiously, subsequent to notice from rights holders, to take down infringing content and made best efforts to prevent its future upload. Condition (1) appears to impose what critics label an upload filtering obligation, whereas Condition (2) introduces both a notice-and-takedown mechanism (similar to that of Article 14 e-Commerce Directive) and a notice-and-stay-down (or re-upload filtering) obligation.Footnote 32
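The cumulative structure of these conditions can be read as a simple decision sequence. The sketch below is a deliberately simplified, non-authoritative model of that logic (the function and flag names are our own illustrative assumptions, and the lighter regime for small and new OCSSPs under Article 17(6) is not modelled):

```python
def liability_exempt(best_efforts_authorisation: bool,
                     best_efforts_unavailability: bool,
                     expeditious_takedown_and_staydown: bool) -> bool:
    """Simplified model of the cumulative conditions in Art. 17(4)(a)-(c).

    All three conditions must be satisfied for an OCSSP to benefit from
    the liability-exemption mechanism.
    """
    return (best_efforts_authorisation              # Art. 17(4)(a)
            and best_efforts_unavailability         # Art. 17(4)(b)
            and expeditious_takedown_and_staydown)  # Art. 17(4)(c)

# Failing any single condition defeats the exemption:
print(liability_exempt(True, True, False))  # False
print(liability_exempt(True, True, True))   # True
```

The point of the sketch is the conjunction: unlike the notice-and-takedown model of Article 14 e-Commerce Directive, no single measure suffices on its own.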
Among the mitigation measures and safeguards that Article 17 includes we find the following: first, the requirements for a proportionality assessment and the identification of relevant factors for preventative measuresFootnote 33 ; second, a special regime for small and new OCSSPsFootnote 34 ; third, a set of mandatory exceptions akin to user rights or freedoms that are designed as obligations of result expressly based on fundamental rightsFootnote 35 ; fourth, a clarification that Article 17 does not entail general monitoring, although without providing much insight into its relation to the prohibition contained in Article 15 e-Commerce DirectiveFootnote 36 ; and fifth, a set of procedural safeguards, including an in-platform complaint and redress mechanism and rules on out-of-court redress mechanisms.Footnote 37
Finally, Article 17(10) tasks the EC with organising stakeholder dialogues to ensure uniform application of the obligation of cooperation between OCSSPs and rights holders and to establish best practices regarding the appropriate industry standards of professional diligence. After much delay, the Guidance from the EC was finally published on 4 June 2021, a mere working day before the transposition deadline of the CDSM Directive on 7 June 2021. The Guidance was adopted as a Communication and is therefore not binding.Footnote 38 Furthermore, as the Guidance itself states, it might have to be reviewed in light of the Court of Justice of the European Union (CJEU) judgment in C-401/19.Footnote 39 In fact, the Opinion of the Advocate General (AG) in that case suggests that key aspects of the Guidance might not be in conformity with fundamental rights.Footnote 40 Still, the Guidance is a rich document that is bound to influence national implementations. This Guidance, we note, only refers to the DSA once.Footnote 41
To be sure, Big Tech platforms such as YouTube (ContentID, Copyright Match Tool and Web FormFootnote 42 ) and Facebook (Rights Manager) already deploy content-recognition tools, including the type of filtering and blocking measures required by Article 17’s liability-exemption mechanism. But this is not necessarily true for the majority of other smaller-scale platforms, which will be required to implement tools obtained from private third-party providers, such as Audible Magic and Pex.Footnote 43 In this sense, an unintended consequence of Article 17 is that it translates into a competitive advantage for bigger OCSSPs over smaller providers.Footnote 44 Additionally, it is clear that the most advanced of current filtering technologies – based on matching, fingerprinting or hashing algorithms – are incapable of recognising the types of uses covered by mandatory copyright exceptions or limitations,Footnote 45 leading to a conflict between different obligations within Article 17.Footnote 46 This conflict is explored below.
2. Normative hierarchy of obligations and safeguards
In light of the above, it is important to further explain the normative hierarchy embedded in Article 17 as well as to provide additional detail on its complaint and redress rules.
Article 17(7) includes a general and a specific clause on exceptions and limitations to copyright. The general clause is contained in the first subparagraph, which states that the obligations in Article 17(4)(b) and (c) should not prevent content uploaded by users from being available on OCSSPs if such an upload does not infringe copyright, including if it is covered by an exception.Footnote 47 The second subparagraph of Article 17(7) CDSM Directive includes a special regime for certain exceptions and limitations: (1) quotation, criticism or review; and (2) use for the purpose of caricature, parody or pastiche.Footnote 48 Additionally, Article 17(9) requires that OCSSPs inform users in their terms and conditions of the user’s right to use works under exceptions.Footnote 49
One key feature of the legal design of Article 17 is that paragraph (7) translates into an obligation of result. That is to say, Member States must ensure that these exceptions are respected despite the preventative measures in Article 17(4). This point matters because paragraph (4) merely imposes “best-efforts” obligations. The different natures of the obligations, underscored by the fundamental rights basis of paragraph (7),Footnote 50 indicate a normative hierarchy between the higher-level obligation in paragraph (7) and the lower-level obligation in paragraph (4). This matters not only for legal interpretation of Article 17 in general, but also for the assessment of content-moderation obligations in this legal regime. For instance, this legal understanding justifies the view that, in order to comply with Article 17, it is insufficient to rely on the ex post complaint and redress mechanisms in Article 17(9). Compliance also requires the maintenance of ex ante safeguards that avoid the over-blocking of uploaded content by the content-filtering technologies used by OCSSPs that are incapable of carrying out the type of contextual assessment required under Article 17(7).Footnote 51
It is on this basis that Poland filed an action for annulment against Article 17 for failure to sufficiently safeguard the right to freedom of expression of users.Footnote 52 In his Opinion, AG Saugmandsgaard Øe delineated the scope of permissible filtering of users’ uploads.Footnote 53 While acknowledging that OCSSPs will have to deploy filtering and content-recognition systems to comply with their best-efforts obligations, the AG relies on the judgment in Eva Glawischnig-Piesczek to argue that any filtering must be “specific” to the content and information at issue, so as not to run afoul of the prohibition of general monitoring obligations in Article 15 e-Commerce Directive (and Article 17(8) CDSM Directive).Footnote 54 However, such filtering must be proportionate and avoid the risk of chilling effects on freedom of expression through over-blocking; in order to do so, it must be applied only to manifestly infringing or “equivalent” content.Footnote 55 All other uploads should benefit from a “presumption of lawfulness” and be subject to the ex ante and ex post safeguards embedded in Article 17, notably judicial review.Footnote 56
In this respect, Article 17(9) further includes certain ex post or procedural safeguards at: (1) the platform level; (2) the out-of-court level; and (3) the judicial or court level.Footnote 57 A few additional remarks are justified on the first two levels.
At the platform level, Member States are mandated to provide that OCSSPs “put in place an effective and expeditious complaint mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them”.Footnote 58 These mechanisms are further circumscribed insofar as complaints “shall be processed without undue delay, and decisions to disable access to or remove uploaded content shall be subject to human review”.Footnote 59 The latter human review criterion implies that everything leading up to a dispute can be processed by the platform in an automated fashion by algorithms.Footnote 60 It is further specified that these mechanisms should allow “users to complain about the steps taken with regard to their uploads, in particular where they could benefit from an exception or limitation to copyright in relation to an upload to which access has been disabled or that has been removed”.Footnote 61
Furthermore, the provision imposes a justification duty on rights holders: the reasons for a rights holder’s request to make content unavailable need to be “duly justified”.Footnote 62 The decision at this level remains with the platform, but as Senftleben notes, “The underlying legal assessment, however, is likely to be cautious and defensive … [and] a generous interpretation of copyright limitations serving freedom of expression seems unlikely, even though a broad application of the right of quotation and the parody exemption would be in line with CJEU jurisprudence”.Footnote 63 In other words, there is a risk of over-enforcement.Footnote 64
In addition to the platform-based procedural safeguards, out-of-court redress mechanisms for the impartial settlement of disputes are also to be put in place by Member States.Footnote 65 This mechanism is “without prejudice to the rights of users to have recourse to efficient judicial remedies …”.Footnote 66 Specifically in relation to exceptions, “Member States shall ensure that users have access to a court or another relevant judicial authority to assert the use of” the same.Footnote 67 Member States enjoy a considerable amount of discretion when implementing these procedural safeguards, and such mechanisms might also be informed by the EC Guidance on Article 17, which provides significant detail on how Member States may implement the safeguards in paragraphs (7) and (9).Footnote 68
III. The interplay between the DSA and the CDSM Directive
Against this background, the DSA proposal was published on 15 December 2020. The DSA is a regulation that is meant inter alia as a “REFIT”Footnote 69 of certain parts of the e-Commerce Directive. Apart from the different legal nature of the proposed instrument – Regulation versus Directive – the DSA has a broader scope than the e-Commerce DirectiveFootnote 70 and sets up a much more detailed procedural framework, which is further explored below.Footnote 71
The proposed DSA is divided into five chapters: general provisions (Chapter I); liability of providers of intermediary services (Chapter II); due diligence obligations for a transparent and safe online environment (Chapter III); implementation, cooperation, sanctions and enforcement (Chapter IV); and final provisions (Chapter V). For the purposes of this article, we are mostly concerned with Chapters I–III.
The liability exemptions in Chapter II largely resemble the system set forth twenty-one years ago in the e-Commerce Directive,Footnote 72 with notable adjustments such as a “Good Samaritan”-like rule,Footnote 73 clarifications on scope in recitalsFootnote 74 and provisions on orders to act against illegal content and to provide information.Footnote 75 Separate from this, the proposal suggests the introduction of asymmetric due diligence obligations in Chapter III, which is a novelty compared to the e-Commerce Directive.
1. Are rules on copyright excluded from the DSA?
In this article, we are interested in the potential overlap between the proposed DSA and Article 17 CDSM Directive. This is visualised in the illustration in Figure 2, which represents the overlaps between the concepts of online platforms, very large online platforms (VLOPs) and OCSSPs. Similar overlaps could be envisaged as regards the relationship of the DSA proposal’s concepts with those of platforms used by providers in other sector-specific areas, such as “video-sharing platform services” in the Audiovisual Media Services Directive (AVMSD)Footnote 76 and “hosting services” used for the dissemination to the public of terrorist content online in the Terrorist Content Regulation.Footnote 77
A preliminary question for our purposes is whether the DSA applies to OCSSPs in the first place. Importantly, the special “copyright” regime for OCSSPs only relates to the copyright-relevant portion of an online platform that qualifies as an OCSSP. Article 17(3) subparagraph 2 CDSM Directive states clearly that the hosting safe harbour of Article 14 e-Commerce Directive – and correspondingly that in Article 5 DSA – still applies to OCSSPs “for purposes falling outside the scope of this Directive”. Consider the example of YouTube, which qualifies as an OCSSP. If the relevant information or content it hosts relates to copyright, Article 17 CDSM Directive applies. If the relevant information, however, relates to hate speech or child sexual abuse material or any other illegal information or content,Footnote 78 the e-Commerce Directive’s – and correspondingly the DSA’s – hosting liability exemption is the place to look. In other words, YouTube would be considered an OCSSP (in the context of copyright) and also a VLOP (in the context of other information; see Figure 3).Footnote 79
In the following, we focus on the copyright aspects. Article 1(5)(c) DSA states that the proposed Regulation is “without prejudice to the rules laid down by … Union law on copyright and related rights”. Supporting Recital 11 adds that the “Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected”. Read alone, this Recital could be understood as the Commission’s view that Article 17 CDSM Directive, in our example, indeed contains the answers to all questions regarding obligations of OCSSPs. In our view, however, “unaffected”Footnote 80 can only relate to aspects that indeed are specifically covered by those rules.
Recital 11 (similar to Recital 10), however, is only a further example of areas of application of the general principle contained in Recital 9, aimed at providing further clarity on the interplay between the horizontal rules of the DSA and sector-specific rules. Recital 9 states that the DSA
… should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services … Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.Footnote 81
The Explanatory Memorandum repeats this text and provides as one example the obligations set out in the AVMSD on video-sharing platform providers as regards audiovisual content and audiovisual commercial communications. It continues that such rules “will continue to apply”, but that the DSA “applies to those providers to the extent that the AVSMD or other Union legal acts, such as the proposal for a Regulation on addressing the dissemination on terrorist content online, do not contain more specific provisions applicable to them”.Footnote 82
Applying this logic to the CDSM Directive, this means that the specific rules and procedures contained in Article 17 for OCSSPs are likely to be considered lex specialis to the DSA.Footnote 83 Conversely, the DSA will apply to OCSSPs insofar as it contains: (1) rules that regulate matters not covered by Article 17 CDSM Directive; and (2) specific rules on matters where Article 17 leaves a margin of discretionFootnote 84 to Member States. As we demonstrate below, whereas Category (1) is more or less straightforward, Category (2) is more challenging. In our view, the changes proposed to the DSA provisions examined above on the relationship between the copyright acquis and the DSA during the legislative process (by the Council and different European Parliament committees) do not affect the validity of our conclusions.Footnote 85 A similar conclusion appears valid for other types of illegal content.
2. Potentially applicable rules
At this stage, it is important to note that the DSA contains a bifurcated approach to regulation. On the one hand, Chapter II sets out a regime for the liability of providers of intermediary services.Footnote 86 This regime distinguishes between functions, namely “mere conduit”, “caching” and hosting. It is in essence a revamped version of the existing rules on liability exemption (also known as safe harbours) and bans on general monitoring in Articles 12–15 e-Commerce Directive.Footnote 87 As noted, the main differences are the addition of a “Good Samaritan”-like rule in Article 6Footnote 88 and provisions on orders to act against illegal content (Article 8) and to provide information (Article 9). On the other hand, Chapter III sets out “horizontal”Footnote 89 due diligence obligations for a transparent and safe online environment.Footnote 90 This regime distinguishes between categories of providers by setting out asymmetric obligations that apply in a tiered way to different categories of providers of information society services. As a starting point, the liability-exemption regime on the one hand and the due diligence obligations on the other are separate from each other. In other words, the availability of a liability exemption is not dependent on compliance with due diligence obligations and vice versa.Footnote 91
In this respect, the DSA retains in Article 2(a) the definition of “information society services” of the e-Commerce Directive that underpins the notion of an information society service provider. For the purposes of due diligence obligations, it then proposes a distinction between four categories of services, from the general to increasingly more specific: (1) intermediary services; (2) hosting services; (3) online platforms; and (4) VLOPs.Footnote 92 These are visualised in Figure 4.
Intermediary services – the broadest category – comprise “mere conduit”, “caching” or hosting services.Footnote 93 Hosting services consist of “the storage of information provided by, and at the request of, a recipient of the service”.Footnote 94 Online platforms are defined as providers of “a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation”.Footnote 95 In simple terms, VLOPs are those online platforms that provide their services to 45 million or more average monthly active recipients of the service in the EU (ie representing 10% of the European population).Footnote 96 In practical terms, only the major user-upload Big Tech platforms operating in the current digital ecosystem – such as YouTube, Facebook or Instagram – would qualify as VLOPs.Footnote 97 Under the asymmetric obligations approach of Chapter III DSA, VLOPs are subject to the highest number of cumulative obligations.Footnote 98 This is justified by the “systemic role” played by such platforms in “amplifying and shaping information flows online” and by the fact that “their design choices have a strong influence on user safety online, the shaping of public opinion and discourse, as well as on online trade”.Footnote 99
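The nesting of these four categories, each narrower tier attracting additional cumulative obligations, can be pictured with a minimal classifier. The 45-million threshold is from the proposal; the function and parameter names below are purely our own illustrative assumptions, and the sketch ignores the proposal’s further qualifications (eg the “minor and purely ancillary feature” carve-out for online platforms):

```python
from typing import List

# Threshold of average monthly active EU recipients for VLOP status
# (Art. 25 DSA proposal).
VLOP_THRESHOLD = 45_000_000

def classify(stores_content: bool, disseminates_to_public: bool,
             monthly_active_eu_recipients: int) -> List[str]:
    """Return the cumulative DSA categories an intermediary falls into."""
    tiers = ["intermediary service"]
    if stores_content:
        tiers.append("hosting service")
        if disseminates_to_public:
            tiers.append("online platform")
            if monthly_active_eu_recipients >= VLOP_THRESHOLD:
                tiers.append("very large online platform (VLOP)")
    return tiers

# A user-upload platform with 50 million monthly active EU recipients
# falls into all four tiers:
print(classify(True, True, 50_000_000))
```

Because the categories are nested, an entity in a narrower tier remains subject to the obligations attached to every broader tier it belongs to.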
In our view, when contrasting the definitions in the DSA and the CDSM Directive, it is clear that the notion of OCSSP covers at least (certain) online platforms and VLOPs, as represented in Figure 4.
In light of this overlap, the legal question that arises is the extent to which the proposed DSA’s liability rules (in Chapter II) and the asymmetric obligations (in Chapter III) apply to OCSSPs as online platforms or VLOPs. Although the analysis below focuses on copyright, it provides a blueprint for a similar examination of the DSA liability regime and obligations that would apply to other sector-specific instruments. For instance, it could help shed light on the articulation of the DSA with the AVMSD, which already imposes certain obligations on video-sharing platform services to protect minors and EU citizens from certain categories of illegal and harmful content,Footnote 101 while attaching “cooperative responsibility to [those] platforms’ organisational control”.Footnote 102
a. DSA liability regime and OCSSPs
In our view, the liability regime in the DSA is partly excluded for OCSSPs. First, the hosting safe harbour (in Article 5 DSA) is meant to replace Article 14 e-Commerce Directive.Footnote 103 As such, its application is set aside by the express reference to it in Article 17(3) CDSM Directive, to the extent that the activities at issue fall within the scope of Article 17 CDSM Directive.Footnote 104
On the other hand, the general monitoring ban in Article 7 DSA, which aims to replace the similar prohibitionFootnote 105 in Article 15 e-Commerce Directive, appears to be untouched by the CDSM Directive. Article 17(8) CDSM Directive merely states that “the application of this Article shall not lead to any general monitoring obligation”. It does not set aside the application of Article 15 e-Commerce Directive, meaning that it can be understood as being of a merely declaratory nature.Footnote 106
Things are, however, less clear for the “Good Samaritan” rule in Article 6 DSA on “voluntary own-initiative investigations and legal compliance”. Given the direct reference to the liability exemptions in the DSA, its application appears to be directly connected (for our purposes) to the specific hosting safe harbour, which does not apply to OCSSPs as per Article 17(3) CDSM Directive. In a narrow reading, this direct connection could be interpreted as precluding Article 6 DSA’s application in the context of OCSSPs. This exact issue will resurface below when we examine due diligence obligations. There may, however, exist good arguments for not taking direct references to the DSA’s liability exemptions as evidence for precluding their applicability, which we explore in detail below.Footnote 107
In the specific context of Article 6 DSA, in any case, its applicability to OCSSPs is further complicated: Article 6 DSA is meant to enable “activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation”. But Articles 17(4)(b) and (c) CDSM Directive already set forth a liability-exemption mechanism requiring OCSSPs to make best efforts to apply preventative measures to ensure the unavailability or removal of copyright-infringing content. These specific rules for OCSSPs would appear to leave little space for voluntary own-initiative investigations by online platforms and, consequently, for the application of Article 6 DSA. As a result, there may be no need to look for interpretations that would include voluntary activities by OCSSPs.Footnote 108
Yet it is conceivable that certain voluntary measures by OCSSPs could go beyond the required “best efforts” and would therefore not trigger liability, provided they remain within the limits imposed by Articles 17(7)–(9) CDSM Directive. This is particularly true in light of the different natures of the instruments at issue (Regulation versus Directive) and the potential for diverging national implementations and interpretations of the concept of “best efforts” in Article 17(4), as already manifested during the implementation process.Footnote 109 This problem of multi-layered, geographically dispersed enforcement is not necessarily solved by the EC’s Guidance either. For instance, when discussing best efforts to obtain an authorisation (a precondition for the liability exemption) in Article 17(4)(a), the EC identifies scenarios, sectors and players in relation to which OCSSPs must proactively seek a licence or react to offered licences. But despite copyright being a territorial right, the geographical scope of a platform’s obligation is far from clear. What is more, the Commission falls back on the conclusion that this obligation should be assessed on a “case-by-case” basis.Footnote 110 In other words, legal uncertainty is likely to arise out of the divergence between national implementations and practices of preventative measures by platforms for copyright-infringing content pursuant to the Directive, on the one hand, and the horizontal EU-wide obligations stemming from the DSA, on the other. Given the complexity of this legal puzzle, it would be important to clarify this relationship during the legislative process. This point has practical consequences for how platforms can and should design their content-moderation systems in light of both the DSA and the CDSM Directive.
Finally, the rules on orders against illegal content and orders to provide information in Articles 8 and 9 DSA may apply to OCSSPs. Article 8 DSA, in particular, sets out a detailed regime not available elsewhere to OCSSPs. To be sure, one could argue that Article 8(3) InfoSoc Directive, as interpreted by the CJEU, already provides specific rules on injunctions. But the latter provision applies only to “intermediaries whose services are used by a third party to infringe a copyright or related right”, a rule that is consistent with Article 14(3) e-Commerce Directive.Footnote 111 In other words, Article 8(3) InfoSoc Directive applies to intermediaries that are not directly liable for the content they host. This is not the case for OCSSPs, who by virtue of the legal regime in Article 17(1) CDSM Directive are directly liable for the publicly available content they host. If so, Article 8 DSA would seem to apply to OCSSPs, opening the thorny question of whether the extensive CJEU case law on the intersection between fundamental rights, copyright enforcement against intermediaries and the prohibition on general monitoring applies to this new reality.Footnote 112
b. What are the due diligence obligations for OCSSPs?
It is beyond the scope of this article to discuss in depth all potential obligations that apply to online platforms and VLOPs in the proposed DSA. Instead, we will focus on selected key obligations that apply to both categories and might be relevant for OCSSPs. This includes certain due diligence obligations for all providers of intermediary services (Articles 10–13), online platforms (Articles 14–24) and VLOPs (Articles 25–33).Footnote 113 In our view, these are also likely to be in practice some of the provisions that may impose additional obligations on providers subject to other sector-specific rules, such as video-sharing platforms in the AVMSD and hosting service providers in the Terrorist Content Regulation.
As a preliminary remark, we see no obstacle to the application to OCSSPs of the general obligationsFootnote 114 that extend to all intermediary services on points of contact, legal representatives, terms and conditionsFootnote 115 and transparency reporting. This includes the obligations set out in Articles 10–13 (with aggravated requirements in Articles 23 and 33 DSA). Furthermore, since Article 17 CDSM Directive focuses on the disabling of illegal information and not the recommendation or promotion of information, the relevant rules in the DSA on recommender systems (Article 29) should also fully apply to OCSSPs.Footnote 116 This conclusion, we note, would be valid for other sector-specific legislative instruments in EU law that regulate only certain content-moderation activities by service providers (eg concrete aspects of notice-and-action) but not recommender systems.
As noted in Section I, private ordering via terms and conditions and automated content-moderation systems is a crucial component of platform power, especially as concerns Big Tech platforms. In this regard, Article 12 DSA is particularly noteworthy.Footnote 117 This provision applies to all intermediary service providers: it aims to increase the transparency of intermediaries’ terms and conditions and to bring their enforcement into direct relation with fundamental rights. In the EC’s proposal, Article 12(1) DSA imposes an information obligation regarding restrictions imposed on users of intermediary services, and this obligation extends to algorithmic decision-making. Article 12(2) DSA then introduces an apparently broad obligation for providers to act in a diligent, objective and proportionate manner when applying and enforcing such restrictions, explicitly linked to the respect of fundamental rights. Furthermore, the provision expands the scope of the obligations beyond illegal content, applying also to content that intermediaries consider harmful or undesirable in their terms and conditions. These horizontal obligations for all providers of intermediary services are welcome, especially as a means to curtail the private ordering power of Big Tech platforms (particularly VLOPs) as well as less visible intermediaries.Footnote 118 However, because the obligations appear too vague to be effective, there are doubts as to whether this provision will be relevant to curtailing the power of platforms in defining the terms of their relationships with users, including how they operationalise their algorithmic content-moderation systems.Footnote 119 Despite these shortcomings, there is undoubtedly some value added in the application of Article 12 DSA to OCSSPs. The reason for this is that Article 17 CDSM Directive is remarkably thin in this respect.
In fact, Article 17(9) merely requires that OCSSPs inform users in their terms and conditions of the user’s right to use works under exceptions, with the EC Guidance adding precious little in this respect.Footnote 120
i. Notice-and-action and statement of reasonsFootnote 121
A trickier question is whether the detailed regimes on notice-and-action (Article 14 DSA) and statement of reasons (Article 15 DSA) are meant to apply to OCSSPs.
As explained above, Articles 17(4)(b) and (c) CDSM Directive set out a specific notice-and-action regime, which includes in paragraph (c) obligations regarding notice-and-takedown as well as notice-and-stay-down.Footnote 122 This could point in the direction of the DSA being excluded here, since the copyright-sector regulation contains rules on the matter. At the same time, however, Article 17 CDSM Directive remains vague about the concrete notice-and-action setup: it merely mentions “a sufficiently substantiated notice” that must originate from rights holders.Footnote 123 In a vacuum, this would, for instance, allow Member States a margin of discretion in regulating the details of such notices. In that line, the recent EC Guidance on Article 17 advances concrete recommendations on the content of such notices, most notably that they follow the 2018 Recommendation on Measures to Effectively Tackle Illegal Content Online.Footnote 124
Thus, it is also arguable that some components of the notice-and-action regime, such as the minimum elements that should be contained in a notice to a platform,Footnote 125 add a level of specificity not found in the lex specialis rules of the CDSM Directive.Footnote 126 Then again, the European landscape for notices already varies today, since some Member States chose to supplement the implementation of the hosting liability exemption in Article 14 e-Commerce Directive with procedural rules whereas others did not. On this point, it is important to remember that the very choice of instrument for the DSA – a Regulation rather than a Directive – was considered necessary to provide legal certainty, transparency and consistent monitoring.Footnote 127 Furthermore, the accompanying Explanatory Memorandum points out that sector-specific instruments do not cover all regulatory gaps, especially with regard to “fully-fledged rules on the procedural obligations related to illegal content and they only include basic rules on transparency and accountability of service providers and limited oversight mechanisms”.Footnote 128 Similarly, Article 1(2)(b) DSA notes that the aim of the Regulation is to set out uniform rules. All of these considerations suggest the application of DSA rules to OCSSPs.
Against this application, the strongest argument we find lies with the nature of the legal instrument and the consideration that the rationale for the vaguer regime of Article 17 CDSM Directive in this regard was precisely to allow some margin of discretion to platforms and rights holders on how to define the content of notices for the specific subject matter of copyright. In that line, such a margin would be more adequately filled by national implementations pursuant to the recommendations of the EC Guidance rather than by application of the DSA.Footnote 129
But there is an inherent tension (if not a contradiction) in this argument between allowing for the margin of discretion at the national level inherent to the nature of a Directive and the desire to claw back much of that discretion via the EC’s extensive Guidance on Article 17. In fact, the Guidance aims not only at a legally “coherent and consistent” transposition of the provisions across the EU, but also at assisting “market players” in complying with national laws in this area.Footnote 130 To this effect, for instance, the EC identifies standards for content-recognition tools for different types of providers, incentivises the standardisation of reporting information, encourages the development of registries of rights holders and protected content and establishes rules and thresholds for what types of content may and may not be subject to filtering measures.Footnote 131 In other words, the Guidance enables both a much more uniform implementation of Article 17 obligations by Member States and allows OCSSPs – especially those larger platforms that qualify as VLOPs – to provide identical offers across the EU in compliance with Article 17. For instance, in the case of YouTube, it would be more sensible to adjust its EU-wide services and systems (eg ContentID, Copyright Match Tool and Webform) to apply consistently on a cross-border basis and to ensure compliance with the most developed and sophisticated national implementation of Article 17, which would likely be German law.Footnote 132 The important consequence of this development, for our purposes, is that it facilitates an alignment of the horizontal DSA rules, particularly those applicable to VLOPs, with sector-specific copyright rules, going some way towards addressing the multi-layered enforcement problem arising from the overlapping obligations for platforms stemming from a Directive versus a Regulation.
In any case, the definite answer to the question on the application of Article 14 DSA to OCSSPs also depends on the legal nature of the provision: is it to be understood as a supplement to the specific hosting liability exemption in Article 5 DSA or as a due diligence obligation applicable to hosting services more broadly? On the one hand, it is clear that due diligence obligations are to be seen as separate from liability exemptions. The (non-)compliance with due diligence obligations does not affect the hosting safe harbour and vice versa. On the other hand, this distinction between safe harbours and due diligence obligations is blurred by the – we think problematic and probably unintended – effect that a notice is suggested by the Commission’s proposal to have on the actual knowledge of a hosting service.Footnote 133 Since Article 14(3) DSA makes direct reference to the hosting liability exemption in Article 5 DSA, at least paragraph (3) of Article 14 DSA may not directly apply to OCSSPs.
A similar reasoning applies to the rules on statements of reasons (Article 15 DSA), which apply to the justification provided by platforms to users regarding decisions to remove or disable access to specific items of information. In the scheme of Article 17 CDSM Directive, users appear to be informed about these reasons only through a complaint and redress mechanism. Under Article 17(9), rights holders “shall duly justify the reasons for their [removal] requests” to OCSSPs, who will then make a decision on removal or disabling. There are no explicit rules on whether, when and how these decisions are communicated to users, which suggests that there is ample margin for the application of the specific rules set out in Article 15 DSA. In practice, this would mean that platforms would have to extend their reporting systems for other types of illegal content also to copyright-infringing content in this regard.
ii. Internal complaint mechanism and out-of-court dispute settlement process
In the context of online platforms, Articles 17 and 18 DSA set forth a detailed internal complaint mechanism as well as an out-of-court dispute settlement process. Article 17 CDSM Directive also mandates such mechanisms in paragraph (9) for the specific genus of OCSSPs but in a much less detailed fashion. In various forms, both the DSA and Article 17 CDSM Directive stipulate that such internal complaint mechanisms need to be effective, to be processed within a reasonable timeframe (undue delay/timely manner) and to involve some form of human review. The DSA, however, is more detailed, and it includes, for instance, a requirement of user-friendliness and a minimum period for filing such complaints of six months following the takedown decision.
Thus, the question is again whether the DSA is able or intended to “fill” the holes that the lex specialis regulation in the CDSM Directive left open. First, even if this question is answered in the negative, it could be argued that Articles 17 and 18 DSA – in the view of the EU lawmaker – represent the archetypes of “effective and expeditious” mechanisms. Complaint and redress mechanisms should therefore be modelled after the horizontal DSA example where the CDSM Directive falls short. In our view, this is a normatively desirable outcome in line with the aims of the DSA.Footnote 134
Second, we should not forget that OCSSPs are not relevant from a copyright perspective only. If a video on YouTube contains illegal hate speech, the notice-and-action mechanism (and subsequent redress mechanisms) would not fall under the regime of Article 17 CDSM Directive, but rather that of the e-Commerce Directive and the future DSA.Footnote 135 Having various similar but different redress mechanisms for the very same platform, depending only on the legal regime governing the content at issue (copyright, personal data, hate speech, etc.), can hardly be in the interest of the lawmaker,Footnote 136 OCSSPs, Internet users or other stakeholders. This strongly argues in favour of applying the DSA rules consistently to all platforms.
This is especially true for Big Tech platforms, who have developed complex complaint and redress mechanisms as part of their content-moderation systems for different types of illegal and harmful content, with the result of obfuscating users’ ability to obtain effective redress for their complaints.Footnote 137 In particular for copyright-infringing content, the systems put in place by larger platforms such as YouTube have meant in practice that complaint and redress mechanisms are rarely used by users.Footnote 138 In this regard, it is noteworthy that Article 17 CDSM Directive lacks some of the DSA safeguards, such as the presence of a body like the Digital Services Coordinator and a clear obligation of reinstating content in Article 17(3) DSA as a countermeasure to over-blocking.Footnote 139 Combined with a limitation of filtering measures to “manifestly infringing content” endorsed by the EC Guidance on Article 17 and the AG in Case C-401/19,Footnote 140 these overlapping obligations of the DSA and the CDSM Directive would better protect the freedom of expression of users by influencing platforms’ design of these mechanisms, adding external oversight and increasing users’ in-platform redress avenues.Footnote 141
A counterargument would be that such a differentiated approach is justified in light of the specific character of the rights concerned. The question then is: what part of substantive copyright law would prescribe a different treatment for the complaint handling of copyright-related content takedowns? The immediate starting point for such a special place for copyright at the heart of the EU acquis would be its protection in Article 17(2) Charter of Fundamental Rights of the European UnionFootnote 142 and the high level of protection set out in the recitals of the InfoSoc Directive and emphasised time and again by the CJEU.Footnote 143 In our view, however, such high-level protection can hardly be undermined by safeguarding complaint mechanisms. These complaint mechanisms only become relevant once content has been taken down and a potential infringement of the protected rights has been prevented. Instead, redress mechanisms relate inter alia to users’ fundamental rights (vis-à-vis a platform’s right to conduct a business). Consequently, we argue that Articles 17 and 18 DSA should apply to OCSSPs to fill the gaps left open by the vaguer rules on complaint and redress in Article 17(9) CDSM Directive. As noted, this would have the result of forcing platforms that qualify as OCSSPs and VLOPs to align their copyright redress mechanisms with their remaining illegal content-moderation systems covered by the DSA, thereby increasing their level of procedural ex post safeguards in this area.
iii. Trusted flaggers/notifiers and measures against misuse
Another noteworthy novelty relates to the obligation for online platforms to collaborate with certain trusted flaggers/notifiers in Article 19 DSA. A trusted notifier is “an individual or entity which is considered by a hosting service provider to have particular expertise and responsibilities for the purposes of tackling illegal content online”.Footnote 144 Despite the regime in Article 17 CDSM Directive, we expect trusted flaggers to play an important role on OCSSPs in the flagging of copyright-protected material for the foreseeable future. In fact, in the context of larger OCSSPs, trusted flaggers/notifiers already play a crucial but often opaque role in the privatisation of online content (copyright and other) moderation and enforcement.Footnote 145
Recital 46 DSA, for example, notes that for “intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions”. Once again, however, Article 19(1) DSA connects itself directly to the notice-and-action mechanism in Article 14 DSA, meaning that this regime could apply only to those online platforms that are not OCSSPs. Thus, the applicability of Article 19 in the context of OCSSPs depends at least to some extent on whether the notice-and-action mechanism applies to OCSSPs, as discussed above.Footnote 146
In the field of copyright and OCSSPs, rights holders may have an interest in online platforms being obliged to collaborate with certain trusted notifiers. Already today, however, trusted flagger arrangements are common occurrences, at least on larger-scale online platforms such as YouTube or Facebook.Footnote 147 The notable twist of the DSA is that trusted flagger status is awarded by the relevant Digital Services Coordinator of the Member States if certain requirements are met.Footnote 148 Furthermore, the platform is obliged to inform the Coordinator if a trusted flagger submits “a significant number of insufficiently precise or inadequately substantiated notices”.Footnote 149 Ultimately, then, the trusted flagger status can be revoked.Footnote 150 In light of uncertainty around the data quality of copyright notices, such oversight could also be of particular importance in the context of OCSSPs.Footnote 151
But even if Article 19 DSA indeed were not applicable to OCSSPs, it is important to note that already the non-binding Recommendation (EU) 2018/334 on measures to effectively tackle illegal content online encourages platforms to voluntarily collaborate with trusted flaggers.Footnote 152 Similarly, nothing in the DSA prevents “voluntary” trusted notifier arrangements. However, these would be outside the scope of Article 19 and therefore outside the supervision of the Digital Services Coordinator.Footnote 153 This apparent gap is, however, at least partly tackled by Article 20 DSA.
Article 20 DSA on measures and protection against misuse contains two main obligations: first, to suspend the accounts of users who “frequently provide manifestly illegal content”;Footnote 154 and second, to suspend the processing of notices and complaints submitted by individuals, entities or complainants who “frequently submit notices or complaints that are manifestly unfounded”.Footnote 155 In our view, the proposed Article 20 is central to mitigating misuse both by users and by any type of flagger – probably excluding, at least partly, “trusted flaggers” (regulated by Article 19), but including flaggers covered by “voluntary” trusted notifier arrangements with platforms.
However, Article 20(2) DSA again directly references Articles 14 and 17 DSA. Thus, for the application of Article 20 to OCSSPs, the central question is once more whether Article 14 and (at least part of) Article 17 DSA apply notwithstanding the lex specialis of Article 17 CDSM Directive.
The issue of users repeatedly uploading illegal content is as relevant for OCSSPs as it is for other online platforms. Likewise, the misuse of notices and complaints is a concern on OCSSPs. Articles 17(7) and (9) subparagraph 3 CDSM Directive require that the copyright regime must not lead to the unavailability of non-infringing works, without, however, explicitly putting in place any protection against misuse. In the absence of specific regulation, we argue that Article 20 DSA should be fully applicable to copyright misuse. This provision is also central for voluntary arrangements (eg trusted notifiers falling outside the regime set forth in Article 19 DSA), to which we equally argue that it is fully applicable. For reasons of legal certainty, it would be desirable to clarify the wording of Article 20 DSA during the legislative process to state this unequivocally.Footnote 156
iv. Additional obligations on VLOPs
Finally, VLOPs are subject to certain specific due diligence obligations, inter alia risk assessment (Article 26 DSA) and risk mitigation (Article 27 DSA).Footnote 157 The functioning and use made of the services of very large OCSSPs (eg YouTube, Facebook and Instagram) might come with systemic risks, such as the “dissemination of illegal content” (including copyright infringement) or “negative effects for the exercise” of fundamental rights (including freedom of expression). Since the CDSM Directive in no way addresses these issues, we do not see any argument that precludes the application of Articles 26 and 27 DSA to VLOPs that are also OCSSPs.Footnote 158 The same reasoning holds for other relevant obligations, such as data access and transparency.Footnote 159
IV. Conclusions
In this article, we have looked at the (potential) relationship between the horizontal DSA rules and the sector-specific rules for OCSSPs in Article 17 CDSM Directive from a legal doctrinal perspective. Rules on copyright – vis-à-vis other forms of information (or content) – appear to have a special place in the EU legal order. Meanwhile, the EC has provided (internally) some insights on their interpretation in a presentation to the Council Working Party on Intellectual Property (Copyright).Footnote 160 In that presentation, the Commission reminds us that the “DSA is not an IPR enforcement tool” given its general and horizontal nature, but that it “includes a full toolbox which can be very useful for the enforcement of IPR [intellectual property rights]”, which would apply “without prejudice to existing IPR rules”. Notably, however, the Commission considers that Article 17 CDSM Directive remains “unaffected; i.e., DSA rules on limited liability, notice and action, redress and out of court mechanism [are] not applicable for [OCSSPs]”. Our analysis of the DSA proposal leads to a different conclusion, painting a more complex picture.
In our view, the reference in the DSA to “unaffected” does not mean its horizontal rules would not supplement those in Article 17 CDSM Directive, especially as regards notice-and-action or redress mechanisms.Footnote 161 Rather, on the basis of the available proposal and the amendments thus far, we argue that the DSA will probably apply to OCSSPs insofar as it contains: (1) rules that regulate matters not covered by Article 17 CDSM Directive; and (2) specific rules on matters where Article 17 leaves a margin of discretion to Member States.
Category (1) applies to some provisions in the liability framework rulesFootnote 162 of the DSA and most clearly to procedural obligations. This makes sense since, in our view, the special role of copyright, as noted above, may only be related to substantive copyright law. But the DSA’s due diligence obligations we have examined relate to information requirements, quality assurances regarding notices and procedural safeguards for ex post control with a view to, for instance, reinstating non-infringing content. In this light, we find no strong argument for why EU copyright law would require a full exemption from procedural obligations set out for online platforms in the DSA. In fact, the very character of the proposed DSA (and the e-Commerce Directive that precedes it) is to provide broad and horizontal rules for a level playing field. Where no more specific regulation of Article 17 CDSM Directive applies, the asymmetric due diligence obligations of the DSA should apply.
The situation is trickier for Category (2), which relates to areas where Article 17 CDSM Directive does in fact provide for some degree of regulation, and the extent to which it pre-empts more detailed rules in the DSA is uncertain. The situation is further complicated by the Commission’s Guidance on Article 17, despite its non-binding character. In our view, in any case, the logical approach appears to be to consider the CDSM Directive’s regulation as lex specialis. However, where this lex specialis does not contain specific or more detailed regulation (or an explicit exemption from the general rules), the horizontal rules of the DSA would apply once it comes into force.Footnote 163 This is despite the different natures of the legal instruments at issue (Directive versus Regulation), the territorial nature of copyright and the potential issues arising therefrom according to the perspective of multi-layered enforcement. These problems may be attenuated by the harmonising effect of the EC Guidance on Article 17 on Member States’ laws and OCSSP practices.
From a normative standpoint, we understand the DSA’s due diligence obligations as “first principles” of how Internet intermediaries – and most notably platforms and VLOPs – must “behave”, and how the competing fundamental rights of the involved parties can be balanced. In other words, the DSA’s due diligence obligations should be viewed as the horizontal fall-back regime that would only be altered by more specific lex specialis rules. That is to say, as a horizontal framework, the DSA sets out the default legal regulation for the intertwined relations of platforms, users and rights holders.Footnote 164 As such, even in the presence of specific non-exhaustive sector regulation, the DSA rules should remain applicable unless they are clearly set aside by the lex specialis.
In this light, our analysis identifies several rules in the DSA proposal that should apply to OCSSPs despite the regime in Article 17 and the accompanying Guidance: on notice and action, internal complaint and out-of-court dispute settlement, trusted flaggers/notifiers and measures against misuse. But we have also identified a number of grey areas in these overlaps between the DSA and Article 17 CDSM Directive. To avoid legal uncertainty, it would be important to clarify these during the legislative process, thereby mitigating the risks associated with multi-layered enforcement on OCSSPs. This could be achieved, for instance, by stating that Chapter III DSA (Articles 10–37) applies as a horizontal framework mutatis mutandis also to those intermediary services covered by other secondary legislation, to the extent that no more specific rules are laid out. Further precise clarifications could be introduced in the specific grey areas identified in our analysis in order to ensure the applicability of the DSA’s safeguards to OCSSPs, where justified. After all, although we can all agree that copyright is special, it should not be a barrier to setting “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”.Footnote 165 In this respect, although our analysis focuses on the intersection of the DSA with copyright law, as we have noted throughout, our analytical framework could prove useful to further research and to clarifying the overlap of other sector-specific rules on different types of online platforms – such as in the Terrorist Content Regulation and the AVMSD – with the DSA. It could also serve as a reminder and blueprint for future national and EU legislative endeavours in the area of platform regulation to carefully consider their interplay with the DSA.
Acknowledgements
The authors wish to thank Alexander Peukert, Felix Reda, Christoph Schmon, Nuno Sousa e Silva and Jens Schovsbo for their valuable comments. All errors remain ours.
Financial support
This research is part of the reCreating Europe project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 870626. João Pedro Quintais’s research in this article is also part of the VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the Dutch Research Council (grant number: VI.Veni.201R.036).
Competing interests
The authors declare none.