On December 15, 2020, the European Commission (EC) presented its long-awaited proposal for a Digital Services Act (DSA), together with a proposal for a Digital Markets Act (DMA), which we discussed in a previous alert. Whereas the DMA aims to promote competition by ensuring fair and contestable markets in the digital sector, the DSA proposal intends to harmonize the liability and accountability rules for digital service providers in order to make the online world a safer and more reliable place for all users in the EU.

Most notably, the DSA would impose far-reaching due diligence obligations on online platforms, with the heaviest burdens falling on “very large” online platforms (i.e., those with 45 million or more average monthly active users in the EU), due to the “systemic” risks such platforms are deemed to pose in terms of their potential to spread illegal content or to harm society. At a time when the power of online platforms to control content publication and moderation makes headlines daily, and governments around the globe grapple with different legislative and regulatory proposals, the DSA stands out as an ambitious effort by the EC to create a consistent accountability framework for these platforms, while striking a balance between safeguarding “free speech” and preserving other values and interests in a democratic society. Like the parallel DMA proposal, the DSA proposal has been criticized for targeting mainly U.S.-based companies, which would account for most of the “very large” platforms. Given the huge commercial interests at stake, the passage of both laws will no doubt be the subject of intense debate and lobbying, including with respect to the asymmetric nature of the proposed regulation and the powerful role that the EC reserves for itself in both proposals.

Digital Services Act Proposal: Main Takeaways

  • Aims to promote a safe and trustworthy online environment for all users (business users and consumers);
  • Applies to all so-called “intermediary services” (including “mere conduit,” caching and hosting services) provided to recipients who have their place of residence or business in the EU, regardless of whether the providers of those services are established in the EU;
  • Largely reproduces the exemptions from liability for the transmission and storage of illegal content by intermediary services providers set out in the current e-Commerce Directive (which has applied since 2000);
  • Requires all providers of intermediary services to have a single point of contact for authorities and transparent content moderation policies, and requires non-EU providers to appoint a legal representative in the EU;
  • Sets out cumulative due diligence obligations for providers of intermediary services, hosting services, online platforms, and “very large” online platforms (each of these categories is a subcategory of the one that precedes it and must comply with additional obligations);
  • Obliges all providers of hosting services to establish a mechanism for reporting illegal content and to state the reasons for content moderation decisions;
  • Imposes obligations on all but the smallest of online platforms to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms, to cooperate with “trusted flaggers,” to take measures against abusive notices, to ensure the traceability of business users (“know your business customer”), and to provide user-facing transparency of online advertising;
  • Imposes additional obligations on very large online platforms, i.e., those reaching 45 million or more average monthly active users, due to the “systemic” risks they are deemed to pose in terms of their potential to spread illegal content or harm society, including obligations to manage and mitigate systemic risks, submit to external auditing, provide transparency of their content recommendation and ad targeting mechanisms, and share data with authorities and vetted independent researchers;
  • Entrusts primary responsibility for enforcement to national authorities dubbed “Digital Services Coordinators,” while allowing the European Commission to intervene against non-compliance by very large online platforms, with the power to impose fines of up to 6% of worldwide annual turnover.

Who Will Be Regulated?

The DSA would apply to all so-called “intermediary services” provided to recipients who have their place of residence or business in the EU, regardless of whether the providers of those services are established in the EU.
“Intermediary services” are defined to include “mere conduit,” “caching,” and “hosting” services:

  • “Mere conduit” services are defined as services consisting of the transmission of information in, or the provision of access to, a communication network, and include the services of telecommunications operators and internet access providers.
  • A “caching” service is defined as the transient storage of information for the sole purpose of making more efficient the transmission of that information in a communication network.
  • A “hosting” service is defined as a service consisting of the (non-transient) storage of information provided by, and at the request of, the recipient of that service. This includes web hosting companies, cloud storage providers, and various other online platforms. As further discussed below, the DSA defines a subcategory of “online platforms” within this broader category of hosting services.

Liability Exemptions

As proposed, the DSA builds on, but does not replace, the e-Commerce Directive, which has applied since 2000. More specifically, it largely reproduces the exemptions from liability for the transmission or storage of illegal content set out in Articles 12 to 15 of the Directive (which are to be deleted from the e-Commerce Directive itself if the DSA is adopted). To benefit from the exemptions, caching and hosting providers must act expeditiously to remove or disable access to illegal content as soon as they become aware of its illegality.

The DSA proposal does not require service providers to monitor the information they transmit or store, or to actively investigate suspected illegal activity; providers that nevertheless carry out such voluntary investigations do not thereby lose the benefit of the exemptions. However, the exemption for hosting services will not apply to online platforms that allow consumers to conclude e-commerce transactions with traders where the consumer reasonably believes that the information, product, or service that is the object of the transaction is being offered by the platform itself or by a trader under its control.

Due Diligence Obligations

The DSA proposal sets out due diligence obligations for providers of four categories of online services: intermediary services, hosting services, online platforms, and “very large” online platforms. Each of these categories is a subcategory of the one that precedes it and must comply with additional obligations.

All providers of intermediary services must establish a single point of contact for electronic communication with the EC and the competent authorities of the Member States, and must disclose, in their terms of use, any policies, procedures, measures, and tools used for content moderation, including algorithmic decision-making and human review. Service providers active within the EU, but not established there, must designate a legal representative in one of the Member States in which they offer services, who can then be held liable for non-compliance with the obligations laid down in the DSA. In addition, providers of intermediary services must publish, at least once a year, clear and detailed reports on their content moderation activities for the relevant period.

For providers of hosting services there are two additional obligations:

  • to establish a mechanism through which users can notify them of suspected illegal content, and
  • to inform users when they remove their content and to clearly state the reasons for such removal.

An online platform is defined as a hosting service that, at the request of a user, stores information and disseminates it to the public (e.g., social networks and video-sharing websites). With the exception of “micro and small enterprises” (as defined in the Annex to Recommendation 2003/361/EC), online platforms would be subject to several additional obligations, including:

  • setting up an internal complaint-handling mechanism so users can object to content moderation decisions or to the suspension or termination of their account;
  • allowing users to challenge such decisions before an out-of-court dispute settlement body certified by the Digital Services Coordinator (DSC) of the relevant Member State (more about DSCs below);
  • giving priority to notices of illegal content submitted by trusted flaggers, i.e., entities recognized by the DSC of the Member State where they are established, based on their particular expertise and competence in detecting, identifying, and notifying illegal content, and which are independent of any online platform;
  • taking measures against misuse of the platform, such as suspending the provision of services to users who frequently provide manifestly illegal content, or suspending (for a reasonable period of time and after a prior warning) the processing of notices and complaints submitted by users who frequently submit manifestly unfounded notices or complaints;
  • promptly reporting suspicions of serious criminal offences to the authorities;
  • ensuring the traceability of traders active on their platform (“know your business customer”);
  • complying with additional reporting obligations, by including in their periodic content moderation report information (i) on the number and outcome of disputes referred to out-of-court dispute settlement bodies; (ii) on the number of times services to users have been suspended, whether for misuse of the platform or for submitting unfounded complaints; and (iii) on the use and parameters of any automated content moderation systems;
  • providing transparency in online advertising, by ensuring that each individual user can identify, for each specific advertisement displayed to them, that the information displayed is an advertisement, on whose behalf the advertisement is displayed, and the main parameters used to target it.

Very large online platforms, defined as platforms reaching 45 million or more average monthly active users (corresponding to roughly 10% of the EU-27 population), are subject to further obligations, in particular obligations to:

  • conduct risk assessments on the systemic risks stemming from or relating to the functioning and use of their services;
  • take reasonable and effective measures to mitigate those risks;
  • submit to annual compliance audits;
  • appoint one or more compliance officers;
  • publish transparency reports every six months, with additional reporting obligations (including setting out the results of the risk assessments and describing the related risk mitigation measures);
  • provide access to data to the DSCs, the EC, and vetted independent researchers;
  • if they use “recommender systems” (essentially, algorithms to determine in an automated way the relative prominence of information displayed to users), provide transparency regarding the “main parameters” used by those systems;
  • comply with additional transparency obligations in relation to advertising, by compiling and making publicly available through APIs a repository containing, for each advertisement displayed on the platform and until one year after it was last displayed, information including the content of the advertisement, on whose behalf it was displayed, and, if it was targeted at specific user groups, the main parameters used for that purpose.

Enhanced Enforcement

Enforcement of the DSA would primarily rest with so-called “Digital Services Coordinators” (DSCs), i.e., the national competent authorities that the Member States would be required to appoint for that purpose. However, the proposal also provides for enhanced cooperation and coordination between the Member States, through the establishment of a European Board for Digital Services made up of the Member States’ DSCs. As regards very large online platforms, the EC would have the power to intervene directly. It would be empowered to adopt interim measures, make commitments offered by a provider binding, adopt non-compliance decisions, and impose fines of up to 6% of the provider’s global annual turnover. Private enforcement through national courts would also be possible.

The Road Ahead

The DSA proposal is subject to the ordinary EU legislative procedure, meaning that it will undergo a lengthy negotiation process between the European Parliament and the Council, expected to last at least a year and a half. Once adopted, the DSA would start to apply three months after its entry into force. As an EU regulation, it would be directly applicable across the EU, without the need for national implementing legislation.