European Union: Commission Publishes Legislative Proposal to Regulate Digital Platforms

(Feb. 4, 2021) On December 15, 2020, the European Commission published two legislative proposals to regulate digital platforms: a proposal for a Regulation on a Single Market for Digital Services (Digital Services Act, DSA) and a proposal for a Regulation on Contestable and Fair Markets in the Digital Sector (Digital Markets Act, DMA). If adopted, the DSA would introduce new due diligence obligations for all businesses providing digital services in the European Union (EU); however, the obligations would apply on a sliding scale depending on the type of service provided. The DMA would establish new rules for “digital gatekeepers,” meaning companies with significant market power, as determined by quantitative criteria such as turnover, market capitalization, and number of users.

The proposals complement the Commission’s European Digital Strategy, which sets a vision for the Commission to become a “digitally transformed, user-focused and data-driven administration” by 2022. (European Digital Strategy at 2.)

An EU regulation is directly applicable in the EU member states once it enters into force and does not need to be transposed into national law. (Consolidated Version of the Treaty on the Functioning of the European Union (TFEU) art. 288, para. 2.)

Content of the Digital Services Act (DSA)

The goal of the DSA is to “set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.” (DSA art. 1.) The act sets out due diligence obligations for covered services, which would apply on a sliding scale depending on the type of service, and lays out rules on implementation and enforcement of these obligations. Rules on liability of intermediary services and exemptions from liability are also included; however, they are unchanged from the current legal situation. The DSA would not harmonize the definition of illegal content, but would harmonize the procedures for addressing it. The definition of illegal content would remain subject to member state law.

Covered Services

The DSA lays down rules for intermediary services that provide services to recipients in the EU irrespective of the place of establishment of the provider. (Art. 1, para. 3; arts. 10–13.) Intermediary services are defined as mere conduit services, caching services, or hosting services. (Art. 2(f).) Additional obligations are established for hosting services, online platforms, and very large online platforms. (Arts. 14–33.) An online platform means “a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information … ,” meaning the information is made available to a potentially unlimited number of third parties. (Art. 2(h) & (i).)

Common Obligations for Intermediary Services

All intermediary services would be required to designate a single point of contact and appoint a legal representative in the EU if they are based outside of the EU. (Arts. 10, 11.) Their terms and conditions would need to clearly state any restrictions that might be imposed on users, and they would need to be transparent about how and when illegal content or content that is contrary to their terms and conditions might be removed or disabled. (Arts. 12, 13.)

Additional Obligations for Hosting Services

In addition to the obligations set out in section 1 of the DSA, hosting services would have to fulfill further obligations with regard to the removal of allegedly illegal content. Third parties must have easy access to a user-friendly electronic mechanism where they can alert the provider to allegedly illegal content. Users whose content is removed or disabled must be provided with a statement of the reasons for that decision. (Arts. 14, 15.)

Additional Obligations for Online Platforms

Online platforms would have to comply with obligations beyond those mentioned above. (Arts. 17–24.) They must establish an internal complaint-handling system and engage with certified out-of-court dispute settlement bodies to resolve user disputes. If a “trusted flagger” submits a removal notice, it must be treated with priority. (Art. 19.) Furthermore, the DSA enumerates measures against misuse of the online platform that must be adopted, such as temporarily suspending user accounts that frequently post manifestly illegal content. The facts and circumstances for making such determinations and the duration of the account suspension must be clearly delineated in the terms and conditions. (Art. 20.) Online platforms that qualify as micro or small enterprises would be excluded from the additional obligations. (Art. 16; Recommendation 2003/361/EC, Annex.)

Additional Obligations for Very Large Online Platforms

Lastly, very large online platforms would be obligated to adhere to additional rules to mitigate the systemic risk stemming from the dissemination of illegal content through their services; from any negative effects on the exercise of certain fundamental rights, such as respect for private and family life; and from intentional manipulation of their service. In particular, they would need to conduct risk assessments, take reasonable and effective measures to mitigate those risks, perform external and independent audits, and appoint a compliance officer, among other measures. Special rules are in place for the use of recommender systems and online advertising to better inform users why certain content is shown. (DSA arts. 26–33.) Very large online platforms are defined as online platforms that provide their services to 45 million or more average monthly active users in the EU. (Art. 25, para. 1.)

The enforcement section of the DSA also contains specific requirements for enhanced supervision of very large online platforms in the event they violate the obligations set out above. In addition, the European Commission would be authorized to intervene in case the infringement persists. The Commission would have numerous means of compelling the platform’s compliance, including noncompliance decisions, fines, and periodic penalty payments. (Arts. 50–66.)

Content of the Digital Markets Act (DMA)

The DMA would be more limited in scope than the DSA and would apply only to digital “gatekeepers.”

Definition of Gatekeeper

A gatekeeper must fulfill the following quantitative criteria:

  • It has a significant impact on the internal market, meaning an annual European Economic Area (EEA) turnover equal to or exceeding 6.5 billion euros (about US$7.9 billion) in each of the last three financial years or a market capitalization of at least 65 billion euros (about US$79.1 billion) in the last financial year, together with a presence in at least three EU member states;
  • It operates a core platform service, such as a search engine, social networking service, video-sharing platform service, or cloud computing service, that serves as an important gateway for business users to reach end users, meaning the service must have more than 45 million monthly active end users in the EU and more than 10,000 yearly active business users; and
  • It enjoys an entrenched and durable position in its operations, or foreseeably will enjoy such a position in the near future, meaning it has had the abovementioned number of users in each of the last three financial years. (Art. 3, paras. 1, 2.)
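Because the three criteria above are purely quantitative, the threshold test can be sketched in code. This is only an illustrative sketch: the thresholds are taken from the article, but the data structure and function names are hypothetical, and actual designation would involve notification to the Commission and case-by-case assessment.

```python
# Illustrative sketch of the DMA's quantitative gatekeeper test (art. 3).
# Field and function names are hypothetical, not part of the proposal.
from dataclasses import dataclass


@dataclass
class Company:
    eea_turnover_eur: list        # annual EEA turnover, last three financial years
    market_cap_eur: float         # market capitalization, last financial year
    member_states: int            # EU member states where a core platform service is offered
    monthly_end_users: list       # monthly active EU end users, last three financial years
    yearly_business_users: list   # yearly active business users, last three financial years


def meets_gatekeeper_thresholds(c: Company) -> bool:
    # (a) Significant impact on the internal market: turnover of at least
    # EUR 6.5 billion in each of the last three financial years, or a market
    # capitalization of at least EUR 65 billion, together with a presence
    # in at least three member states.
    significant_impact = (
        all(t >= 6.5e9 for t in c.eea_turnover_eur) or c.market_cap_eur >= 65e9
    ) and c.member_states >= 3

    # (b) Important gateway: more than 45 million monthly active end users
    # and more than 10,000 yearly active business users in the EU.
    gateway = (
        c.monthly_end_users[-1] > 45e6 and c.yearly_business_users[-1] > 10_000
    )

    # (c) Entrenched and durable position: the user thresholds in (b)
    # were met in each of the last three financial years.
    entrenched = all(u > 45e6 for u in c.monthly_end_users) and all(
        b > 10_000 for b in c.yearly_business_users
    )

    return significant_impact and gateway and entrenched
```

Note that the three prongs are cumulative (joined by “and”), while the turnover and market-capitalization figures within the first prong are alternatives.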

Companies that fulfill these criteria would be obligated to notify the European Commission within three months after the criteria are satisfied. (Art. 3, para. 3.) Furthermore, the Commission would be able to conduct market investigations to identify gatekeepers or “emerging” gatekeepers on a case-by-case basis. (Art. 3, para. 6; arts. 15, 17.) The Commission would also be able to adopt delegated acts to update the obligations for gatekeepers. (Art. 10.)

Obligations for Gatekeepers

Companies designated as gatekeepers would need to fulfill certain obligations with regard to the use of data, interoperability, and self-preferencing. In particular, they would be prohibited from combining personal data from one core platform service with personal data from another service they offer or from third-party services. Furthermore, users would be allowed to uninstall any preinstalled software or app. Third parties would be allowed to access and interoperate with the gatekeeper’s own services. Gatekeepers would also be prohibited from treating their own services and products more favorably in rankings and be obligated to apply fair and nondiscriminatory conditions to such rankings. The DMA would also require providing “advertisers and publishers, upon their request and free of charge, with access to the performance measuring tools of the gatekeeper and the information necessary for advertisers and publishers to carry out their own independent verification of the ad inventory.” (Arts. 5, 6.)

Penalties for Noncompliance

Gatekeepers that do not comply with the main obligations set out in the DMA would be subject to fines not exceeding 10% of their total turnover in the preceding financial year. Noncompliance with ancillary obligations would be subject to fines not exceeding 1% of their total annual turnover. Furthermore, the Commission could assess periodic penalty payments of up to 5% of the average daily turnover in the preceding financial year for each day of continued noncompliance. (Arts. 26, 27.) Such decisions of the Commission would be made public. (Art. 34.)
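The penalty ceilings above are simple percentages, which the following sketch illustrates. The percentages come from the article; the function names and the example turnover figure are hypothetical, and actual fines would be set by Commission decision, not by formula.

```python
# Illustrative arithmetic for the DMA's penalty ceilings (arts. 26, 27).
# Function names and inputs are hypothetical illustrations.

def max_fine_main(total_turnover_eur: float) -> float:
    """Ceiling for noncompliance with main obligations:
    10% of total turnover in the preceding financial year."""
    return 0.10 * total_turnover_eur


def max_fine_ancillary(total_turnover_eur: float) -> float:
    """Ceiling for noncompliance with ancillary obligations:
    1% of total annual turnover."""
    return 0.01 * total_turnover_eur


def max_periodic_penalty(total_turnover_eur: float, days: int) -> float:
    """Periodic penalty payments: up to 5% of the average daily turnover
    in the preceding financial year per day of continued noncompliance."""
    average_daily_turnover = total_turnover_eur / 365
    return 0.05 * average_daily_turnover * days
```

For a hypothetical gatekeeper with an annual turnover of 20 billion euros, the ceiling for a main-obligation fine would be 2 billion euros, and an ancillary-obligation fine would be capped at 200 million euros.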
