Digital Services Act Obligations Overview

The Digital Services Act establishes comprehensive requirements across 93 articles. Below are the key obligations extracted directly from the legislative text.

At a glance: 80 Essential Requirements · 6 Operator Obligations · 12 Key Obligations

Key Obligations

1. Comply with harmonized rules for intermediary services across the EU internal market
2. Follow the framework for conditional liability exemptions
3. Implement specific due diligence obligations based on service provider category
4. Cooperate with competent authorities for implementation and enforcement
5. Intermediary service providers must comply with DSA requirements if they offer services to EU users
6. Services must determine if they qualify as intermediary services under the DSA definition
7. Providers must ensure compliance while also adhering to other applicable EU legislation
8. Service providers must understand and correctly classify their services (mere conduit, caching, hosting, online platform, or search engine)
9. Online platforms must distinguish between commercial and non-commercial content/advertisements
10. Providers must establish whether they have a 'substantial connection to the Union' based on user numbers or targeted activities
11. Content moderation activities must align with the official definition provided
12. Terms and conditions must properly reflect the service type classifications

Essential Requirements

Art 1

Subject matter

This regulation creates unified EU rules for online platforms and services to ensure a safe, trustworthy internet environment. It protects users' fundamental rights while enabling innovation and sets clear responsibilities for different types of online service providers.

Key Requirements:

  • Comply with harmonized rules for intermediary services across the EU internal market
  • Follow the framework for conditional liability exemptions
  • Implement specific due diligence obligations based on service provider category
  • Cooperate with competent authorities for implementation and enforcement

Applies to:

All providers of intermediary services operating within the EU internal market, including hosting services, online platforms, and other digital service providers

Art 2

Scope

The Digital Services Act applies to all online intermediary services (like hosting providers, online platforms, and content sharing services) that operate in the EU, regardless of where the company is based. The Act works alongside existing EU laws on copyright, data protection, consumer protection and other areas, but doesn't apply to services that aren't intermediaries.

Key Requirements:

  • Intermediary service providers must comply with DSA requirements if they offer services to EU users
  • Services must determine if they qualify as intermediary services under the DSA definition
  • Providers must ensure compliance while also adhering to other applicable EU legislation

Applies to:

All intermediary service providers offering services to recipients located or established in the EU, regardless of the provider's location
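
To make the scope question concrete, here is a minimal sketch in Python, assuming hypothetical boolean inputs; whether recipient numbers are 'significant' or activities are 'targeted' at Member States is a legal judgment, so the inputs stand in for analysis the code cannot perform.

```python
def in_dsa_scope(is_intermediary_service: bool,
                 established_in_union: bool,
                 significant_union_recipients: bool,
                 targets_member_states: bool) -> bool:
    """The DSA applies to intermediary services offered to recipients in the EU.
    A 'substantial connection to the Union' can arise from establishment,
    a significant number of EU recipients, or activities targeting Member States."""
    if not is_intermediary_service:
        return False  # non-intermediary services fall outside the DSA
    return (established_in_union
            or significant_union_recipients
            or targets_member_states)

# A non-EU provider with a significant EU user base is in scope.
print(in_dsa_scope(True, False, True, False))  # True
```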

Art 3

Definitions

This article provides the official definitions for all key terms used throughout the Digital Services Act, including what counts as an online platform, illegal content, and various types of internet services. These definitions determine which rules apply to different types of digital service providers and what obligations they must follow.

Key Requirements:

  • Service providers must understand and correctly classify their services (mere conduit, caching, hosting, online platform, or search engine)
  • Online platforms must distinguish between commercial and non-commercial content/advertisements
  • Providers must establish whether they have a 'substantial connection to the Union' based on user numbers or targeted activities
  • Content moderation activities must align with the official definition provided
  • Terms and conditions must properly reflect the service type classifications

Applies to:

All digital service providers operating in the EU, including hosting services, online platforms, search engines, and any intermediary services with EU users or establishments
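
To illustrate the classification exercise, below is a minimal sketch assuming a hypothetical classify_service helper and simplified yes/no facts about a service. The category names and their nesting (an online platform is a hosting service that also disseminates to the public) follow the Article 3 definitions, but the decision order is illustrative, not a legal test.

```python
from enum import Enum

class ServiceCategory(Enum):
    MERE_CONDUIT = "mere conduit"        # transmits information or provides network access
    CACHING = "caching"                  # transient storage solely to make transmission more efficient
    HOSTING = "hosting"                  # stores information provided by, and at the request of, a user
    ONLINE_PLATFORM = "online platform"  # hosting service that also disseminates information to the public
    SEARCH_ENGINE = "online search engine"

def classify_service(stores_user_content: bool,
                     disseminates_to_public: bool,
                     storage_is_transient_for_transmission: bool,
                     is_search_engine: bool) -> ServiceCategory:
    """Illustrative decision order only; real classification needs legal analysis."""
    if is_search_engine:
        return ServiceCategory.SEARCH_ENGINE
    if stores_user_content:
        # An online platform is a hosting service that additionally
        # disseminates the stored information to the public.
        return (ServiceCategory.ONLINE_PLATFORM if disseminates_to_public
                else ServiceCategory.HOSTING)
    if storage_is_transient_for_transmission:
        return ServiceCategory.CACHING
    return ServiceCategory.MERE_CONDUIT

print(classify_service(True, True, False, False))  # ServiceCategory.ONLINE_PLATFORM
```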

Art 4

‘Mere conduit’

Article 4 sets out the 'mere conduit' liability exemption: a provider that merely transmits information in a communication network, or provides access to such a network, is not liable for the transmitted information, subject to the cumulative conditions below.

Key Requirements:

  • Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, the service provider shall not be liable for the information transmitted or accessed, on condition that the provider: (a) does not initiate the transmission; (b) does not select the receiver of the transmission; and (c) does not select or modify the information contained in the transmission
  • The acts of transmission and of provision of access referred to in paragraph 1 shall include the automatic, intermediate and transient storage of the information transmitted in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission
  • This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State’s legal system, to require the service provider to terminate or prevent an infringement

Applies to:

General applicability
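
The three conditions in the first requirement are cumulative, which lends itself to a checklist. A minimal sketch follows, assuming a hypothetical Transmission record; a False result means only that the exemption's conditions are not met, not that liability automatically follows.

```python
from dataclasses import dataclass

@dataclass
class Transmission:
    initiated_by_provider: bool          # (a) did the provider initiate the transmission?
    receiver_selected_by_provider: bool  # (b) did the provider select the receiver?
    content_selected_or_modified: bool   # (c) did the provider select or modify the content?

def mere_conduit_conditions_met(t: Transmission) -> bool:
    """All three conditions of Article 4(1)(a)-(c) must hold simultaneously."""
    return not (t.initiated_by_provider
                or t.receiver_selected_by_provider
                or t.content_selected_or_modified)

print(mere_conduit_conditions_met(Transmission(False, False, False)))  # True
```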

Art 5

‘Caching’

This article protects internet services that temporarily store (cache) content to speed up data delivery: they are not liable for the cached content, provided they do not modify it and remove it quickly once notified that the original has been taken down. Think of it like a delivery service temporarily storing packages: it is not responsible for what is inside as long as it follows proper procedures.

Key Requirements:

  • Do not modify the cached information
  • Comply with access conditions set by the content owner
  • Follow industry-standard rules for updating cached information
  • Do not interfere with legitimate technology used to measure content usage
  • Act expeditiously to remove cached content when notified that the original has been removed or when ordered by authorities

Applies to:

Information society service providers that transmit and temporarily cache information in communication networks (e.g., ISPs, CDN providers, proxy servers)
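
As an illustration of the final condition (acting expeditiously on a removal notice), here is a minimal sketch assuming a hypothetical in-memory cache keyed by URL; real CDN invalidation APIs differ.

```python
import time

# Hypothetical in-memory cache: url -> (content, cached_at_epoch_seconds)
cache: dict[str, tuple[bytes, float]] = {}

def on_origin_removal_notice(url: str) -> None:
    """Called when we learn the original content was removed or disabled,
    or an authority has ordered removal: purge the cached copy promptly."""
    removed = cache.pop(url, None)
    if removed is not None:
        print(f"{time.strftime('%Y-%m-%dT%H:%M:%S')}: purged cached copy of {url}")

cache["https://example.com/page"] = (b"<html>...</html>", time.time())
on_origin_removal_notice("https://example.com/page")
```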

Art 6

Hosting

Hosting service providers aren't liable for illegal content stored by their users, as long as they don't know about it and remove it quickly once they become aware. However, this protection doesn't apply if the user is under the provider's control or if an online marketplace misleads consumers about who's selling products.

Key Requirements:

  • Remove or disable access to illegal content expeditiously upon obtaining knowledge or awareness
  • Monitor for actual knowledge of illegal activity or content
  • Distinguish between independent users and those acting under provider authority
  • Online platforms must clearly indicate when products/services are from third-party sellers vs the platform itself
  • Comply with judicial or administrative orders to terminate or prevent infringements

Applies to:

Information society services that provide hosting/storage services, particularly online platforms and marketplaces that store user-generated content or enable third-party transactions
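
A minimal sketch of the knowledge-then-removal sequence, assuming a hypothetical content store and audit log; the Act does not quantify 'expeditiously', so nothing here should be read as a deadline.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class HostedItem:
    content_id: str
    disabled: bool = False
    audit_log: list[str] = field(default_factory=list)

def handle_actual_knowledge(item: HostedItem, source: str) -> None:
    """On obtaining actual knowledge or awareness of illegality,
    remove or disable access and record when and why."""
    now = datetime.now(timezone.utc).isoformat()
    item.disabled = True
    item.audit_log.append(f"{now}: access disabled after notice from {source}")

item = HostedItem("post-123")
handle_actual_knowledge(item, "user notice #456")
print(item.disabled, item.audit_log)
```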

Operator Obligations

Art 8

No general monitoring or active fact-finding obligations

Online platforms and hosting services are not required to actively monitor all content on their systems for illegal activity. They don't have to proactively search for problematic content unless specifically ordered to do so.

  • Providers must not implement general monitoring systems
  • No obligation to actively search for illegal content
  • No requirement to investigate facts or circumstances indicating illegal activity

Art 15

Transparency reporting obligations for providers of intermediary services

Online platforms and intermediary services must publish yearly transparency reports detailing their content moderation activities, including how they handle illegal content removal requests from authorities, user complaints, and automated moderation systems. Small and micro enterprises are exempt unless they are very large online platforms. A minimal sketch of a machine-readable report follows the list below.

  • Publish annual transparency reports in machine-readable format
  • Report the number and type of government orders received for content removal
  • Disclose statistics on user notices and trusted flagger reports
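
To illustrate the machine-readable requirement, here is a minimal sketch that serialises a few Article 15 data points as JSON; the field names and figures are assumptions, not an official schema.

```python
import json

# Field names below are illustrative; the DSA does not prescribe this schema.
report = {
    "reporting_period": {"from": "2024-01-01", "to": "2024-12-31"},
    "authority_orders": {
        "act_against_illegal_content": 12,
        "provide_information": 7,
    },
    "user_notices": {"received": 15230, "from_trusted_flaggers": 410},
    "own_initiative_moderation": {"automated_decisions": 98000},
}
print(json.dumps(report, indent=2))
```
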
Art 24

Transparency reporting obligations for providers of online platforms

Online platforms must publish detailed transparency reports including dispute resolution outcomes, account suspensions, and monthly active user numbers. They must also submit all content moderation decisions to the Commission's public database without personal data. A sketch of the active-user averaging follows the list below.

  • Report dispute resolution statistics including resolution times and implementation rates
  • Report all account suspensions categorized by reason (illegal content, false notices, false complaints)
  • Publish average monthly active users every 6 months using EU methodology
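
The published figure is an average over six months. A minimal sketch of the arithmetic, assuming deduplicated monthly counts are already available; the Commission's counting methodology is not reproduced here.

```python
# Hypothetical deduplicated monthly active recipient counts for six months.
monthly_active = {"2024-01": 41_200_000, "2024-02": 43_900_000,
                  "2024-03": 44_100_000, "2024-04": 45_800_000,
                  "2024-05": 46_300_000, "2024-06": 47_000_000}

average = sum(monthly_active.values()) / len(monthly_active)
print(f"Average monthly active recipients: {average:,.0f}")
# Under Article 33, platforms averaging above 45 million are designated
# very large online platforms.
```
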
Art 42

Transparency reporting obligations

Very large online platforms and search engines must publish detailed transparency reports every six months covering their content moderation practices, including staffing details and risk assessments. These reports must include specific information about human resources dedicated to content moderation for each EU language, accuracy indicators, and monthly user statistics per Member State.

  • Publish transparency reports within 2 months of designation and then every 6 months
  • Include human resources information for content moderation broken down by EU official languages
  • Report qualifications, linguistic expertise, and training of content moderation staff

Conformity & Enforcement

Art 34

Risk assessment

Very large online platforms and search engines must identify and assess risks their services create for society, including illegal content, harm to fundamental rights, election interference, and public safety threats. They must conduct these risk assessments annually and before launching major new features, examining how their algorithms, content moderation, and advertising systems contribute to these risks.
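
A minimal sketch of the two triggers described above (the annual cycle and deployment of functionality likely to have a critical impact), assuming a hypothetical last_assessment date; what counts as 'critical impact' is a judgment the code cannot make.

```python
from datetime import date, timedelta

def risk_assessment_due(last_assessment: date,
                        today: date,
                        deploying_critical_functionality: bool) -> bool:
    """True if a new risk assessment is required: at least annually,
    and before deploying functionality likely to have a critical impact."""
    annual_cycle_elapsed = today - last_assessment >= timedelta(days=365)
    return annual_cycle_elapsed or deploying_critical_functionality

print(risk_assessment_due(date(2024, 1, 15), date(2025, 2, 1), False))  # True
```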

Next Steps

These requirements are extracted from the official legislative text. For detailed implementation guidance, consult the full text of each article.
