Fully in Effect Since Feb 17, 2024 - Compliance Required

Digital Services Act (DSA)

If you operate an online platform, marketplace, social media service, hosting service, or search engine serving EU users, the DSA requires transparent content moderation, user protection measures, and, for the largest platforms, systemic risk management.

  • 45M users: VLOP threshold
  • 6% of turnover: maximum fine
  • 1+ years: fully operational

DSA in Plain English

The Digital Services Act is the EU's law for online platforms and digital services. It says if you run a website, app, or service where users can post content or interact, you must moderate illegal content, be transparent about how you operate, and protect users' fundamental rights.

Applies to all online platforms serving EU users
Requires clear content policies and moderation
Users have rights to appeal content decisions
Largest platforms (45M+ users) have extra obligations
Must provide annual transparency reports
Algorithm transparency and user control required

Your 8-Step Path to DSA Compliance

Follow these steps to achieve full compliance. Each step builds on the previous one, creating a comprehensive compliance program.

Step 1 Beginner

1. Determine Your Service Category

Identify which type of digital service you provide and what obligations apply

Key Actions

  • Check if you're an intermediary service (hosting, caching, mere conduit)
  • Determine if you qualify as an online platform
  • Calculate monthly active users in the EU
  • Check if you're a Very Large Online Platform (45M+ EU users)

Available Tools

Service Classifier Tool · User Calculator · Platform Assessment

Real Examples

Social media = online platform · Cloud hosting = intermediary · Search engine with 50M EU users = VLOSE
Timeline: 1-2 weeks
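
To make the user-count check concrete, here is a minimal Python sketch of the 45M designation threshold; the six-month averaging window and the sample user figures are illustrative assumptions, not figures from this guide.

```python
# Illustrative sketch of the VLOP/VLOSE designation check: average monthly
# active EU users over six months, compared with the 45M threshold.
# The six monthly figures below are made up for the example.
VLOP_THRESHOLD = 45_000_000

monthly_active_eu_users = [
    41_200_000, 43_500_000, 44_900_000,
    46_100_000, 47_300_000, 48_000_000,
]

average_mau = sum(monthly_active_eu_users) / len(monthly_active_eu_users)
print(f"Average monthly active EU users: {average_mau:,.0f}")

if average_mau >= VLOP_THRESHOLD:
    print("At or above 45M: expect designation and the extra VLOP/VLOSE obligations.")
else:
    print("Below 45M: the standard platform obligations still apply.")
```
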
Step 2 Intermediate

2. Set Up Notice-and-Action System

Create mechanisms for users to report illegal content and for you to respond

Key Actions

  • Implement easy-to-use reporting mechanisms
  • Create clear content policies and community guidelines
  • Set up processes to review and act on reports
  • Establish appeals and counter-notification procedures

Available Tools

Reporting System Template · Content Policy Builder · Moderation Workflow

Real Examples

Report button on posts · Automated content scanning · Human review process
Timeline: 4-8 weeks
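
To show the moving parts such a system has to track (notices received, review decisions, statements of reasons, appeals), here is a minimal Python sketch; the field names, statuses, and example report are illustrative assumptions, not a prescribed schema.

```python
# Minimal notice-and-action record: notice received, reviewed, decided,
# with a statement of reasons and an open appeal route. Field names,
# statuses, and the example data are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ContentReport:
    content_url: str                # exact location of the reported content
    reporter_contact: str           # who submitted the notice
    reason: str                     # why the notifier considers it illegal
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "received"        # received -> under_review -> decided
    decision: Optional[str] = None  # e.g. "removed", "restricted", "no_action"
    statement_of_reasons: Optional[str] = None
    appeal_open: bool = True        # the affected user can contest the decision


def decide(report: ContentReport, decision: str, reasons: str) -> None:
    """Record the outcome and the statement of reasons sent to the affected user."""
    report.status = "decided"
    report.decision = decision
    report.statement_of_reasons = reasons


report = ContentReport(
    content_url="https://example.com/post/123",
    reporter_contact="notifier@example.com",
    reason="suspected counterfeit product listing",
)
decide(report, "removed", "Listing appears to breach consumer protection law; removed after human review.")
print(report.status, report.decision)
```
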
Step 3 Intermediate

3. Implement Transparency Requirements

Provide clear information about your content moderation and recommendation systems

Key Actions

  • Create transparency reports on content moderation
  • Publish clear terms of service
  • Explain how recommendation algorithms work
  • Document content moderation decisions with statements of reasons

Available Tools

Transparency Report Generator · ToS Template · Algorithm Explainer

Real Examples

Monthly moderation statistics · Algorithm transparency page · Detailed removal notices
Timeline: 3-6 weeks
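
As a small illustration of the reporting side, the Python sketch below aggregates moderation decisions into the kind of counts a transparency report typically includes; the decision records and categories are made up for the example.

```python
# Aggregate moderation decisions into transparency-report style counts.
# The decision records and categories are made up for the example.
from collections import Counter

decisions = [
    {"action": "removed", "ground": "illegal_content", "automated": True},
    {"action": "removed", "ground": "terms_of_service", "automated": False},
    {"action": "restricted", "ground": "illegal_content", "automated": True},
    {"action": "no_action", "ground": "not_applicable", "automated": False},
]

actions_taken = Counter(d["action"] for d in decisions)
automated_share = sum(d["automated"] for d in decisions) / len(decisions)

print("Actions taken:", dict(actions_taken))
print(f"Share of automated decisions: {automated_share:.0%}")
```
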
Step 4 Advanced

4. Conduct Systemic Risk Assessment (VLOPs only)

Identify and assess risks your platform poses to society and fundamental rights

Key Actions

  • Assess risks to fundamental rights and freedoms
  • Evaluate spread of illegal content and disinformation
  • Analyze impact on democratic processes and public security
  • Review effects on protection of minors and mental health

Available Tools

Risk Assessment Framework · Impact Analysis Tool · Rights Assessment

Real Examples

Disinformation spread analysis · Echo chamber assessment · Child safety evaluation
Timeline: 8-12 weeks
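
One lightweight way to structure the output of such an assessment is a risk register that scores each systemic-risk area by likelihood and severity. The Python sketch below shows the idea; the 1-5 scale, the example risk areas, and the mitigation threshold are all assumptions for illustration.

```python
# Toy risk register: score each systemic-risk area by likelihood x severity
# on a 1-5 scale. Areas, scores, and the threshold are illustrative assumptions.
risks = [
    {"area": "spread of illegal content", "likelihood": 4, "severity": 5},
    {"area": "disinformation around elections", "likelihood": 3, "severity": 5},
    {"area": "harm to minors' wellbeing", "likelihood": 3, "severity": 4},
]

MITIGATION_THRESHOLD = 12  # scores at or above this need a documented mitigation plan

for risk in risks:
    score = risk["likelihood"] * risk["severity"]
    status = "mitigation required" if score >= MITIGATION_THRESHOLD else "monitor"
    print(f"{risk['area']}: score {score} -> {status}")
```
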
Step 5 Advanced

5. Implement Risk Mitigation Measures (VLOPs only)

Put in place measures to address identified systemic risks

Key Actions

  • Modify recommendation systems to reduce harmful content amplification
  • Implement stronger content moderation for high-risk content
  • Add safeguards for democratic processes during elections
  • Enhance protections for minors and vulnerable users

Available Tools

Algorithm Adjustment Tools · Child Safety Kit · Election Protection Guide

Real Examples

Reduced amplification of disputed content · Age-appropriate design · Election period restrictions
Timeline: 12-24 weeks
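
As a toy illustration of what "reducing amplification" can mean inside a ranking function, the Python sketch below downweights items flagged as disputed; the scoring model, the flag, and the 0.3 multiplier are invented for the example.

```python
# Toy ranking tweak: downweight items flagged as disputed so they are
# amplified less. Scores, the flag, and the 0.3 multiplier are assumptions.
items = [
    {"id": "a", "engagement_score": 0.92, "flagged_disputed": True},
    {"id": "b", "engagement_score": 0.75, "flagged_disputed": False},
    {"id": "c", "engagement_score": 0.60, "flagged_disputed": False},
]

DISPUTED_WEIGHT = 0.3  # shrink the ranking score of flagged content


def ranking_score(item: dict) -> float:
    weight = DISPUTED_WEIGHT if item["flagged_disputed"] else 1.0
    return item["engagement_score"] * weight


for item in sorted(items, key=ranking_score, reverse=True):
    print(item["id"], round(ranking_score(item), 2))
```
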
Step 6 Advanced

6. Undergo External Auditing (VLOPs only)

Have independent auditors review your risk assessments and mitigation measures

Key Actions

  • Select qualified independent auditors
  • Provide access to necessary data and documentation
  • Address audit findings and recommendations
  • Submit audit reports to the European Commission

Available Tools

Auditor Directory · Audit Preparation Checklist · Compliance Tracker

Real Examples

Annual third-party security audit · Algorithm fairness review · Content moderation effectiveness assessment
Timeline: 6-8 weeks
Step 7 Advanced

7. Provide Data Access to Researchers (VLOPs only)

Grant qualified researchers access to platform data for studying systemic risks

Key Actions

  • Establish data access procedures for qualified researchers
  • Implement privacy-preserving data sharing mechanisms
  • Respond to research data requests within specified timeframes
  • Maintain records of data access and research activities

Available Tools

Researcher Portal · Data Access API · Privacy Protection Tools

Real Examples

Anonymized user interaction data · Content recommendation datasets · Moderation decision logs
Timeline: 4-6 weeks
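
To illustrate one common privacy-preserving technique for such exports, the Python sketch below replaces user identifiers with salted hashes and drops free-text fields before sharing; the field names and salt handling are assumptions, and a real programme needs a full anonymisation and access-control design.

```python
# Pseudonymisation sketch for a researcher export: replace user IDs with
# salted hashes and drop free-text fields. Field names and salt handling
# are assumptions; a real programme needs a full anonymisation design.
import hashlib
import secrets

SALT = secrets.token_hex(16)  # in practice, store and rotate the salt as a protected secret


def pseudonymise(user_id: str) -> str:
    return hashlib.sha256((SALT + user_id).encode()).hexdigest()[:16]


raw_events = [
    {"user_id": "u-1001", "action": "view", "item": "post-55", "comment_text": "..."},
    {"user_id": "u-1002", "action": "share", "item": "post-55", "comment_text": "..."},
]

export = [
    {"user": pseudonymise(e["user_id"]), "action": e["action"], "item": e["item"]}
    for e in raw_events  # comment_text is intentionally left out of the export
]
print(export)
```
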
Step 8 Intermediate

8. Maintain Ongoing Compliance

Continuously monitor and improve your DSA compliance program

Key Actions

  • Monitor content moderation effectiveness
  • Update risk assessments regularly
  • Submit required transparency reports
  • Respond to regulatory requests and investigations

Available Tools

Compliance Dashboard · Monitoring Systems · Reporting Automation

Real Examples

Quarterly transparency reports · Annual risk reassessment · Regulatory correspondence tracking
Timeline: Ongoing
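
As a simple example of ongoing monitoring, the Python sketch below computes the median time taken to handle a notice from a handful of made-up timestamps; the metric and the data are illustrative assumptions.

```python
# Monitoring sketch: median time to handle a notice, from made-up timestamps.
from datetime import datetime
from statistics import median

handled_notices = [
    {"received": datetime(2024, 5, 1, 9, 0), "resolved": datetime(2024, 5, 1, 15, 30)},
    {"received": datetime(2024, 5, 2, 11, 0), "resolved": datetime(2024, 5, 3, 10, 0)},
    {"received": datetime(2024, 5, 3, 8, 0), "resolved": datetime(2024, 5, 3, 9, 45)},
]

hours_to_resolve = [
    (n["resolved"] - n["received"]).total_seconds() / 3600 for n in handled_notices
]
print(f"Median time to resolution: {median(hours_to_resolve):.1f} hours")
```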

What DSA Actually Requires You to Do

Content Moderation

Remove illegal content and provide clear policies

  • Notice-and-action mechanisms
  • Clear community guidelines
  • Statements of reasons for decisions

Transparency Reporting

Publish regular reports on content moderation activities

  • Annual transparency reports
  • Content removal statistics
  • Automated moderation accuracy

Risk Management (VLOPs)

Assess and mitigate systemic risks to society

  • Annual risk assessments
  • Mitigation measures
  • Independent auditing

User Protection

Protect users, especially minors, from harmful content

  • Age verification systems
  • Parental controls
  • Mental health safeguards

Data Access (VLOPs)

Provide data access to researchers and authorities

  • Researcher data sharing
  • Regulatory cooperation
  • Privacy-preserving methods

Algorithm Accountability

Explain how recommendation systems work

  • Algorithm transparency
  • User control options
  • Bias assessment and mitigation

Common DSA Questions

What types of services does DSA cover?

DSA has a tiered approach covering different types of digital services:

  • Intermediary services: Hosting, caching, internet access providers
  • Online platforms: Social media, marketplaces, app stores, travel booking
  • Very Large Online Platforms (VLOPs): Platforms with 45M+ monthly EU users
  • Search engines: Including Very Large Online Search Engines (VLOSEs)

What counts as illegal content under DSA?

The DSA doesn't define illegal content itself; it refers to existing EU and national laws:

  • Content that violates EU criminal law (terrorism, child exploitation)
  • Content that violates national laws (hate speech, defamation)
  • Intellectual property violations (copyright infringement)
  • Consumer protection violations (fraudulent products, false advertising)

What are the main differences between platform tiers?

DSA obligations increase based on platform size and impact:

  • All platforms: Notice-and-action, transparency, illegal content removal
  • Online platforms: Additional user protection measures, appeals processes
  • VLOPs/VLOSEs: Risk assessments, mitigation measures, external audits, researcher access

How does DSA affect recommendation algorithms?

DSA requires transparency and user control over algorithmic systems:

  • Platforms must explain how recommendations work in plain language
  • Users must have options to modify or disable personalized recommendations
  • VLOPs must assess risks from their algorithmic amplification
  • Algorithm design changes may be required to mitigate identified risks
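
As a toy illustration of the user-control point above, the Python sketch below switches a feed between a personalised ranking and a non-profiling chronological order based on a user setting; the data model and scores are invented for the example.

```python
# Toy feed: the user can switch off personalised recommendations and get a
# simple chronological order instead. Data model and scores are assumptions.
posts = [
    {"id": "p1", "published": "2024-05-03", "personal_score": 0.4},
    {"id": "p2", "published": "2024-05-01", "personal_score": 0.9},
    {"id": "p3", "published": "2024-05-02", "personal_score": 0.6},
]


def feed(posts: list, personalised: bool) -> list:
    if personalised:
        ordered = sorted(posts, key=lambda p: p["personal_score"], reverse=True)
    else:  # non-profiling alternative: newest first
        ordered = sorted(posts, key=lambda p: p["published"], reverse=True)
    return [p["id"] for p in ordered]


print(feed(posts, personalised=True))   # ['p2', 'p3', 'p1']
print(feed(posts, personalised=False))  # ['p1', 'p3', 'p2']
```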

What about freedom of expression concerns?

DSA aims to balance content moderation with fundamental rights protection:

  • Platforms must consider fundamental rights in all moderation decisions
  • Users have the right to appeal content moderation decisions
  • Independent dispute resolution mechanisms required
  • Over-removal of legal content should be avoided

How does DSA enforcement work?

DSA enforcement varies based on platform size:

  • National authorities supervise smaller platforms and services
  • European Commission directly supervises VLOPs and VLOSEs
  • Fines up to 6% of global annual turnover for violations
  • Repeated violations can lead to EU market bans

Ready to Get DSA Compliant?

The DSA has been fully applicable since February 17, 2024. If you serve EU users, compliance is mandatory now.

Article 14 · View on EUR-Lex

🤝 Still Feeling Overwhelmed?

EU digital regulations can be complex. Our free tools and guides work great for most people, but if you're dealing with something particularly challenging or have tight deadlines, we're here to help.