
Digital Services Act (DSA): What Compliance Officers Need to Know

EU's Digital Services Act sets strict rules for online safety, transparency, and accountability. Learn key obligations for compliance officers.
Published on August 15, 2025 · Category: Compliance

Setting New Rules for the Digital World

The digital landscape has long outpaced regulations, leading to growing concerns about illegal content, disinformation, and user safety on online platforms. In response, the European Union introduced the Digital Services Act (DSA), a sweeping law that raises the bar for accountability on the internet. From social media giants to e-commerce sites, organizations of all sizes and industries are now rethinking how they monitor and manage content. 

What is the Digital Services Act?

The Digital Services Act is a landmark EU regulation aimed at creating a safer and more transparent online environment. Enacted in 2022 as part of a broader digital regulatory package alongside the Digital Markets Act, the DSA modernizes the EU’s e-commerce rules after two decades[1]. It establishes new accountability standards for online intermediaries (like internet service providers), hosting services (such as cloud and web hosting), and especially online platforms that host user content. At its core, the DSA seeks to protect users from illegal content and products, increase transparency in how platforms operate, and safeguard fundamental rights in the digital sphere[1].

The DSA was driven by mounting public pressure to address issues like hate speech, counterfeit goods, data misuse, and the spread of disinformation online. High-profile incidents underscored gaps in the old rules, strengthening the call for reform. By introducing clearer responsibilities for digital services, the DSA aims to restore trust and accountability. Importantly, it does so without dismantling the “safe harbor” principle that has long governed the internet: platforms aren’t automatically liable for every user post, as long as they act responsibly when issues are flagged[2]. In essence, the DSA preserves that protection for intermediaries but layers on active duties to curb known harms.

Who Needs to Comply with the DSA?

One key aspect of the DSA is its broad scope. The law applies to virtually all “digital services” that connect consumers to goods, services, or content. This means it doesn’t just target Big Tech social networks; companies across industries may fall under the DSA if they offer online platforms or intermediary services. The DSA defines several categories of service providers:

  • Intermediary services: Basic internet infrastructure like ISPs and domain registries.
  • Hosting services: Entities that store information for users, including cloud storage and web hosting providers.
  • Online platforms: Services that disseminate user content to the public, such as social media, video-sharing sites, online marketplaces, app stores, and sharing economy platforms. Most obligations in the DSA kick in at this level.
  • Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): A special tier for the biggest players with over 45 million EU users (around 10% of the EU population). These have extra stringent rules due to their outsized impact[3].

If your company operates a website or app that allows users to post, share, or trade content, from a global social network to a niche forum or e-commerce site, you likely have responsibilities under the DSA. Even organizations outside the tech sector (for example, a retailer with an online marketplace or a media site that allows user comments) must assess whether the DSA applies. Non-EU companies are not exempt: if you serve EU users, you must comply and appoint an EU-based legal representative for DSA purposes[2]. Compliance isn’t just a concern for Silicon Valley giants, but for any business with an online presence.

Key Obligations Under the DSA

The DSA introduces a range of obligations, scaled according to the size and role of the service provider. Compliance officers should familiarize themselves with these requirements, as they will form the backbone of any DSA compliance program. Some of the cornerstone obligations include:

  • Removal of illegal content: All platforms must act swiftly to remove or disable access to content that is illegal in the EU once they are made aware of it[2]. This covers things like hate speech, terrorist propaganda, child sexual abuse material, fraud scams, or illicit goods. The DSA standardizes notice-and-takedown procedures: when users or authorities flag unlawful content, platforms need effective mechanisms to review and remove it expeditiously (a minimal workflow sketch follows this list).
  • Transparency and reporting: Platforms must be transparent about their content moderation policies and practices. They are required to publish regular transparency reports detailing the number of content removal orders, user notices, complaints, and actions taken[3] (a simple tally sketch appears at the end of this section). Additionally, terms of service must be clear and consistently enforced.
  • User rights and redress: The DSA bolsters users’ rights by mandating easy-to-use complaint systems. If a user's content is taken down or their account is suspended, they must be told why and have the ability to appeal. Platforms need to offer internal complaint handling and also out-of-court dispute resolution options for users to challenge content decisions[2]. This ensures fairness and prevents the over-removal of lawful content.
  • Trusted flaggers: To improve the speed and accuracy of moderation, the DSA introduces “trusted flaggers.” These are vetted organizations or experts with a proven track record of identifying illegal content reliably. Platforms must give reports from trusted flaggers priority in their review queues[2].
  • Advertising transparency: Online platforms must disclose when users are being shown an advertisement and who paid for it. Users should be able to tell at a glance that a post or listing is sponsored. Critically, certain types of targeted advertising are now restricted: for example, platforms can no longer target ads to children or use sensitive personal data (like religion or political beliefs) for ad targeting under the DSA[4]. Compliance teams will need to work closely with marketing to ensure advertising practices meet these new standards.
  • Risk assessment and audits (for VLOPs/VLOSEs): The very largest platforms and search engines have additional duties reflecting their systemic importance. They must perform annual risk assessments to identify how their systems might contribute to the spread of illegal or harmful content, harm fundamental rights, or create risks to public health and society[3]. They must then implement reasonable risk mitigation measures (such as adjusting algorithms or adding more moderators) and submit to independent audits of their DSA compliance each year[3]. These audits, conducted by qualified external firms, scrutinize whether the platform is living up to its obligations.
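
To make the notice-and-action and trusted-flagger duties concrete, here is a minimal Python sketch of a moderation queue. It is purely illustrative: the DSA prescribes outcomes (priority review for trusted-flagger reports, a statement of reasons the user can appeal), not any particular implementation, and every class, field, and value below is hypothetical.

```python
# Illustrative sketch only -- the DSA mandates outcomes, not this design.
import heapq
import itertools
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    content_id: str
    reporter: str
    reason: str                     # e.g. "counterfeit goods", "hate speech"
    trusted_flagger: bool = False   # vetted organizations get priority review
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ModerationQueue:
    """Priority queue: trusted-flagger notices are reviewed first."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker keeps FIFO order

    def submit(self, notice: Notice) -> None:
        priority = 0 if notice.trusted_flagger else 1
        heapq.heappush(self._heap, (priority, next(self._counter), notice))

    def next_for_review(self) -> Notice:
        _, _, notice = heapq.heappop(self._heap)
        return notice

def decide(notice: Notice, is_illegal: bool) -> dict:
    """Record a decision with a statement of reasons, so the user can appeal."""
    return {
        "content_id": notice.content_id,
        "action": "remove" if is_illegal else "no_action",
        "reasons": notice.reason if is_illegal else "content found lawful",
        "appeal_channel": "internal complaint system / out-of-court body",
        "decided_at": datetime.now(timezone.utc).isoformat(),
    }

# Usage: a trusted flagger's report jumps ahead of an earlier ordinary report.
queue = ModerationQueue()
queue.submit(Notice("post-101", "user@example.com", "spam"))
queue.submit(Notice("listing-202", "ngo.example.org", "counterfeit goods", trusted_flagger=True))
print(decide(queue.next_for_review(), is_illegal=True))  # reviews listing-202 first
```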

For most companies, many of these obligations will translate into new internal procedures, documentation, and possibly technical tools. Even smaller online platforms (though exempt from some heavy duties like formal audits) should establish clear processes for content moderation, user notices and appeals, and transparency reporting. The overarching goal is that every actor in the digital ecosystem takes responsibility for a safer online environment.
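
As a small illustration of the reporting duty described above, the sketch below tallies the figures a transparency report typically covers (removal orders, user notices, complaints, actions taken). The event names and output schema are assumptions for illustration, not anything the DSA prescribes.

```python
# Hypothetical tally of transparency-report figures; field names are illustrative.
from collections import Counter

class TransparencyLog:
    def __init__(self):
        self.counts = Counter()

    def record(self, event: str) -> None:
        # event could be "authority_order", "user_notice", "complaint",
        # "removal", "account_suspension", and so on.
        self.counts[event] += 1

    def report(self, period: str) -> dict:
        return {"period": period, **self.counts}

log = TransparencyLog()
for e in ["user_notice", "user_notice", "removal", "complaint"]:
    log.record(e)
print(log.report("2025-H1"))
# {'period': '2025-H1', 'user_notice': 2, 'removal': 1, 'complaint': 1}
```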

Implications for Businesses and Compliance Officers

Implementing the DSA’s requirements is not just a legal box-ticking exercise; it has wide-ranging implications for how businesses operate online. Compliance officers will need to coordinate across departments (IT, legal, security, HR, etc.) to ensure all aspects of the law are addressed. Key considerations include:

  • Policy updates and training: Update terms of service and content policies to align with DSA standards, and make sure they are clearly written. Provide training for employees, especially moderators, customer support, and anyone reviewing user content, on the new rules and how to enforce them consistently and fairly. HR and leadership should also cultivate a culture that prioritizes user safety and legal compliance, so staff understand why these changes matter.
  • Investments in moderation tools and staff: Many businesses will need to invest in better content moderation systems. This could mean hiring additional moderators or deploying improved AI filters to detect illegal material (with human oversight to avoid errors). Smaller platforms that lacked formal notice-and-action systems will now need a user-friendly channel for reporting content and an internal workflow to handle those reports efficiently. 
  • Cross-functional compliance efforts: The DSA’s multifaceted obligations (content takedown, data sharing, advertising rules, etc.) mean siloed efforts won’t suffice. Companies should establish a cross-functional team or task force, including compliance officers, legal counsel, data privacy officers, security leads, and business unit managers, to perform a DSA readiness assessment and oversee implementation. It’s also wise to designate points of contact for regulatory inquiries or user complaints, given the DSA’s emphasis on responsiveness.
  • Impact on business models: Some firms may need to adjust how they operate. For instance, online marketplaces must pay closer attention to vetting third-party sellers and products to keep illegal goods off their platforms. Social networks might tweak their algorithms or features to reduce the spread of harmful content as part of risk mitigation. Advertising-driven platforms that relied heavily on personalized ads will have to adapt to the new targeting limits, such as not profiling minors or using sensitive data (as sketched below)[4]. Business owners should recognize that while these changes can entail costs, non-compliance risks hefty fines and reputational damage, whereas proactively meeting DSA standards can be a trust-building advantage.
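
The targeting limits lend themselves to a simple policy gate in the ad-serving path. Below is a hedged Python sketch; the category list and function interface are hypothetical, and a real system would need legal review of which attributes count as sensitive personal data.

```python
# Illustrative policy gate for the DSA's ad-targeting limits: no profiling-based
# ads for minors, no targeting on sensitive personal data. Names are hypothetical.
SENSITIVE_CATEGORIES = {"religion", "political_beliefs", "health",
                        "sexual_orientation", "ethnicity", "trade_union"}

def may_target_ad(user_is_minor: bool, targeting_attributes: set[str]) -> bool:
    """Return True only if the proposed ad targeting is permissible."""
    if user_is_minor:
        return False  # no profiling-based advertising to minors
    if targeting_attributes & SENSITIVE_CATEGORIES:
        return False  # no ads targeted on sensitive personal data
    return True

assert may_target_ad(False, {"age_band", "interests.cycling"}) is True
assert may_target_ad(True, {"interests.gaming"}) is False
assert may_target_ad(False, {"political_beliefs"}) is False
```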

Enforcement, Penalties, and Timeline

The DSA is now in force, which means regulators have moved from preparation to active enforcement. Understanding the timeline and enforcement mechanisms is crucial:

  • Timeline: The Digital Services Act officially entered into force in November 2022, but its requirements kicked in after a transition period. For most online platforms, the rules started applying from 17 February 2024[1]. The largest platforms were on a faster schedule: in April 2023 the European Commission designated 19 services as “very large” platforms or search engines, and those companies had to comply by 25 August 2023[3]. Now in 2025, we are fully in the enforcement phase: companies should already have their DSA compliance measures in place.
  • Enforcement authorities: Oversight is a shared responsibility under the DSA. Each EU member state has a Digital Services Coordinator, a regulator charged with supervising providers (especially smaller domestic platforms) at the national level. For the very large platforms and search engines operating across Europe, the European Commission itself acts as the primary regulator[3]. Companies can expect both national and EU-level authorities to monitor compliance and investigate complaints.
  • Penalties for non-compliance: The DSA comes with serious penalties. Regulators can impose fines of up to 6% of a company’s global annual turnover for major violations[4]. To put that in perspective, a company with €50 billion in annual turnover could face a fine of up to €3 billion, a deterrent on par with the hefty fines seen under GDPR. In extreme cases, if a platform consistently flouts the rules and refuses to correct issues, authorities could even seek to temporarily suspend its service in the EU. Compliance officers should treat DSA obligations with the same gravity as data protection or financial regulations, given the potential business impact of breaches.
  • Early enforcement and challenges: Although enforcement is just ramping up, we’ve already seen some activity. In one notable case, a European online retailer challenged its designation as a Very Large Online Platform under the DSA, arguing it should not be subject to the extra requirements[5]. This shows that the interpretation of the law is still evolving, and some companies are testing the boundaries.

The bottom line is that the grace period is over. Organizations that took a wait-and-see approach are now at risk, as authorities have new powers and a public mandate to ensure the digital marketplace is safer and more accountable.

Final Thoughts: Embracing a Safer Digital Future

The Digital Services Act represents a major shift in how online businesses are regulated. Adapting to the DSA may require significant effort, but it’s also an opportunity. By embracing the spirit of the DSA, organizations can build trust with their users and stakeholders. Compliance officers and business leaders should approach the DSA not merely as a compliance exercise, but as a framework for better digital citizenship. This means going beyond the minimum: regularly evaluating how your services impact society, staying ahead of emerging risks, and engaging openly with regulators and industry peers to share best practices.

While the DSA is an EU law, its influence is global. It sets a precedent that other regions may follow in holding online services to higher standards. Companies operating internationally would do well to align with these principles early, as robust content governance is becoming a norm. In the end, those who proactively adapt to the DSA will not only avoid penalties but also help shape a safer and more trustworthy digital future for everyone.

FAQ

What is the main goal of the Digital Services Act (DSA)?

The DSA aims to create a safer, more transparent online environment by holding digital service providers accountable for illegal content, protecting user rights, and ensuring fair business practices.

Who must comply with the DSA?

The DSA applies to nearly all digital services connecting consumers to goods, services, or content, including social media, marketplaces, hosting services, and even non-EU companies serving EU users.

What are the key obligations under the DSA?

Key obligations include swift removal of illegal content, transparent moderation policies, user complaint systems, advertising transparency, and additional risk assessment duties for the largest platforms.

How is the DSA enforced?

Enforcement is handled by national Digital Services Coordinators for smaller platforms and by the European Commission for very large platforms. Violations can result in fines up to 6% of global annual turnover.

When did the DSA rules take effect?

The DSA took effect on 17 February 2024 for most platforms, while very large platforms had to comply by 25 August 2023. Enforcement is now fully active.

References

  1. European Commission. The Digital Services Act: Ensuring a Safe and Accountable Online Environment. https://op.europa.eu/en/publication-detail/-/publication/877c2a1b-248b-11ef-a195-01aa75ed71a1
  2. European Commission. Digital Services Act: List of designated very large platforms and search engines. https://ec.europa.eu/commission/presscorner/detail/en/QANDA_23_2081
  3. Wired. How Europe’s New Digital Services Act Will Change the Internet. https://www.wired.com/story/digital-services-act-regulation/