The Digital Services Act: What It Means For Companies

The Digital Services Act aims to create a safe digital space. The goal is to protect the rights of users and ensure legal certainty for companies. In this article, we summarise who is affected and what changes will be introduced.

What Are the Aims Of the Digital Services Act?

The provisions of the Digital Services Act have been in force since 17 February 2024 and are intended to enable a balancing act: on the one hand, legal certainty for companies – for example, the removal of illegal content should be made easier. On the other hand, the law aims to protect important user rights such as freedom of speech. This is supposed to be achieved by a set of new obligations and prohibitions for providers.

The law also prohibits dark patterns. These are design elements or processes that companies use to manipulate users – for example, by repeatedly and aggressively prompting them to take a certain action. As a result, users make purchasing decisions that they would not have made of their own free will.

Who Is Affected By the Law?

The law applies to all digital services that provide goods, services or content to end consumers in the EU. The place of business is irrelevant; the marketplace principle applies. Those covered include, among others, intermediary services such as internet access providers, hosting services such as cloud and web hosting providers, and online platforms such as social networks and online marketplaces.

The requirements are particularly strict for so-called very large online platforms, which reach at least 45 million users per month in the EU. Their obligations include analysing and minimising systemic risks, because the risks of illegal content and harm to users are particularly high on their services.

What Obligations Will Companies Face?

If a company becomes aware of illegal content, it must take immediate action. The following sections give an overview of some of its other obligations.

The strictness of the rules varies depending on the type and size of the service.

Obligations For All Companies

These requirements apply to all affected companies, regardless of their offer and size.

Those responsible must set up a reporting and remediation procedure for illegal content.

Companies are obliged to remove illegal content quickly and efficiently. If courts or authorities order action against such content, they must react and comply (Articles 9 and 10). Hosting services must establish notice-and-action mechanisms through which this content can be reported. When they receive a notice, they must follow it up immediately and, if necessary, take the appropriate action (Article 16).
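
How such a notice-and-action mechanism might look in practice can be sketched in a few lines of code. The following Python sketch is purely illustrative: all names (ContentNotice, acknowledge and so on) are assumptions made for this example, and the DSA itself prescribes only the outcome – electronic submission, confirmation of receipt and a reasoned decision – not any particular implementation.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
import uuid

class Decision(Enum):
    REMOVED = "removed"
    RESTRICTED = "restricted"
    NO_ACTION = "no_action"

@dataclass
class ContentNotice:
    # A notice should identify the content and explain why the
    # reporter considers it illegal (Article 16(2)).
    content_url: str
    explanation: str
    reporter_email: str
    notice_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def acknowledge(notice: ContentNotice) -> str:
    # Confirm receipt to the reporter without undue delay (Article 16(4)).
    return f"Notice {notice.notice_id} received at {notice.received_at:%Y-%m-%d %H:%M} UTC."

def decide(notice: ContentNotice, is_illegal: bool) -> Decision:
    # The legality assessment itself is a human and legal judgment;
    # the flag merely stands in for its outcome.
    return Decision.REMOVED if is_illegal else Decision.NO_ACTION

def notify_reporter(notice: ContentNotice, decision: Decision) -> str:
    # Inform the reporter of the decision and the available redress
    # options (Article 16(5)), simplified here to a single sentence.
    return (f"Your notice {notice.notice_id} was reviewed. "
            f"Outcome: {decision.value}. You can appeal this decision.")

notice = ContentNotice("https://example.com/post/123", "counterfeit goods", "reporter@example.com")
print(acknowledge(notice))
print(notify_reporter(notice, decide(notice, is_illegal=True)))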

The question of whether content is illegal is something the member states will have to answer individually.

Companies are not required to proactively review content for legality; the law imposes no general monitoring obligation (Article 8). Their liability privileges also remain in place if they conduct voluntary investigations on their own initiative to ensure compliance with legal requirements (Article 7). There are exceptions for hosting providers.

Some companies direct their offers to users in the EU but have no establishment there. They must appoint a legal representative in one of the member states in which they offer their services. This representative is the point of contact for users and authorities and can be held liable if the new legal requirements are not met.

The concept of legal representation is not new – it is also used in the General Data Protection Regulation. You can find the most important judgments and risks in our GDPR Fact Check.

A point of contact must be designated for users as well as for the EU Commission and national authorities. The contact details must be as easy to find as possible.

There are annual reporting requirements covering official orders, user complaints and content moderation. The reports must also describe the resources used for these purposes, including error rates and the protective measures taken.

Anyone offering hosting services must also document how many notices users submitted via the available reporting mechanisms, and report on the measures taken in response.
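
To illustrate what such documentation might aggregate, here is a small Python sketch. The metric names and the error-rate formula are assumptions made for this example; the DSA requires reporting on moderation activity and accuracy but does not prescribe a specific calculation.

from collections import Counter

# Hypothetical moderation log entries:
# (source of the notice, action taken, action later reversed on complaint?)
moderation_log = [
    ("user_notice", "removed", False),
    ("user_notice", "no_action", False),
    ("authority_order", "removed", False),
    ("own_initiative", "restricted", True),
]

notices_by_source = Counter(source for source, _, _ in moderation_log)
actions_taken = Counter(action for _, action, _ in moderation_log)
actioned = [entry for entry in moderation_log if entry[1] != "no_action"]
reversed_count = sum(1 for entry in actioned if entry[2])

# Simple error-rate proxy: the share of moderation actions later
# reversed on complaint. The DSA does not prescribe this exact formula.
error_rate = reversed_count / len(actioned) if actioned else 0.0

print(f"Notices by source: {dict(notices_by_source)}")
print(f"Actions taken:     {dict(actions_taken)}")
print(f"Error rate:        {error_rate:.0%}")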

Providers of online platforms have additional reporting obligations, for example regarding disputes referred to out-of-court dispute settlement bodies and suspensions imposed for misuse of the service.

Users must be informed in the general terms and conditions about the content moderation mechanisms and algorithms used. This includes guidelines and procedures as well as rules for complaint handling. When applying and enforcing these mechanisms, companies must respect fundamental rights under European law.

Additional Obligations For Companies With More Than 50 Employees and a Turnover Of At Least 10 Million Euros Per Year

Online platforms above these thresholds face further obligations; micro and small enterprises below them are exempt.

Who Will Monitor Compliance?

According to the Digital Services Act, compliance with the regulations is monitored at national level by the individual member states. Affected parties can lodge a complaint there. The EU Commission itself supervises the particularly strictly regulated large companies. For this supervision, a fee of up to 0.05 per cent of global annual turnover is charged.

In the event of infringements, a fine is due. It depends on the annual global turnover and ranges from 1 to 6 per cent of it, depending on the type of infringement.
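
To put these percentages into perspective, a short calculation sketch follows. The turnover figure is invented for illustration; the actual fee and fine are set by the supervisory authorities within these upper limits.

# Illustrative only: the turnover figure below is a made-up example.
global_annual_turnover_eur = 500_000_000  # hypothetical company

supervisory_fee_cap = 0.0005 * global_annual_turnover_eur  # up to 0.05 per cent
fine_lower_tier = 0.01 * global_annual_turnover_eur        # 1 per cent tier
fine_upper_tier = 0.06 * global_annual_turnover_eur        # up to 6 per cent

print(f"Supervisory fee cap: {supervisory_fee_cap:,.0f} EUR")
print(f"Fine range:          {fine_lower_tier:,.0f} to {fine_upper_tier:,.0f} EUR")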

How Can Legal Certainty Be Achieved In the Digital Space?

National case law will give the new requirements practical shape over time. In addition, responsibilities must be allocated appropriately, and companies will have to master technical challenges in order to comply with the new regulations.

A structured approach is important to ensure that the changes succeed in the long term. In this way, all parties involved can contribute to the successful implementation of the Digital Services Act and safeguard both legal certainty and freedom in the digital space.