Digital Services Act: What It Means For Companies
The Digital Services Act aims to create a safe digital space in which the rights of users are protected and companies enjoy legal certainty. In this article, we summarise who is affected and what changes the Act introduces.
What Are the Aims Of the Digital Services Act?
The provisions of the Digital Services Act have been in force since 17 February 2024 and are intended to strike a balance: on the one hand, legal certainty for companies, for example by making the removal of illegal content easier; on the other, the protection of important user rights such as freedom of speech. This is supposed to be achieved by:
- New obligations for providers
- Duty of care with regard to illegal user content
- Liability for illegal products and content
The law also prohibits dark patterns: design elements or processes that companies use to manipulate users, for example by repeatedly and aggressively prompting them to take a certain action, so that they make purchasing decisions they would not have made of their own free will.
Who Is Affected By the Law?
The law applies to all digital services that provide goods, services or content to end consumers in the EU. The place of business is irrelevant; the marketplace principle applies. Affected services include, among others:
- Internet service providers
- Domain name registrars
- Hosting services such as cloud and web hosting services
- Online marketplaces
- App stores
- Social media platforms
The requirements are particularly strict for very large online platforms and search engines, i.e. services that reach an average of at least 45 million active users per month in the EU. Their obligations include risk analysis and risk minimisation, because the risks of illegal content and harm to users are especially high at that scale.
What Obligations Will Companies Face?
If a company becomes aware of illegal content, it must take immediate action. Here is an overview of some of its other obligations:
- Transparency and reporting obligations
- Information requirements
- Specifications for the design of services (legal tech, legal design)
- Complaint mechanisms
- Remedial procedures for illegal content
The strictness of the rules varies depending on the type and size of the service.
Obligations For All Companies
These requirements apply to all affected companies, regardless of the type and size of their service.
- Notice and Action Procedures (Articles 9, 10 and 16)
Those responsible must set up a reporting and remediation procedure for illegal content.
Companies are obliged to remove illegal content quickly and efficiently. If courts or authorities point out such content, they must react and take action (Articles 9 and 10). Hosting services must establish notice and action procedures for reporting this content. When they receive a report, they must follow it up immediately and, if necessary, take the appropriate action (Article 16); a simplified sketch of such a workflow follows below.
What counts as illegal content is not defined exhaustively by the Act itself; it follows from Union law and the national law of the individual member states.
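To make the workflow concrete, here is a minimal sketch of a notice-and-action intake, loosely modelled on Article 16. All names, fields and the dummy assessment rule are hypothetical illustrations, not requirements of the Act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    """A notice of allegedly illegal content; fields loosely follow Article 16(2)."""
    content_url: str              # exact electronic location of the content
    explanation: str              # why the notifier considers the content illegal
    notifier_name: str = ""       # may be empty; some notices can be anonymous
    good_faith_statement: bool = False
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def assess(notice: Notice) -> bool:
    """Placeholder for the provider's actual legal assessment."""
    return "counterfeit" in notice.explanation.lower()  # dummy rule for this sketch

def handle_notice(notice: Notice) -> None:
    # Confirm receipt to the notifier without undue delay (Article 16(4)).
    print(f"Receipt confirmed to {notice.notifier_name or 'anonymous notifier'}")
    # Process the notice in a timely, diligent and non-arbitrary manner (Article 16(6)).
    if assess(notice):
        # Act against the content, e.g. remove it or disable access to it.
        print(f"Access disabled: {notice.content_url}")
        decision = "content removed"
    else:
        decision = "no action taken"
    # Inform the notifier of the decision and available redress (Article 16(5)).
    print(f"Decision communicated: {decision}")

handle_notice(Notice(
    content_url="https://example.com/listing/123",
    explanation="Counterfeit product listing",
    notifier_name="Jane Doe",
    good_faith_statement=True,
))
```

In practice, the assessment step is the hard part: it involves human review, legal judgment and documented reasoning, none of which a few lines of code can capture.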
- Liability (Articles 7 and 8)
Companies are not required to proactively monitor content for legality (Article 8). These liability privileges also remain in place if providers carry out voluntary investigations on their own initiative to ensure compliance with legal requirements (Article 7). Exceptions apply to hosting providers, which lose the privilege once they have actual knowledge of illegal content and fail to act on it.
- Legal Representation (Article 13(3))
Companies that direct their offers to users in the EU but have no establishment there must appoint a legal representative in one of the affected member states. This representative is the point of contact for users and authorities and can also be held liable if the new legal requirements are not met.
The concept of legal representation is not new – it is also used in the General Data Protection Regulation. You can find the most important judgments and risks in our GDPR Fact Check.
- Point of Contact (Articles 11 and 12)
A point of contact must be designated for users, the EU Commission and national authorities. The contact details must be as easy to find as possible.
- Transparency Obligations (Articles 15, 24, 42)
Companies must report annually on official orders, user complaints and content moderation, including the resources deployed for these tasks, error rates and the protective measures taken.
Providers of hosting services must additionally document how many reports users submitted via the available reporting mechanisms, as well as the measures taken in response.
Providers of online platforms have reporting obligations in the following areas (a sketch of the underlying data points follows this list):
- Complaints received through the internal complaints management system
- The number of disputes submitted to out-of-court dispute resolution and their outcomes
- The number of suspensions or closures of user accounts and their reasons
- The number of average monthly active users in the EU
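Purely as an illustration of the data points behind these duties, an online platform's annual reporting record might be structured along the following lines. The structure and field names are hypothetical, not prescribed by the Act.

```python
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    """Hypothetical annual reporting record for an online platform (cf. Articles 15 and 24)."""
    reporting_year: int
    official_orders_received: int      # orders from courts and authorities (Articles 9 and 10)
    user_notices_received: int         # notices submitted via the reporting mechanisms
    internal_complaints_received: int  # complaints via the internal complaints system
    out_of_court_disputes: dict[str, int] = field(default_factory=dict)
    account_suspensions_by_reason: dict[str, int] = field(default_factory=dict)
    avg_monthly_active_users_eu: int = 0  # also the basis for the 45-million threshold

report = TransparencyReport(
    reporting_year=2024,
    official_orders_received=4,
    user_notices_received=1_250,
    internal_complaints_received=310,
    out_of_court_disputes={"resolved": 12, "rejected": 3},
    account_suspensions_by_reason={"illegal content": 27, "abusive notices": 5},
    avg_monthly_active_users_eu=2_400_000,
)
```

Keeping these figures as structured data throughout the year makes the annual report a query rather than a reconstruction.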
- Due Diligence Requirements In the General Terms and Conditions (Article 14)
Users must be informed in the general terms and conditions about the mechanisms used for content moderation, including any algorithmic decision-making. This covers guidelines and procedures as well as the rules for complaints management. When applying and enforcing these mechanisms, companies must respect fundamental rights under European law.
Additional obligations for companies with at least 50 employees or an annual turnover of more than 10 million euros
- Complaints management
- Dispute resolution
- Dealing with abuse of the complaints system
- Design specifications
- Advertising
- Recommendation systems
- Protection of minors
- Information obligation
Who Will Monitor Compliance?
According to the Digital Services Act, compliance with the regulations is monitored at national level by the individual member states, where those affected can also lodge complaints. The EU Commission itself supervises the very large platforms that are subject to the strictest rules and charges them a supervisory fee of up to 0.05 per cent of global annual turnover.
In the event of infringements, fines may be imposed. Depending on the type of breach, they can reach up to 6 per cent of global annual turnover; lesser breaches, such as supplying incorrect information, can attract fines of up to 1 per cent.
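For a rough sense of scale, consider a hypothetical provider with a global annual turnover of 2 billion euros; the figure is invented purely for illustration.

```python
turnover = 2_000_000_000  # hypothetical global annual turnover in euros

supervisory_fee = 0.0005 * turnover  # up to 0.05 per cent, charged by the EU Commission
maximum_fine = 0.06 * turnover       # up to 6 per cent for serious infringements

print(f"Supervisory fee: up to {supervisory_fee:,.0f} EUR")  # up to 1,000,000 EUR
print(f"Maximum fine:    up to {maximum_fine:,.0f} EUR")     # up to 120,000,000 EUR
```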
How Can Legal Certainty Be Achieved In the Digital Space?
Over time, national case law will give the new requirements practical shape. In addition, responsibilities must be allocated appropriately, and companies will have to overcome technical challenges in order to comply with the new rules.
A structured approach is important if the changes are to succeed in the long term. In this way, all parties involved can contribute to the successful implementation of the Digital Services Act and secure both legal certainty and freedom in the digital space.