Insights | October 2, 2023
Four layers of due diligence in the EU Digital Services Act
The Digital Services Act, an EU regulation, establishes a harmonized and comprehensive regulatory framework for online intermediary services whose users are located in the EU, covering a wide range of businesses such as social media networks, online marketplaces, app stores, cloud and hosting providers, and search engines. In this article, we discuss the structure of the DSA's four-layered approach to the new due diligence obligations.
Background
The European Union’s new regulation on digital services, known as the Digital Services Act (DSA), is a key aspect of the EU’s Digital Decade Strategy. Adopted to address challenges arising from the rapid growth of large intermediary services and the dissemination of illegal content through these services, the DSA introduces a new set of rules for online intermediary services, while seeking to strike a balance between promoting innovation and protecting users’ fundamental rights, such as freedom of speech, in the digital space. The DSA came into force on 16 November 2022, and most of the new rules, harmonized at EU level, will apply as of 17 February 2024.
The due diligence obligations fall into three broad themes: how intermediary services must be designed with respect to user interface, advertising, and recommender systems; how they may engage in content moderation; and what information must be disclosed to users, the European Commission, and national authorities in the interests of transparency. The obligations that must be complied with under the DSA vary depending on the size and technical function of the intermediary service.
The due diligence obligations set forth in the DSA are therefore designed in a cumulative, four-layered approach. The first level of obligations applies to all intermediary services, whether the provider is an internet access provider or one of the largest technology companies.
The most stringent obligations, on the other hand, apply only to very large online platforms and very large search engines. Lastly, non-compliance with the DSA’s obligations may result in fines of up to 6% of a company’s global annual turnover.
Due diligence obligations
All intermediary services – the first level of due diligence
The first level of due diligence applies to all intermediary services, i.e., all services that are involved in some way in the transmission of information in the digital space. The DSA categorizes these services as “mere conduit”, “caching”, and “hosting” services. As a starting point, all intermediary services must designate and make publicly available a single point of contact to ensure that both public authorities and users of the service can communicate with that contact point on matters within the scope of the DSA.
If the intermediary service does not have an establishment in the EU, it must also designate a legal representative in one of the EU Member States where it offers its services. In addition, intermediary services must have clear and user-friendly terms and conditions explaining how they moderate content on their service, including the use of automated technologies, and must take due account of the rights and interests of all stakeholders when enforcing these terms and conditions. Lastly, with the exception of small and micro-enterprises, intermediary services must publish annual transparency reports containing information on their content moderation practices.
Hosting services – the second level of due diligence
The second level of due diligence applies to all hosting services (including online platforms), i.e., all intermediary services that store user information on their service at a user’s request. Hosting services are subject to certain procedural obligations regarding the user interface of their content moderation processes.
In particular, these services must provide a “notice and takedown” mechanism that allows users of the service to report allegedly illegal content to the hosting service. In addition to promptly reviewing the notices and deciding whether or not to remove the content in question, the hosting service must provide the affected user with a statement of reasons explaining the action taken with respect to the content and the grounds for that action.
Online platforms – the third level of due diligence
The third level of due diligence applies to online platforms, which not only store users’ information but also disseminate it to other users (i.e., the recipients of the information) of the service. The most notable examples of such platforms are social media networks.
However, small and micro-enterprises, i.e., companies with fewer than 50 employees and an annual turnover or balance sheet total not exceeding EUR 10 million, do not fall within the scope of this level of due diligence. A large part of the DSA’s due diligence obligations concerns these online platforms.
Online platforms must not only have a notice and takedown mechanism in place, but must also allow users to challenge a decision made by the platform, e.g., concerning the removal of content or the suspension of an account, either through the platform’s internal complaint-handling mechanism or, as a further step, before an independent out-of-court dispute settlement body that can issue a non-binding decision on the content or account in question.
When handling notices of allegedly illegal content, online platforms must give priority to notices received from so-called “trusted flaggers”, i.e., entities certified as submitting accurate notices. Further, online platforms must suspend users who frequently submit manifestly unfounded notices or who frequently disseminate manifestly illegal content. The actions taken with respect to user content described above must be included in the online platform’s annual transparency report.
Online platforms must design their interfaces so that they do not contain “dark patterns”, meaning that platforms must not deceive, manipulate, or otherwise impair users’ ability to make free and informed decisions. Further, advertisements should be clearly marked as such and contain information on the advertiser. Online platforms should also ensure a high level of privacy and security to protect minors on their platform.
In addition, online platforms are prohibited from using personal data of minors or special category data to present targeted advertisements. Users should receive clear information about recommender systems used on the platform and how these systems can be modified.
Lastly, online platforms that qualify as online marketplaces are subject to a “know your business customer” obligation, which means that they must verify the identity of traders who use their platform to conclude distance contracts with consumers.
VLOPs and VLOSEs – the fourth level of due diligence
The fourth and final level of due diligence applies to so-called very large online platforms (VLOPs) and very large online search engines (VLOSEs). To be classified as a VLOP or VLOSE, the service must have an average of at least 45 million monthly active users in the EU, corresponding to roughly 10% of the EU population.
Providers of online platforms and online search engines started to implement their transparency reporting obligations when the DSA came into force, as they were required to publish information on their average monthly active recipients by 17 February 2023 (Article 24(2) and (3)). Further, in April 2023, the European Commission announced its first designation decisions identifying the platforms that qualify as VLOPs and VLOSEs based on their monthly active user numbers. This group includes large technology platforms such as Amazon Store, Facebook, TikTok, YouTube, and Google Search.
VLOPs and VLOSEs must conduct assessments of systemic risks stemming from their services, such as the dissemination of illegal content and serious negative effects on users, and take effective risk mitigation measures.
Risk mitigation measures include appointing a compliance officer, providing data access to supervisory authorities and vetted researchers, producing risk reports, having independent auditors verify their compliance with the DSA, and complying with the European Commission’s crisis response mechanism. The DSA provides that the European Commission is the supervisory authority for VLOPs and VLOSEs, which means that it has broad powers and oversight over these online service providers.
Final remarks
In conclusion, the due diligence obligations presented in this article introduce significant changes to the existing regulatory framework for online intermediary services. For this reason, it is important to prepare for and assess potential business risks and opportunities that arise from these new obligations.
As a first step, companies should determine which layer of due diligence applies to them. On that basis, they can assess how their service-specific operations need to be adjusted.
At the operational level, the preparations identified in that assessment should be addressed in a timely manner, as most of the DSA’s due diligence obligations will start to apply in less than six months, namely from 17 February 2024. This is particularly important because failure to comply with an obligation can result in fines of up to 6% of a company’s global annual turnover.
Finally, the new obligations are not only a threat or a burden; they can also provide a genuine pathway to developing new business models. “DSA compliance” can therefore, for example, increase companies’ competitiveness and strengthen their goodwill.
Roschier continues to follow the development of upcoming EU data legislation closely and to advise clients on compliance and related issues. We are happy to help with any questions you may have regarding the potential impact on your business or the DSA in general.
Article written by Partner Johanna Lilja, Associate Anton Odabasi and Associate Trainee Iris Laitinen.