Insights | April 5, 2023
Introduction to the European Union Digital Decade Strategy
In its efforts to implement the ambitious European Union Digital Decade Strategy, the EU is in the process of introducing an avalanche of new Regulations and Directives, which can be roughly divided into five main themes: platforms and digital services, data, personal data, cybersecurity, and artificial intelligence. We provide brief introductions to these themes below.
Other means of achieving the goals of the European Union Digital Decade Strategy by 2030 – i.e., strengthening the digital sovereignty of the EU and setting standards with a clear focus on data, technology, and infrastructure – include a range of investments, policy initiatives, and similar instruments.
As the long-awaited EU Digital Decade Strategy finally seems to be transforming from vision into substance, we consider this an opportune time to summarize the upcoming regulatory framework. This article is Part 1 of a new Insights article series about the new Data and Digitalization Regulation of the European Union.
Platforms and digital services
By introducing a more robust legal framework to govern platforms and digital services, the EU aims to create a safer digital space in which the fundamental rights of all users of digital services are protected and to establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally. The main pieces of new legislation consist of the following:
- Digital Services Act (EU) 2022/2065 (DSA), which seeks to ensure a safe and trustworthy online environment that protects individual consumers from illegal content. In particular, the DSA imposes due diligence and risk management obligations on providers of intermediary services in relation to the information and content that such services disseminate. The scope of the DSA varies depending on whether the regulated company is a so-called “very large online platform”, a “very large online search engine”, or a smaller service with fewer monthly active users.
- Digital Markets Act (EU) 2022/1925 (DMA), which imposes a set of ex ante obligations on very large operators in the digital services industry. So-called ‘gatekeepers’, such as Google or Meta, must adhere to a set of new obligations, as they are considered to provide core platform services and to hold a significant market position in the European Single Market. The DMA aims to mitigate situations in which an abuse of a dominant position may take place and, thus, to ensure a fair regulatory framework for businesses using core platform services provided by the gatekeepers.
- Platform-to-Business Regulation (EU) 2019/1150 (P2B Regulation), which aims to promote fairness and transparency for business users of online intermediation services.
- Digital Copyright Directive (EU) 2019/790, which is intended to ensure a well-functioning marketplace for copyright by regulating certain matters concerning, e.g., content-sharing platforms and text and data mining.
Data
One of the key aspects of the EU Digital Decade Strategy is data. Data has been recognized by the EU as an essential resource for economic growth, competitiveness, and general societal progress. The EU’s aim is to be the leading data-driven society and to create a ‘single market’ for data.
To date, EU legislation pertaining to data has been heavily focused on regulating personal data and its protection. Ownership of non-personal data, as well as the rights to share and use it, has remained a matter of the contractual freedom of companies, and individuals and entities have not had control over, or access to, non-personal data held by companies. The EU has now issued legislation to govern non-personal data, creating a governance framework for both personal and non-personal data as well as substantive rules on the sharing of and access to such data, namely the following:
- Regulation (EU) 2022/868 on European data governance, i.e., the Data Governance Act or the DGA, creates a framework for the governance of data within the EU by promoting the availability of data and creating unified rules for sharing personal and non-personal data.
- Proposal for a Regulation on harmonized rules on fair access to and use of data, i.e., the proposal for a Data Act or the DA, provides new substantive rules and obligations for accessing and sharing both personal and non-personal data generated by connected devices and data-focused services across sectors. For example, manufacturers of Internet-of-Things (or ‘IoT’) products and providers of related services would be obliged to make non-personal data accessible to users (which may be individuals or legal entities) and to share the data with third-party service providers (only where approved by the user) as well as with public bodies in exceptional circumstances.
Personal data
The legal framework for personal data and its protection has already been subject to reforms in the EU during the past decade, culminating in the adoption of the EU General Data Protection Regulation (EU) 2016/679 (GDPR) in 2016. However, the work is still ongoing:
- With respect to the GDPR, on 24 February 2023, the European Commission published a call for evidence regarding harmonizing aspects of the administrative procedure applied by the supervisory authorities in the enforcement of the GDPR. The purpose of the initiative is to tackle the issues that have arisen in practice from cross-border enforcement cases.
- Proposal for a Regulation on privacy and electronic communications, i.e., the proposal for an ePrivacy Regulation, was issued as early as 10 January 2017 but has been stalled in political debate ever since. The ePrivacy Regulation aims to reform and harmonize the rules on digital privacy and electronic communication in the EU, including rules on cookies and direct marketing.
Cybersecurity
By introducing new cybersecurity Regulations and Directives, the EU aims to build resilience to cyber threats, increase operational capacity to prevent, deter, and respond to cyber threats, and promote cooperation in global cyberspace. Furthermore, the EU aims to ensure that providers of so-called critical and essential services take action to raise their level of cybersecurity and implement preventive measures in order to achieve a sufficient level of protection. By creating a safer cyberspace, the EU also aims to increase people’s trust in technology. The main pieces of new legislation consist of the following:
- Directive (EU) 2022/2555 on measures for a high common level of cybersecurity across the Union (NIS II Directive), which will apply to operators providing services related to, e.g., energy, transport, banking, digital infrastructure, and certain digital services, as well as to manufacturers of, e.g., medical devices and electronic products.
- Regulation (EU) 2019/881 on ENISA (the European Union Agency for Cybersecurity) and on information and communications technology cybersecurity certification (Cybersecurity Act).
- Proposal for a Regulation on horizontal cybersecurity requirements for products with digital elements (Cyber Resilience Act).
- Directive (EU) 2022/2557 on the resilience of critical entities (CER Directive).
Artificial intelligence (AI)
The European Union is pursuing a comprehensive approach to regulating artificial intelligence. The proposed AI regulatory package aims to balance the benefits of innovation and economic growth with the potential risks to fundamental rights, ethical and legal principles, and the shared values of the EU.
The package includes a set of technology-neutral rules for the development, deployment, and use of AI technologies, including a ban on certain AI applications, mandatory requirements for transparency, accountability, and human oversight, and a system of conformity assessments and certification. The approach emphasizes ethical considerations and potential risks, involving both an assessment of the potential harm that AI systems could cause and measures to prevent and mitigate such risks. The EU proposes a risk-based approach to regulating AI, with higher-risk applications subject to more stringent requirements. These requirements may include mandatory testing and certification, as well as the use of independent third-party auditors to assess compliance with ethical principles.
The EU has identified high-risk applications of AI systems to include biometric identification and categorization of natural persons (in particular by age, ethnicity, sex, or disability), critical infrastructure (the management and operation of road traffic and the supply of water, gas, heating, and electricity), employment, and certain essential private and public services and benefits.
Further, the framework introduces rules on evidence and causation to facilitate civil claims for damages suffered by end-users of AI-related products and services.
The main pieces of new legislation consist of the following:
- Proposal for a Regulation laying down harmonized rules on artificial intelligence (Artificial Intelligence Act or AI Act).
- Proposal for a Directive on adapting non-contractual civil liability rules to artificial intelligence (AI Liability Directive).
We will dig deeper into these topics in subsequent Roschier Insights articles and invite our network to learn about and discuss these topics at an event later this year (more information to follow).
We look forward to discussing any thoughts, questions, and ideas that arise regarding this new data and digitalization regulation.