Digital Services Act (DSA) Risk Assessments

It’s been a busy time for European lawmakers in Brussels as a tranche of new digital regulation comes into force. This post focuses on the Digital Services Act, specifically its provisions on risk assessments and how these may play out in practice. But first, a bit of background on recent legislative developments in the EU.

According to the European Commission, the Digital Services Act (DSA) and the Digital Markets Act (DMA) form a single set of rules applying across the whole EU, with two main goals: 1) to create a safer digital space in which the fundamental rights of all users of digital services are protected; and 2) to establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally. Digital services cover a large category of online services, from simple websites to internet infrastructure services and online platforms. The rules in the DSA primarily concern online intermediaries and platforms, for example online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.

The DSA regulates the obligations of digital services that act as intermediaries, including marketplaces, in connecting consumers with goods, services, and content.

The DSA, described by some as the “Constitution of the Internet”, became fully applicable across the EU on 17 February 2024. It is designed to provide greater online safety by ensuring that what is illegal offline is also illegal online.

The DSA is an EU Regulation and therefore has direct effect across all EU Member States. However, Ireland was required to introduce legislation to give effect to the supervision and enforcement provisions of the DSA. The Digital Services Act 2024 (“the 2024 Act”) was signed into law by President Higgins on 11 February 2024.

The DSA sets out a cumulative set of rules, with stricter and more onerous obligations for bigger, more powerful entities: very large online platforms (VLOPs) and very large online search engines (VLOSs), defined as online platforms or search engines with an average number of monthly active recipients of the service in the EU equal to or higher than 45 million.

Among the most interesting provisions of the DSA are the Article 34 risk assessments and Article 35 risk mitigation measures, both directed at VLOPs and VLOSs.

Article 34(1) of the DSA states that: “Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.” Risk assessments under Article 34 shall include the following systemic risks:

(a) “the dissemination of illegal content through their services;
(b) any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;
(c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;
(d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.”

Under Article 35, VLOPs and VLOSs shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Examples of measures which VLOPs and VLOSs may take to mitigate the risks identified in the Article 34 risk assessment are set out in Article 35.

The risk assessments required of VLOPs and VLOSs represent a fantastic opportunity for large platforms to engage in fundamental rights assessments and to assess, either themselves or with the engagement of external consultancy services, the systemic risks stemming from the design of their services. Privacy professionals like ourselves, especially those of us with a background in fundamental rights and public policy, will be very well placed to conduct such assessments.
