Conference Agenda

Session
Digital Policy 01: A New Constellation of EU Regulatory Authorities in the Digital Realm
Time:
Monday, 01/Sept/2025:
9:00am - 10:30am

Session Chair: Evangelia Psychogiopoulou
Discussant: Giuseppe Mazziotti
Session Chair: Federica Casarosa

Presentations

A New Constellation of EU Regulatory Authorities in the Digital Realm

Chair(s): Evangelia Psychogiopoulou (University of the Peloponnese), Federica Casarosa (European University Institute)

Discussant(s): Giuseppe Mazziotti (Universidade Catolica Portuguesa)

This panel is devoted to the study of the new set of regulatory bodies established by a range of recently enacted EU digital policy acts. It examines their nature and mission, and explores their competences, tasks, independence and relationship with national competent authorities. The analysis also delves into the interaction and interrelationship of the bodies under study, potential regulatory overlaps, and collaboration with other stakeholders. The panel sheds light on the Union’s governance framework for the digital realm and probes the challenges deriving from the establishment of a multiplicity of bodies for the implementation and enforcement of the rules enacted.

 

Presentations of the Symposium

 

Regulating Cybersecurity: The Role of Independent Authorities

Federica Casarosa
European University Institute

The last decade has seen the adoption of several legislative acts focused on cybersecurity: from the Cybersecurity Act (Regulation 2019/881) to the NIS 2 Directive (Directive 2022/2555), the Digital Operational Resilience Act (Regulation 2022/2554), the Cyber Resilience Act (Regulation 2024/2847) and the Cyber Solidarity Act (Regulation pending final approval). Their structure and organisation converge, relying strongly on the creation of a set of European and national authorities in charge of tackling the risks and challenges that emerge from the increasing number and impact of cyber threats. However, this structure, on the one hand, partially overlaps with competencies already covered by pre-existing national authorities, such as data protection authorities. On the other hand, it allocates monitoring functions across the pre- and post-market availability of services and goods, increasing complexity and the need for coordination. This contribution maps the roles and functions of the independent administrative authorities whose creation has been triggered by the recent cybersecurity legislation, highlighting the potential gaps and overlaps that emerge when looking at the wider digital market perspective.

 

Intersecting Orbits: The Interplay of the European Media Freedom Act and Digital Services Act in Shaping Media Governance

Elda Brogi, Iva Nenadic, Matteo Trevisan
European University Institute

In 2024 the European Union approved Regulation (EU) 2024/1083 of the European Parliament and of the Council of 11 April 2024 establishing a common framework for media services in the internal market and amending Directive 2010/13/EU, the so-called European Media Freedom Act (EMFA). This is a groundbreaking regulation as it recognises media freedom and pluralism as crucial for democracy and the rule of law within a well-functioning internal market. The Regulation also reforms the public governance of the media sector and establishes a European Board for Media Services (EBMS) to ensure that the EMFA and other relevant Union media law (such as the Audiovisual Media Services Directive) are consistently applied across the EU. The Board is envisaged as an independent body gathering national regulatory authorities or bodies and coordinating their actions. The Article will delve into the nature of the EBMS, its tasks, and the challenges it faces when it comes to its independence and effectiveness. The Article will also focus on the role of the EBMS in the enforcement of the EMFA, exploring the interaction and potential regulatory overlaps of the EMFA with the Digital Services Act (DSA), and of the tasks of the EBMS with those of the Digital Services Coordinators and the European Board for Digital Services, both established by the DSA.

 

EU Values and AI Governance: A European Model in the Making?

Evangelia Psychogiopoulou
University of the Peloponnese

In 2024 the EU adopted Regulation 2024/1689 laying down harmonised rules on artificial intelligence (the Artificial Intelligence Act, AIA). The AIA aspires to foster responsible and trustworthy AI development and use in line with the Union’s values laid down in Article 2 of the Treaty on European Union. Following a risk-based classification approach, it lays down requirements for AI developers and deployers and bans particularly harmful uses of AI. It also establishes an intricate governance model for the implementation and enforcement of the rules enacted. Besides requiring the Member States to designate distinct national authorities to supervise the application of the rules introduced, the AIA creates a European AI Office within the European Commission to oversee implementation, with a focus on general-purpose AI models and systems, AI regulatory sandboxes, joint enforcement and global AI governance. It also provides for a European AI Board to ensure coordination and consistency in implementation, and sets up an Advisory Forum for the provision of technical expertise and a Scientific Panel tasked with offering scientific advice. This paper delves into the governance structure of the AIA. It studies the nature and composition of the bodies involved, and examines their mission, tasks, safeguards for independence and impartiality, their interaction, and avenues for collaboration with stakeholders and a broader set of regulators, agencies, standardisation entities and other bodies at national and supranational levels. In doing so, the paper aspires to shed light on the ability of the AIA governance model to genuinely uphold the Union’s values.

 

The Digital Services Act in the Multi-Actor System of Fundamental Rights Protection

Valentina Golunova
Maastricht University

Online platforms have emerged as the key regulators of speech in the digital domain. Their content moderation practices pose serious threats to freedom of expression. These threats may be exacerbated by public authorities, which increasingly require platforms to act upon both illegal and harmful content. In response to this challenge, the Digital Services Act (DSA) sets out a multidimensional system of fundamental rights protection. Yet difficult questions concerning the allocation of responsibility for protecting freedom of expression persist. This contribution investigates the scope and content of the fundamental rights obligations incumbent on the EU, the Member States, and online platforms under the DSA through the lens of the CJEU’s case law. It finds that the CJEU’s reflections are highly relevant for resolving many of the current issues, such as the extent of the obligations borne by the European Commission and national Digital Services Coordinators (DSCs) to protect freedom of expression against the backdrop of other competing rights. At the same time, it exposes many remaining uncertainties regarding the division of fundamental rights obligations, given the complex interplay between public and private power in the online environment.