Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location; select a single session for a detailed view (with abstracts and downloads, if available).
Session Overview
Session
Platforms & Governments (traditional panel)
Time:
Friday, 01/Nov/2024:
9:00am - 10:30am

Session Chair: Suay Melisa Özkula
Location: Octagon Council Chamber

80 attendees

Presentations

Gauging platform observability under the EU’s Digital Services Act

Charis Papaevangelou, Fabio Votta

University of Amsterdam, The Netherlands

The increasing reliance on automated processes for content moderation, driven by regulatory pressures and the imperatives of advertising-based business models, has raised critical issues, especially regarding their effectiveness. This contribution examines the implications of AI-driven content moderation within the EU's Digital Services Act (DSA) framework, focusing on Very Large Online Platforms (VLOPs). It investigates the extent to which insights on automated content moderation across the EU can be discerned from the DSA’s Statement of Reasons (SoRs) database. By analysing content moderation practices across eight VLOPs over four months, using data from the SoRs submissions and transparency reports, this study aims to assess the relationship between automation and countries, exploring disparities in moderation.

Findings indicate a predominance of EU or EEA-level moderation, with variations in the adoption of automated and manual processes across platforms. The study highlights discrepancies in responsiveness across countries and questions the feasibility of swift content moderation, as often required by policymakers. This underscores the critical role of transparency in fostering platform observability, that is, a dynamic process for understanding the socio-technical affordances of digital platforms in a way that enables accountability.

However, we also identify the limitations and inconsistencies within the SoRs database and transparency reports. This suggests that, while the DSA mandates a degree of transparency, it may not suffice for comprehensive observability or understanding of content governance practices. In conclusion, the study offers insights into the operational, legal, and ethical dimensions of AI-driven content moderation, contributing to discussions on democratic accountability and platform governance.



Platforms on trial: Mapping the Facebook Files/Papers controversy

Matías Valderrama Barragán

Centre for Interdisciplinary Methodologies, University of Warwick, United Kingdom

Big Tech has been involved in several controversies in recent years. Data leaks, whistleblowers, and social experiments have raised alarms about the so-called toxic and unaccountable power of platforms and, more broadly, about the growing crisis of accountability in digital societies (Cooper et al., 2022; Khan, 2018; Marres, 2021). At the intersection of digital sociology, media studies, and STS, this research explores how different actors make and unmake connections between social media platforms and societal harms. The study focuses on mapping a specific platform controversy: the Facebook Files/Papers, a 2021 leak of internal documents from Meta by the former employee Frances Haugen. The disclosures partially exposed what Meta knew about the consequences of its interface designs, data, and algorithms (Hendrix, 2021; Horwitz, 2021). By combining digital and ethnographic methods, I followed the disclosures across different media settings to analyse how they were made public and which actors, issues, and framings gained prominence during the controversy. As I will show, journalists, advocacy groups, and critics promoted a ‘strategic causalism’ to strengthen the connection between Meta platforms and specific social harms. This was countered by the ‘strategic ambiguity’ mobilised by Meta spokespersons to undermine such claims and disperse responsibility onto other actors (e.g. users, malicious actors). The analysis highlights a US-centric theatre of accountability that overlooks crucial issues from the actual disclosures, revealing the asymmetries and displacements that arise when platforms are put on trial.



Unfair play: Digital platforms' abuse of power to influence the Brazilian policy agenda

Rose Marie Santini, Bruno Mattos, Débora Salles, Marcela Canavarro

Netlab UFRJ, Brazil

This study examines the indirect lobbying strategies deployed by major digital platforms to oppose Brazil’s "Fake News Bill", which sought to regulate internet intermediaries in the country. Drawing on primary data collected during the lobbying campaign in April-May 2023, the study analyzes platform ads, search engine recommendations, and direct user communications.

The findings reveal systematic violations of platform policies, including unlabelled political ads and exploitation of exclusive platform affordances unavailable to competitors. Digital platforms leveraged their market dominance and technological tools, aligning with far-right disinformation narratives while mobilizing public opposition under the guise of protecting user interests. These practices highlight the misuse of economic power and self-regulation to resist oversight, emphasizing the need for stricter accountability measures. Future research should investigate the broader influence of such campaigns on public opinion.



Between the Cracks: Blind spots in the EU’s efforts to regulate platform opinion power and digital media concentration

Theresa Josephine Seipp, Natali Helberger, Claes De Vreese, Jef Ausloos

University of Amsterdam, The Netherlands

This paper examines the European Union's regulatory strategies, focusing on the European Media Freedom Act (EMFA), for tackling the challenges arising from the increasing concentration of power and influence exerted by platforms within digital media ecosystems. It scrutinises the pertinent regulations and pinpoints the gaps in effectively mitigating the risks of monopolisation and concentration that contribute to the decline of media pluralism and endanger independent and local journalism. The primary oversight concerns the escalating involvement of platforms as digital infrastructure and AI providers, which amplifies their economic and political power over digital media landscapes. Despite the intentions behind current regulations, such as the EMFA, Digital Services Act (DSA), and Digital Markets Act (DMA), they fall short of fully addressing these threats and of promoting a sustainable and pluralistic digital news industry.



Conference: AoIR2024
Conference Software: ConfTool Pro 2.6.153
© 2001–2025 by Dr. H. Weinreich, Hamburg, Germany