Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
Communities & Inequalities - Remote
Time:
Friday, 17/Oct/2025:
4:00pm - 5:30pm

Session Chair: Gabriel Pereira
Location: Room 11a - Ground Floor

Novo IACS (Instituto de Arte e Comunicação Social) São Domingos, Niterói - State of Rio de Janeiro, 24210-200, Brazil

Presentations

X to Bluesky platform migration: Governance and community

Anne Oeldorf-Hirsch

University of Connecticut, United States of America

The results of the 2024 U.S. Presidential election led millions of X (Twitter) users to move to other platforms, following Musk’s support for president-elect Trump. Most users fled to Bluesky, driven by a desire for a community space with greater transparency, stronger moderation policies, and less hate speech. While online platform migrations happen regularly, they are not always this acute or this large. Therefore, this migration presents a unique case study for building online platform migration theory.

This project investigates the X to Bluesky migration through a communication media lens. It is hypothesized (per media system dependency theory) that the social turmoil of the recent U.S. Presidential election should lead to greater dependency on X, which would normally lead to greater platform adoption. However, it is theorized that, given platform communication violations (per interpersonal expectancy violation theory), users instead migrated to other platforms (e.g., Bluesky) in response to a variety of “push” and “pull” factors (per migration theory), moderated by the critical mass of their network.

To test these hypotheses, the present study will report survey results from former X/Twitter users (N = 200) who are now on Bluesky, concerning their reasons for migrating. The project has received funding and is currently under university ethics board review. Data collection and analysis will be complete by the AoIR conference in October 2025.



Patchwork Governance on KidTok: Balancing Regulation and Community Norms

Alex Turvy1, Crystal Abidin2

1Tulane University, United States of America; 2Curtin University, Australia

TikTok's rapid growth among young users has introduced unique challenges to existing frameworks for understanding child internet fame. We identify 'KidTok' as a unique networked public for 'internet famous' young people on TikTok shaped by the platform's sociotechnical environment and explore the novel risks that governance should address. While concerns about privacy, safety, and exploitation persist across platforms, TikTok's algorithm-driven exposure and engagement features, such as 'duets' and 'stitches,' have created a distinctive environment that shapes 'KidTok.' This study examines the novel risks associated with child fame on TikTok and evaluates the governance mechanisms addressing these risks. Using a triangulated methodology, we conduct a policy analysis of legal and platform regulations alongside two ethnographic case studies to explore how community-driven processes like lateral surveillance and peer policing complement formal governance efforts. Framing our analysis within Merton's sociology of deviance, we argue that KidTok operates under a patchwork governance model where legal, platform, and community mechanisms interact to navigate acceptable practices for children. These findings highlight the limitations of traditional regulatory frameworks and emphasize the critical role of community norms. Our study offers a framework for researchers and policymakers to better understand and address the governance needs of child internet fame.



Fragmented Flows: Algorithmic Curation, Organic Sharing, and the Structuring of Telegram’s Fringe Communities

Giovanni Boccia Artieri, Nicola Righetti, Valeria Donato

University of Urbino Carlo Bo, Italy

The fragmentation of the digital public sphere has reshaped information circulation, with algorithmic curation and organic sharing playing distinct roles in shaping online discourse. Telegram, with its minimal moderation and encryption features, has emerged as a key hub for alternative narratives, particularly in Italy, where distrust in traditional media fuels the rise of counter-publics.

This study examines how Telegram’s recommendation system, launched in April 2024, structures content flows compared to organic sharing behaviors, addressing two key questions: (1) Is there structural homology between Telegram’s algorithmic similar-channel network and organic networks formed through forwarding, link-sharing, and domain-sharing? (2) What are the key dynamics of content circulation within Italy’s fringe Telegramsphere?

Using a multiplex network analysis approach, we find weak structural overlap between algorithmic recommendations and organic content-sharing networks. The similar-channel network, shaped by audience overlap, is diffuse, while domain-sharing is clustered around common sources. Forwarding integrates external channels, whereas link-sharing reflects internal community interests.

Despite their differing interests, fringe communities partly rely on shared content sources, including mainstream media, while forwarding and link-sharing highlight specific content preferences. These findings underscore the need for further research into shared sources and the interplay between algorithmic curation and organic dissemination, with broader implications for digital public discourse.



Think better, you dumbass: Online hateful speech as epistemic violence

Esteban Morales1, Jaigris Hodson2, Victoria O'Meara3

1University of Groningen, The Netherlands; 2Royal Roads University, Canada; 3University of Leicester, United Kingdom

Online violence and abuse pose significant challenges to public discourse, as they exacerbate existing power structures and marginalize diverse epistemic perspectives. In this context, this study examines the epistemological consequences of hateful and toxic speech in online news comment sections, conceptualizing it as a form of epistemic violence—an effort to erase particular ways of thinking. Examining a dataset of toxic and hateful comments from The Conversation Canada, our findings emphasize four mechanisms of epistemic violence: insulting, labelling, ridiculing, and dehumanizing. These mechanisms function to delegitimize alternative epistemic positions and reinforce ideological conformity. Furthermore, they disproportionately target those with marginalized identities along racial, gender, and political lines, further entrenching hegemonic power structures. Our research contributes to scholarship on digital epistemologies and platformized violence, highlighting the need for strategies that foster epistemic pluralism rather than simply suppressing toxic discourse.