AoIR2024 Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session: Child Safety (traditional panel)
Time: Friday, 01/Nov/2024, 3:30pm - 5:00pm
Session Chair: Ysabel Gerrard
Location: Discovery Room 1
Attendees: 50

Presentations

Reading Latent Values and Priorities in TikTok's Community Guidelines for Children

Alex Turvy

Tulane University, United States of America

Given the size of its young user base, TikTok faces mounting pressure to protect these users from harm in a tense geopolitical and legal environment. This study takes a mixed-methods approach to TikTok’s Community Guidelines to understand how the platform constructs this document to balance competing priorities around protecting children. I argue that TikTok employs three signature and historically unique approaches within its current guidelines: scaffolding rules by age and risk level, segmenting content into buckets with tailored policies, and siloing certain features and content from children without removing them entirely. This three-part framework is an original contribution to understanding TikTok’s approach and is also useful for analyzing other platforms. Further analysis uncovers latent values of positivity, proactivity, and precision encoded within the guidelines, informed by latent priorities around appeasing external stakeholders, preempting further legal regulation, and fostering a positive public perception among children and their caregivers. Overall, this critical analysis of TikTok’s guidelines demonstrates how policy documents operate as strategic artifacts that diverge from platforms’ technical realities.



The ‘Googlisation’ of the classroom: How does the protection of children’s personal data fare?

Kruakae Pothong1, Louise Hooper2, Sonia Livingstone3, Ayça Atabey4, Emma Day5

1LSE, United Kingdom; 2Garden Court Chambers, United Kingdom; 3LSE, United Kingdom; 4University of Edinburgh, United Kingdom; 5Techlegality, United Kingdom

The use of education technologies (EdTech) in schools expanded rapidly during COVID-19 because of remote learning requirements. Despite the diversity of EdTech products used in schools, only a handful, including Google Classroom, dominate children’s classrooms in the UK. The way these products operate and process data as part of teaching and learning may expose children to data protection risks with immeasurable consequences for their lives and prospects. This paper demonstrates how these data protection risks can manifest, in principle and in practice, when children use Google Classroom for learning. To demonstrate the risks in principle, we conducted a legal analysis of the privacy policies and legal terms that apply to Google Classroom and to other Google services accessible to children within the Google Classroom environment. To demonstrate the risks in practice, we conducted a socio-technical investigation using browser plug-ins, Lightbeam (for Firefox) and Thunderbeam (for Chrome), to capture the data flow throughout each child’s user journey. Through the legal analysis of Google’s privacy policies, we identified various data protection risks, including a lack of transparency and of purpose specification in data processing. We then showed how these risks, identified in principle, became tangible in practice, with scenarios in which each child’s user journey in and through Google Classroom can be exposed to third-party commercial tracking services. Drawing on an example of effective regulatory enforcement, we demonstrate how the risks of commercial exploitation of children’s personal data in education can be tamed.



Resistance to the Parental Panopticon

Mathias Klang1, Nora Madison2

1Fordham University, United States of America; 2Children's Hospital of Philadelphia, United States of America

This paper focuses on the developing panoptic relationship between parents and their teenage or young adult children brought about by the introduction of domestic surveillance systems. In particular, it examines the attitudes of teenagers toward these intensified surveillance regimes and the strategies they employ to adapt to or resist the panoptic gaze. The work brings together the fields of surveillance studies and resistance studies to understand how the concept of domestic surveillance is pushing the boundaries of surveillance culture (Lyon 2018). Additionally, this work contributes to the understanding of everyday resistance within a sophisticated surveillance environment.

Surveillance capitalism has been a significant driver in the development and expansion of data collection from the online to the physical environment. In an attempt to reach into previously unmonitored spaces, companies have marketed a range of data collection products promising convenience and domestic security. The drive of surveillance capitalism to collect ever-increasing amounts of data has created a mass market for multiple forms of surveillance systems. It has trivialized the installation of internal and external cameras (and microphones), normalized tracking via smartphones and tags, and lowered the barriers to installing overt and covert monitoring apps.

These technologies construct a sophisticated digital domestic panopticon that young adults and teenagers must negotiate. Through interviews with young adults, this work explores their perspectives on the increased levels of home surveillance and the resistance strategies they employ to protect their privacy.



Who Has the Power? A Comparative Analysis of Parental Controls on Social Media Platforms

Chelsea Leigh Horne

American University, United States of America

One recent trend, both in social media platform practices and in proposed policy and regulation for online data rights, is a turn towards parental controls. While concerns over children’s safety online and complaints about problematic settings are not necessarily new (Horne, 2021; Horne, 2023), there may be an emerging trend in some countries to address growing concerns about the impact of social media platforms by regulating parental controls. Accordingly, more research is needed on parental controls and age-specific settings for teens and children, as these are comparatively new developments and a platform feature in flux. This paper contributes to this needed research by focusing on specific definitions and understandings of privacy.

The purpose of this study is two-fold. Many social media platforms offer both age-specific privacy settings for children under a certain age and parental controls that let parents manage their children’s accounts. Accordingly, this study will examine which parental controls are available, and how they are offered, across some of the most popular social media platforms: Facebook, Instagram, TikTok, Discord, and YouTube. The dataset will consist of each platform’s (1) default settings for teen and child accounts and (2) parental control options. The analysis will broadly consider how each platform defines parental controls via its privacy options.


