Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
P55: Youth 1
Time:
Friday, 20/Oct/2023:
3:30pm - 5:00pm

Session Chair: Amanda Lenhart
Location: Hopper Room, Sonesta Hotel

Presentations

We’ll Return After These Messages: A Content Analysis of Advertising in Children’s Podcasts

Kalia Vogelman-Natan, Breniel Lemley, Ellen Wartella

Northwestern University, United States of America

The recent rise in popularity of children’s podcasts (aka ‘kidcasts’) presents a new avenue through which the multibillion-dollar advertising industry can target children. Young children are especially vulnerable to advertising, and exposure to it is associated with a range of negative effects. While scholars and policymakers have addressed the effects of advertising on children in relation to television, research and regulation have not been updated and applied to digital technologies, including podcasts and the streaming services that host them. This study aims to address these gaps by examining the nature and prevalence of advertising in kidcasts. A content analysis was conducted on a sample of 100 kidcasts collected in November 2022. Because of the specific vulnerability of school-aged children to advertising, we chose to focus on podcasts intended for children between the ages of 0-10. Our preliminary findings raise multiple concerns about advertising in kidcasts. The level of exposure to advertisements is worrisome because children may be unable to differentiate advertising from programmatic content, whether due to their age or to the lack of visual aids and cues, and these effects should be examined. Given children’s limited attention spans, the time devoted to advertisements is significant and may cause children to lose interest in the kidcast, affecting their continued listening. Lastly, the fact that kidcasts mostly feature adult-targeted advertisements raises the concern that children may be at risk of exposure to age-inappropriate content.



KIDTECH AND ROBLOX: HOW THE CHILDREN’S ENTERTAINMENT INDUSTRY FRAMES KIDS AND TECHNOLOGY

Maureen Mauk1, Natalie Coulter2, Rebekah Willett1

1University of Wisconsin-Madison, United States of America; 2York University, Canada

The KidTech industry is radically changing the landscape of children’s entertainment as companies like Roblox build converged digital spaces that brands and IPs scramble to join. In this paper we use a critical digital studies lens to understand the affordances and economies of KidTech. Using Roblox as an example, we explore ways that the KidTech industry discursively frames policies, play and content. We employ a political economy analysis primarily based on field notes from the 2023 KidScreen Summit. In this paper we present three main findings:

1) The KidTech industry builds its platforms and policies upon a shifting regulatory landscape and an emphasis on industry self-regulation. Responsibility is offloaded onto parents through the use of parental controls and an assumption that children are not left alone with devices and screen media.

2) Children are framed as using IP content to meet their social and emotional needs, exploring and reaffirming their identity through fan engagement. Companies frame fans’ actions, knowledge and opinions as valuable commodities. Children are potential ‘evangelizers’, with the power to ‘make or break’ a company. Companies position children as having a hypercritical awareness of authenticity.

3) Media content is being redefined as experiences in a convergence of communication and entertainment. Linear media is no longer considered the sole or most reliable way of engaging with children; instead, companies employ complex multimodal strategies to engage young people, in the hope that fans will “evangelize” the IP by creating their own content.



DESIGNING ETHICAL ARTIFICIAL INTELLIGENCE (AI) SYSTEMS WITH MEANINGFUL YOUTH PARTICIPATION: IMPLICATIONS AND CONSIDERATIONS

Kanishk Verma1, Tijana Milosevic2, Brian Davis3, James O'Higgins Norman4

1DCU Anti Bullying Center, ADAPT Center, School of Computing, Dublin City University; 2DCU Anti Bullying Center, ADAPT Center, Institute of Education, Dublin City University; 3ADAPT Center, School of Computing, Dublin City University; 4DCU Anti Bullying Center, Institute of Education, Dublin City University

While artificial intelligence (AI)-enabled systems have shown impressive accuracy in detecting harmful content online, they are still imperfect and do not take children’s perspectives into account in their design. The development of AI systems relies heavily on large training datasets, and creating such datasets involves annotating vast amounts of data. Studies that involve children in dataset development carry their own challenges, such as the possibility of re-traumatisation. Ethical considerations must therefore be addressed, such as obtaining informed consent, conducting design sessions with children and young people, and mitigating implicit and explicit biases in AI filtering, profiling, and surveillance systems. It is crucial to involve children and young people in the design of AI systems that filter content, to ensure these ethical considerations are met. In this article we discuss the ethical concerns of AI development with children and young people, as well as techniques that can help mitigate those concerns.



EXPLORING PARENTS’ KNOWLEDGE OF DARK DESIGN AND ITS IMPACT ON CHILDREN’S DIGITAL WELL-BEING

Claire Bessant1, Laurel Aynne Cook2, L. Lin Ong3, Alexa Fox4, Mariea Grubbs Hoy5, Pingping Gan6, Emma Nottingham7, Beatriz Pereira6, Stacey Steinberg8

1Northumbria University, United Kingdom; 2West Virginia University; 3California State Polytechnic University, Pomona; 4University of Akron; 5The University of Tennessee, Knoxville; 6Iowa State University; 7University of Winchester, United Kingdom; 8Levin College of Law, University of Florida

Dark design (also known as deceptive design, Colin et al., 2018, and dark patterns, Mathur et al., 2019) is evidenced by “a user interface carefully crafted to trick users into doing things they might not otherwise do” (Brignull, 2022, p. 1). Much dark design is constructed with monetization as the primary goal, even in spaces without ecommerce design (e.g., free-to-play apps representing >95% of all mobile apps; Fitton et al., 2021). Many recent dark design strategies are also oriented towards collecting user information. Concerns are growing about children’s vulnerability to inappropriate online marketing and economic fraud, and about the impact of organisational data collection on children’s privacy (European Commission, 2022; OECD, 2011; OFCOM, 2022). Regulators have begun to recognize, challenge, and fine deceptive design practices aimed at children (e.g., the $245 million Epic Games settlement; FTC, 2022); however, the scope and extent of dark design practices are such that regulators alone cannot safeguard children from them. Parents, who are widely understood to be primarily responsible for children’s online experiences, and children themselves, need to be mindful of and resistant to dark design practices in online spaces. With this in mind, this paper explores the following questions:

(a) What is the influence of dark design (1) across mediums (e.g., apps, video games, social media platforms, websites) and (2) across differently-aged children?

(b) To what extent are parents aware of their children’s exposure to dark design and the risks such exposure poses?

(c) How effective are marketplace and regulatory controls?



 
Conference: AoIR 2023
Conference Software: ConfTool Pro 2.6.149
© 2001–2024 by Dr. H. Weinreich, Hamburg, Germany