Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only sessions held on that date or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
P14: Datafication
Time:
Thursday, 19/Oct/2023:
3:30pm - 5:00pm

Session Chair: Soyun Ahn
Location: Whistler A

Sonesta Hotel

Presentations

HACK YOUR AGE: OLDER ADULTS AS PROVOCATIVE AND SPECULATIVE IOT CO-DESIGNERS

Joe Bourne1, Naomi Jacobs1, Paul Coulton1, Clare Duffy2, Rupert Goodwins2, Tom Macpherson-Pope3

1Lancaster University, United Kingdom; 2Civic Digits, United Kingdom; 3Making Rooms, United Kingdom

This research explores whether the co-design of provocative prototypes with older adults can scaffold critical thought concerning the ethics, trustworthiness, security and privacy of age-oriented Internet of Things (IoT) products and services, and associated data-driven technologies (DDT). By inviting 15 adults defining themselves as ‘experiencing or anticipating old age’ to co-design IoT and DDT that addressed their hopes and fears for the future, we encouraged them to imagine a revolution in ‘technology for aging’. Three workshops used theatre and design research approaches, including speculative design (Dunne, 2013), co-design of provocative prototypes, and social design fiction (Pilling, 2019), to stimulate discussion around imagined futures for aging and technology. Participants modelled the internet as they understood and imagined it. They were then introduced to sensors, actuators and machine learning through interactive demonstrations. Four randomly formed groups ideated ways these technologies could be applied to the hopes and fears for the future of aging identified earlier. Creative technologists then built prototypes of these ideas over two weeks of feedback and iteration with participants. Participants wrote and performed pieces incorporating these prototypes, which explored cybersecurity and cyberharm (Agrafiotis, 2019). Six participants also took part in post-workshop semi-structured interviews. The methods developed in this research scaffolded critical thought concerning the ethics, trustworthiness, security and privacy of age-oriented IoT and associated DDT, regardless of participants' experience or existing knowledge. Participants found it easy to interrogate the ethics, privacy and security of their speculations because, while these may not have been technically scalable or feasible, they understood them.



Defending human rights in the era of datafication

Maria Normark1, Karin Hansson2, Mattias Jacobsson2

1Uppsala University, Sweden; 2Södertörn University, Sweden

In this paper, we explore how activists and human rights defenders deal with datafication. This work demonstrates how data can be a valuable resource in activism and campaign planning. At the same time, data, and the lack of it, also complicates daily life for people in vulnerable positions, for example when contacting government agencies, schools, and medical facilities.

Data from four types of human rights activism formed the basis of our analysis: volunteers and employees of NGOs dealing with refugee and migrant issues, homelessness, poverty, sexual minorities, and women's shelters. The study was conducted in Sweden, where the General Data Protection Regulation (GDPR) limits the handling and storage of personal data.

Five major themes emerged from the analysis of our interview data: Affording personal integrity, Data poverty, Protective data practices, Drawing attention to data, and Systems and data routines.

In addition, this study shows how activists and the organizations they support face contradictory demands around data: on the one hand, deliberately exposing data about marginalized/minoritized groups, while on the other, making sure those groups, along with the activists themselves, are not exposed. Most importantly, data laws and regulations are not adjusted to the needs of the most vulnerable in society, and therefore acts of civil disobedience are necessary to care for vulnerable populations through data.



Affective datafication for you!: The evolution of platforms' repackaging of user data through the ritualised affect and aesthetics of Spotify Wrapped

Tim Highfield

University of Sheffield, United Kingdom

The everyday nature of datafication, where daily experiences, from social media posts to in-store purchases, generate data that can be tracked, measured, and quantified, has led the companies and platforms doing the tracking to find ways of representing this data back to users. One particularly prominent form of repackaged user data is the ‘year in review’, a retrospective account of what the individual did over the calendar year, often presented in ways that obscure the surveillant and capitalist underpinnings of the platform. One of the most prominent of these mass data drops is Spotify’s yearly ‘Wrapped’ feature, in which users’ listening habits are repackaged in an easily shareable form that highlights their top songs and artists of the year. The release of Wrapped is an eagerly awaited moment in the digital media calendar, hyped further by pre-Wrapped notifications within the platform and by promotion through the #SpotifyWrapped hashtag. Drawing on a historical analysis of Wrapped, corporate blog posts and other platform communication, and related press coverage since 2020, the paper traces how Wrapped offers aesthetic and affective representations of datafication, often using time and memory as hooks to remind users of how they engage with the platform. Through the case of Spotify Wrapped, this research seeks new insight into how platforms repurpose user data, repackaged as memories and reviews, and what this says about how platforms work and how they engage with their users.



Exploitation and Platform Power

Daniel Susser

Penn State University, United States of America

Big tech “exploits” us. This has become a common refrain among critics of digital platforms. It gives voice to a shared sense that technology firms are somehow mistreating people—taking advantage of us, extracting from us—in a way that other data-driven harms, such as surveillance and algorithmic bias, fail to capture. But what does “exploitation” entail, exactly, and how do platforms perpetrate it? What would a theory of digital exploitation add to existing discussions about platform governance?

In the first part of this paper, I draw from the philosophical literature to define exploitation and to show that this language is used to level two distinct, but related, allegations: one about “transactional exploitation,” the other about “structural exploitation.” Underneath debates about how to rein in big tech’s power are fundamental questions about the possibility of liberal reform vs the need for deeper structural change. Distinguishing between transactional and structural exploitation can shed light on the places where incremental reforms hold promise and the places where more radical transformation is necessary.

In the second part of the paper, I explore the roles platforms play in facilitating exploitation and show why viewing platform harms through this lens is a helpful guide for governance. I argue platforms can “enhance,” “transform,” or “invent” exploitative social relationships. Where platforms enhance exploitative practices, we may already have useful strategies that can be adapted to the digital context. Where platforms transform existing relationships into exploitative ones or invent new exploitative social relationships, off-the-shelf approaches are less likely to work.


