Conference Agenda

Overview and details of the sessions of this conference.

 
 
Session Overview
Session
Privacy 2: Privacy and Surveillance
Time:
Saturday, 05/Nov/2022:
9:00am - 10:30am

Session Chair: Jessa Lingel
Location: CQ-009

150-seat lecture hall

Presentations

Our data bodies. Re-framing our conceptual and normative relationship with personal data

Sara Suárez-Gonzalo, Alejandra López-Gabrielidis

Universitat Oberta de Catalunya, Spain

The rapid development of data-driven technologies has raised manifold concerns about how to protect citizens from the effects of massive data exploitation, especially when “personal data” are at stake, since they are the object of a fundamental right. Personal data are legally defined as any information “relating to” an identified or identifiable natural person, the right holder. However, it is unclear in what particular sense personal data relate to a person: what specific type of direct connection exists between them. In our view, this is a crucial question, both conceptually and normatively, that the academic literature has not yet addressed in sufficient depth. In light of this, our paper aims to advance knowledge towards a more accurate conception of our relationship with personal data. To do so, we first discuss the main recent conceptions of the issue: data as property, data as raw material, and data as labour. Second, we argue why none of them seems accurate from a conceptual point of view or desirable from a socio-political perspective. Finally, we propose to understand personal data as part of oneself, and our relationship with them as the one we have with our bodies or their parts: with our data bodies. We conclude by discussing the implications of the proposed conception, both from a conceptual and from a normative perspective.



Qualitative Evidence on Chilling Effects—How Users’ Imaginaries of Dataveillance Lead to Inhibited Digital Behavior

Kiran Kappeler, Noemi Festic, Michael Latzer

University of Zurich, Switzerland

Our everyday lives have become highly digitized, and all our online actions leave digital data traces that are automatically collected, aggregated, and analyzed. Emerging research shows that this dataveillance can have chilling effects on mundane and legitimate information and communication behaviors. However, the extent to which a sense of dataveillance deters individuals from freely engaging in legitimate digital behaviors remains empirically understudied. We therefore ask: How do internet users imagine dataveillance? And how does this perception lead people to limit their digital behavior?

To address these questions, we conducted qualitative semi-structured interviews with 15 individuals. Our results contribute to a better understanding of people’s imaginaries of dataveillance, the sources of these perceptions, and reactions to them in the form of chilled digital behaviors. We show how imagined sources of dataveillance range from state to corporate actors, and how knowledge about dataveillance can originate in news coverage of data scandals or in people’s own experiences. We also identify further legitimate, mundane behaviors in other life domains, such as online dating and online shopping, that are affected by a sense of dataveillance and have so far been neglected in research on chilling effects.

These results contribute to an empirically founded understanding of the nature of a sense of dataveillance and resulting chilling effects on digital behavior. They provide the basis for a representative survey and a mobile experience sampling study as part of a mixed-methods research design investigating chilling effects on a population level.



Voices from the margins: Privacy discourse in GitHub README files

Keren Levy-Eshkol, Rivka Ribak

University of Haifa, Israel

Most digital products use technologies that enable the collection, analysis, and transfer of a considerable amount of personal information to the cloud, where it is stored and processed. Thus, much of the responsibility for users’ personal information lies in the hands of developers and the code they author. This research adopts a materialist perspective to study developers’ discourse around the privacy solutions they embed in code. Defining GitHub as a discursive platform, we draw on a sample of almost 60,000 README files to analyze the ways in which developers present code to other developers. We find that the files promote two approaches: privacy-by-policy and privacy-by-design. We suggest that the distinction between the two is in fact a distinction between developers who uphold privacy as a value and developers who regard it as an imposition to comply with.



COLONIZERS IN THE NEIGHBORHOOD: A CONTENT ANALYSIS OF NEXTDOOR USERS’ “POSTRACIAL” SURVEILLANCE

Jenny Lee, Chloe Ahn

Annenberg School for Communication, University of Pennsylvania

Today, with the widespread availability and popularity of hyper-local digital platforms like Nextdoor, colonizing the neighborhood is more accessible to the everyday individual than ever before. Our project investigates how individuals use these platforms, in order to better understand how colonial practices are reshaped and perpetuated in the era of the networked smart home. Through a content analysis of user-generated posts in West Philadelphia, we find that users rarely engage with explicitly racist language or topics. Rather than interpret this finding as an indication that Nextdoor users are not racist, or that the platform’s content moderation strategies prevent racism, we argue that users instead rely on postracial practices to simultaneously normalize and obscure anti-Blackness. Drawing on Mukherjee et al.’s (2019) concept of “postrace,” we find that users commonly engage in three practices. First, users increasingly shifted away from explicit racial identifiers toward more general "race-neutral" goals of safety and community. Second, users shied away from problematic but non-explicit racialized language (e.g., "thugs"), opting instead to embed their racializing opinions in broader “policy” discussions. Third, even amid increased attention to the harms of surveillance technologies for people of color, discussions around monitoring remained relatively consistent. These themes reflect postracial racism as yet another iteration of colonization in the neighborhood, entrenching essentializing tactics, perpetuating carceral logics, and developing and perfecting surveillance through the site of Blackness.



 
Conference: AoIR2022