Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
PaperSession-21: Platform Lives
Time:
Friday, 04/Oct/2019:
11:00am - 12:30pm

Session Chair: Yi-Ning Chen
Location: P512
(cap. 96)

Presentations
11:00am - 11:20am

A balancing act: Putting up bookshelves on a social media platform

Anne-Mette Bech Albrechtslund

Independent, Denmark

This paper addresses the theme of this year’s conference by focusing on the issue of trust in the uneasy relationship between content-generating users and commercially owned platforms. I present a case study of the use of so-called ‘bookshelves’ on Goodreads, a distinct example of how a social media platform often tests its users’ trust as it seeks ways to capitalize on their engagement and contributions while still preserving the atmosphere of a social network built for and with its users. Bookshelves constitute a key element in making Goodreads a space users can feel ‘at home’ in. In the paper, I show how this becomes clear when considering users’ responses to Goodreads’ policies on bookshelves. I focus specifically on three examples of negotiations of trust that have emerged in relation to the platform’s policies on bookshelves. The study draws on discourse analysis, with a focus on narrative and organizational metaphors, and the aim is to identify central points of ambivalence in users’ attempts to balance their trust in the platform.



11:20am - 11:40am

Sunsets and Memories: How We Bury and Mourn Dead Platforms

Emily van der Nagel

Monash University, Australia

Over social media’s first decade, we as users grew to trust that platforms play a role as memory machines: they enable us to share and store our media traces to look back on later, as we remember and make sense of our lives.

But not all platforms last forever. What happens when social media platforms are, to borrow a business term, sunsetted? This paper investigates how platforms end, and how people remember them after they are gone. I first conducted a thematic analysis of 20 sunset posts: the final declaration of what a platform has been. I discovered that this genre of communication is designed to spark a sense of loss for the platform that was, and trust in the people who are moving on to new projects.

This opened up further questions about whether people remember dead platforms and, if so, how they remember them. Responses to a survey of social media users about a platform they used to use that no longer exists will undergo thematic analysis to identify common themes and patterns in online remembering, nostalgia, archiving, and forgetting.

As social media platforms are a relatively new form of media, this research project aims to gain an understanding of how people shift from platform to platform, and how media traces and platforms are remembered and forgotten.



11:40am - 12:00pm

Trust Under Trial: The Effect of Surge Pricing on Trust toward Ride-Hailing Platforms in Manila

Godofredo Jr Ramizo

University of Oxford, United Kingdom

The nascent literature on platform trust in the Global South does not yet explain how trust in technology platforms persists or erodes in the presence of issues that invite distrust, such as the controversial use of surge pricing by ride-hailing platforms. This paper uses in-depth interviews with 30 users of ride-hailing platforms in Manila to study how attitudes toward surge pricing influence attitudes toward the platforms themselves. The paper finds that despite respondents’ negative attitudes toward surge pricing, amid doubts about its fairness and transparency, trust in ride-hailing platforms persists. This persistence is partly due to users’ rational experience that ride-hailing platforms provide net benefits despite the possibility of unfair surge pricing, especially when faced with the disagreeable alternative of Manila’s poor transport infrastructure. However, the persistence of trust is also due to cognitive biases, as reflected in constructs such as acceptance of limited transparency, perceived control, and the ideational appeal of technological systems. These cognitive biases increase our understanding of why trust in platform technologies may persist even when besieged by distrust. The findings can also increase our vigilance toward the various cognitive biases that can be exploited to create trust on less than meritorious grounds and to hold a firm grip on users’ trust even as users begin to harbour healthy skepticism about fairness and transparency.



12:00pm - 12:20pm

“Everything is a Recommendation”: Netflix and the Construction of Blackness Through Algorithms

Daniel Meyerend

University of Michigan, United States of America

How do algorithms construct our racial identities, and should recommendation systems include race as a variable in order to market to users? In the case of Netflix, spokespersons insist that the only demographic information the company has about users is what they provide in terms of a name, billing information, and location, and claim that any personalization (including thumbnails) is a result of user choices and viewing practices. However, emergent scholarship within algorithm studies shows how user behavior can train an algorithmic system to produce differences by race, and other problematic outputs, even if race was never included in the programming of the system. Using the Netflix debacle as a case study, I further this conversation by exploring how race remains a significant variable in the company’s recommendation algorithms. This work requires an explanation of how recommendation systems function as well as an inside look into Netflix’s algorithm for recommending visual material to its subscribers. Finally, I examine black consumer responses and the subsequent reporting by news outlets that covered this moment. Drawing on Erving Goffman’s understanding of “the presentation of self,” which focuses on how individuals use different signifiers to construct a social self, my analysis shows how Netflix transforms “black users” into “black subjects” in order to profit off of the engagement of black consumers.



 