Overview and details of the sessions of this conference.
Trusted Machines? Machine Learning, More-than-Human Speed and Dead Labor in Platform Capitalism
Massey University, New Zealand
Decision-making machines are today ‘trusted’ to perform or assist with a rapidly expanding array of tasks. Indeed, many contemporary industries could not now function without them. Nevertheless, this trust in and reliance upon digital automation is far from unproblematic. This paper combines insights drawn from qualitative research with creative industries professionals with approaches derived from software studies and media archaeology to critically interrogate three ways that digital automation is currently employed, and the accompanying questions of trust. First, digital automation is examined as a way of saving time and/or reducing human labor, such as when programmers use automated build tools or graphical user interfaces. Second, automation enables new types of behavior by operating at more-than-human speeds, as exemplified by high-frequency trading algorithms. Finally, the mode of digital automation associated with machine learning attempts both to predict and to influence human behaviors, as epitomized by personalization algorithms within social media and search engines.
While creative machines are increasingly trusted to underpin industries, culture and society, we should at least query the desirability of increasing dependence on these technologies as they are currently employed. These for-profit, corporate-controlled tools performatively reproduce a neoliberal worldview. Discussing misplaced trust in digital automation frequently conjures an imagined binary opposition between humans and machines; however, this reductive fantasy conceals the far more concrete conflict between differing technocultural assemblages composed of humans and machines. Across the examples explored in this talk, what emerges are the numerous ways in which creative machines are used to perpetuate social inequalities.
2:20pm - 2:40pm
MAKING THE UNSEEN VISIBLE: EXPLORING CROSSCUTTING SOCIAL MEDIA PUBLICS AND THEIR SOCIOPOLITICAL TRAITS
Jakob Bæk Kristensen
University of Canterbury, New Zealand
This paper proposes an approach for studying the sociopolitical traits of multiple publics on Facebook that emerge in the network of interactions between users and public pages. The study is based on a survey of 1,697 Danish citizens whose responses are coupled with their public Facebook activity. This is used to make predictions about a selection of sociopolitical features for a random sample of 50,000 Facebook users across more than 20,000 public pages. The interactions of the 50,000 users are modeled as a network, and a clustering algorithm is used to find groups that arise naturally within that network. This allows for the study of how certain sociopolitical features cut across different congregations of the public in a way that retains much of the complexity of the digital trace data.
Results show that voting intention overlaps most strongly with the clusters in the network, followed by gender and geo-location. Additionally, they show that so-called political echo chambers consist of only smaller subsections of the entire network, with many users' interactions mainly being identified by interests attributable to gender, geo-location or other factors. However, results also show that the political alt-right is very dominant on hot-button political issues such as immigration and religion.
It is proposed that eliciting sociopolitical trends while considering the full network of interactions may lead researchers to overlook or overestimate fewer features when studying the formation of social media publics.
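The network-and-clustering step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' actual pipeline: it assumes networkx and its built-in modularity-based community detection in place of whatever clustering algorithm the study used, and the user and page names are invented.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy interaction network: edges link users to the public pages
# they interact with (node names are hypothetical).
G = nx.Graph()
interactions = [
    ("user_a", "page_news"), ("user_a", "page_sport"),
    ("user_b", "page_news"), ("user_b", "page_politics"),
    ("user_c", "page_sport"), ("user_d", "page_politics"),
]
G.add_edges_from(interactions)

# Modularity-based clustering to find groups that arise
# naturally within the network.
clusters = greedy_modularity_communities(G)
for i, members in enumerate(clusters):
    print(i, sorted(members))
```

Each cluster groups users together with the pages they predominantly interact with; sociopolitical features predicted from the survey data could then be examined per cluster.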
2:40pm - 3:00pm
COMPLEX ECOLOGIES OF TRUST: FEELINGS ABOUT THE DATA PRACTICES OF A PUBLIC SECTOR BROADCASTER
Robin Elizabeth Steedman1, Helen Kennedy1, Rhianne Jones2
1University of Sheffield, United Kingdom; 2BBC
This paper reports on how trust emerged as an important theme in research into how people feel about the mining of their personal data by the UK's biggest public service broadcaster, as a result of the recent requirement, introduced in 2018, that audiences must sign in to access key digital services. In focus group research, we found that feelings about data practices sometimes related to how much people trust the organisation collecting and analysing the data, rather than to the organisation's data practices themselves. Some participants trusted what the BBC says about how it will use their data, or its intention to protect their data, but not its ability to do so, because of insecurities in the wider data ecosystem. Anxiety about the safety of the Internet in general limited some participants' trust in BBC data practices, regardless of their feelings about the organisation itself. Alternative data management models which give users control over their data were felt to be preferable to existing models. However, participants did not want to be personally in charge of their data and worried about the labour this would entail. Rather, they wanted to be able to trust that their data is safe. In the paper, we describe the diverse feelings we encountered in our research as ‘complex ecologies of trust,’ highlight the importance of the very specific context of our research, and reflect on how our findings disrupt assumptions about how to engender trust and what constitutes ‘good’ data practices.