TRUSTED MACHINES? MACHINE LEARNING, MORE-THAN-HUMAN SPEED AND DEAD LABOR IN PLATFORM CAPITALISM
DOI:
https://doi.org/10.5210/spir.v2019i0.11043

Keywords:
automation, labor, inequality, neoliberalism, trust

Abstract
Decision-making machines are today 'trusted' to perform or assist with a rapidly expanding array of tasks; indeed, many contemporary industries could no longer function without them. Nevertheless, this trust in and reliance upon digital automation is far from unproblematic. This paper combines insights drawn from qualitative research with creative industries professionals and approaches derived from software studies and media archaeology to critically interrogate three ways that digital automation is currently employed, along with the accompanying questions of trust each raises. Firstly, digital automation is examined as a way of saving time and/or reducing human labor, such as when programmers use automated build tools or graphical user interfaces. Secondly, automation enables new types of behavior by operating at more-than-human speeds, as exemplified by high-frequency trading algorithms. Finally, the mode of digital automation associated with machine learning attempts both to predict and to influence human behaviors, as epitomized by personalization algorithms within social media and search engines.
While creative machines are increasingly trusted to underpin industries, culture and society, we should at least query the desirability of deepening dependence on these technologies as they are currently employed: these for-profit, corporate-controlled tools performatively reproduce a neoliberal worldview. Discussions of misplaced trust in digital automation frequently conjure an imagined binary opposition between humans and machines; this reductive fantasy, however, conceals the far more concrete conflict between differing technocultural assemblages, each composed of both humans and machines. Across the examples explored in this paper, what emerges are the numerous ways in which creative machines are used to perpetuate social inequalities.