TRUST IN DECONSTRUCTED RECOMMENDER SYSTEMS. CASE STUDY: NEWS RECOMMENDER SYSTEMS
DOI: https://doi.org/10.5210/spir.v2019i0.11006

Keywords: (dis)trust, socio-technical assembly, news recommender systems, algorithms as culture, deconstructed algorithmic imaginary

Abstract
Increasingly, algorithms play an important role in everyday decision-making processes. Recommender systems, specifically, are algorithms that serve to influence end-users’ decision-making (e.g. what to read, whom to befriend, whom to rent to…). However, the companies that develop and produce these systems are not neutral: they have an economic goal and a specific vision of how society should operate. These algorithms should therefore never be trusted blindly.
An algorithm consists of collective human practices and, consequently, of warm human and institutional choices. Algorithms should therefore be perceived as culture. Although many academics have joined the debate to denounce the bias, opaqueness and unfairness often found in these algorithms, little empirical research has treated algorithms as culture within their socio-technical assembly.
To better understand how end-users perceive these algorithmic systems, we strive to understand how they imagine and (dis)trust the different components of the socio-technical assembly. We demystify the processes that end-users imagine to be incorporated in these algorithmic systems, using a deconstructed version of Bucher’s (2017) algorithmic imaginary.
Currently, companies put ever more effort into personalizing news using news recommender systems (NRS). NRS organize, select and aggregate news to influence end-users’ decision-making without a transparent explanation of the process. We therefore focus our study on the end-users of these NRS.
In this qualitative study, we interview 25 end-users of NRS to understand their assumptions about, and (dis)trust in, the different elements of the socio-technical assembly of news recommender systems.