THE GREAT RANDOMIZER: USING VIRTUAL AGENTS FOR AUDITING THE EFFECTS OF YOUTUBE RECOMMENDATION ALGORITHM ON IDEOLOGICALLY-CHARGED NEWS CONTENT DISTRIBUTION
In this paper, we examine the effects of the YouTube recommendation algorithm on the distribution of ideologically-charged news content. For this purpose, we develop a research infrastructure and conduct a series of experiments using virtual agents (n=200) in a fully controlled environment. Specifically, we look at YouTube recommendations for videos related to the 2019 far-right terrorist attack in the German city of Halle and examine how these recommendations differ depending on the type and political affiliation of the videos watched by the agents. We find that YouTube recommendations are highly randomized, leading to fundamentally different recommendation trajectories even when agent activity is identical and synchronized to isolate the effect of time. We also find significant discrepancies between recommendations generated in different browsers, with recommendations in Firefox being slightly less randomized than those in Chrome. Finally, our observations suggest that recommendations for agents starting with right-leaning news videos are marginally more consistent than those for agents starting with mainstream or left-leaning videos.
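To illustrate the kind of consistency measure underlying claims such as "highly randomized" versus "marginally more consistent," the sketch below computes the mean pairwise Jaccard overlap between the recommendation lists collected by identically behaving agents. This is a minimal illustrative example, not the paper's actual metric; the agent lists and video IDs are hypothetical.

```python
from itertools import combinations

def jaccard(a, b):
    """Jaccard similarity between two sets of recommended video IDs."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def mean_pairwise_overlap(trajectories):
    """Average pairwise Jaccard overlap across agents' recommendation lists.

    Values near 0 indicate highly randomized recommendations (little overlap
    between agents with identical activity); values near 1 indicate
    consistent recommendations.
    """
    pairs = list(combinations(trajectories, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical recommendation lists from three identically behaving agents
agents = [
    ["v1", "v2", "v3", "v4"],
    ["v1", "v5", "v6", "v7"],
    ["v2", "v5", "v8", "v9"],
]
print(round(mean_pairwise_overlap(agents), 3))  # low overlap: ~0.143
```

A browser-level discrepancy of the kind reported in the abstract could then be expressed as the difference in this overlap score between agent cohorts run in Firefox and in Chrome.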