THE DIGITAL CLOSET: PLATFORM CENSORSHIP AND LGBTQIA+ INDEPENDENCE ONLINE
DOI:
https://doi.org/10.5210/spir.v2021i0.11987
Keywords:
Content Moderation, Image Recognition, Pornography, Censorship, LGBT
Abstract
This presentation draws on data from my forthcoming book with MIT Press to demonstrate how heteronormative and cisnormative biases pervade Silicon Valley culture, get embedded in benchmark datasets and machine learning algorithms, and get formalized in company policies and labor practices surrounding content moderation. The presentation begins with an examination of workplace culture at Google, drawing insights from Department of Labor investigations, testimonials from previous employees, and informal surveys and discourse analysis conducted by employees during the circulation of James Damore's infamous 'Google memo'. The presentation then moves on to examine bias embedded in benchmark datasets like WordNet and ImageNet, both of which served as training datasets for Google's image recognition algorithms (like GoogLeNet). Lastly, the presentation turns to Facebook's heteronormative and cisnormative content moderation policies and the outsourced labor practices it uses to institute what Facebook has described as 'human algorithms' to review content in accordance with these policies. Throughout the presentation I demonstrate that we can piece together information about proprietary code by looking at leaked documents, public records, press releases, open-source code, and benchmark datasets, all of which, in this instance, evidence a systemic heteronormative and cisnormative bias that is increasingly being embedded in the internet.
Published
2021-09-15
How to Cite
Monea, A. P. (2021). THE DIGITAL CLOSET: PLATFORM CENSORSHIP AND LGBTQIA+ INDEPENDENCE ONLINE. AoIR Selected Papers of Internet Research, 2021. https://doi.org/10.5210/spir.v2021i0.11987
Section
Papers