ALGORITHMIC LEGAL METRICS
DOI: https://doi.org/10.5210/spir.v2020i0.11184

Keywords: law, algorithm, predictive analytics, AI, artificial intelligence, surveillance, data profiling, liability, personalized law, bias, due process, equality, reflexivity, government, governmentality, transparency

Abstract
Predictive algorithms are increasingly being deployed in a variety of settings to determine legal status. Further applications have been proposed to determine civil and criminal liability or to “personalize” legal default rules. Deployment of such artificial intelligence systems has properly raised questions of algorithmic bias, fairness, transparency, and due process. But little attention has been paid to the known sociological costs of using predictive algorithms to determine legal status: such metrics interact reflexively with the populations they measure, many of these interactions are socially detrimental, and their corrosive effects are greatly amplified by the increasing speed and ubiquity of digitally automated algorithmic systems. In this paper I link the sociological and legal analysis of AI, highlighting the reflexive social processes that are engaged by algorithmic metrics. First, the paper shows how the problematic social effects of algorithmic legal metrics extend far beyond the concerns about accuracy that have thus far dominated critiques of such metrics. Second, it demonstrates that corrective governance mechanisms such as enhanced due process or transparency will be inadequate to remedy those corrosive effects, and that some such remedies, transparency in particular, may actually exacerbate the worst effects of algorithmic governmentality. Third, the paper shows that the application of algorithmic metrics to legal decisions aggravates the latent tensions between equity and autonomy in liberal institutions, undermining democratic values in a manner and on a scale not previously experienced by human societies. Illuminating these effects casts new light on the inherent social costs of AI metrics, particularly the perverse effects of deploying algorithms in legal systems.