Milano Silvia, Prunkl Carina
University of Exeter, Exeter, UK.
Utrecht University, Utrecht, Netherlands.
Philos Stud. 2025;182(1):185-203. doi: 10.1007/s11098-023-02095-2. Epub 2024 Feb 5.
It is well-established that algorithms can be instruments of injustice. It is less frequently discussed, however, how current modes of AI deployment often make the very discovery of injustice difficult, if not impossible. In this article, we focus on the effects of algorithmic profiling on epistemic agency. We show how algorithmic profiling can give rise to epistemic injustice through the depletion of epistemic resources that are needed to interpret and evaluate certain experiences. In doing so, we not only demonstrate how the philosophical framework of epistemic injustice can help pinpoint potential, systematic harms from algorithmic profiling, but we also identify a novel source of hermeneutical injustice that to date has received little attention in the relevant literature, which we call epistemic fragmentation. As we detail in this paper, epistemic fragmentation is a structural characteristic of algorithmically mediated environments that isolates individuals, making it more difficult to develop, take up, and apply new epistemic resources, and thus more difficult to identify and conceptualise emerging harms in these environments. We thus trace the occurrence of hermeneutical injustice back to the fragmentation of the epistemic experiences of individuals, who are left more vulnerable by their inability to share, compare, and learn from common experiences.