

Algorithms in the court: does it matter which part of the judicial decision-making is automated?

Author information

Barysė Dovilė, Sarel Roee

Affiliations

Institute of Psychology, Vilnius University, University Str. 9, 01513 Vilnius, Lithuania.

Institute of Law and Economics, University of Hamburg, Johnsallee 35, 20148 Hamburg, Germany.

Publication information

Artif Intell Law (Dordr). 2023 Jan 8:1-30. doi: 10.1007/s10506-022-09343-6.


DOI: 10.1007/s10506-022-09343-6
PMID: 36643574
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9826621/
Abstract

Artificial intelligence plays an increasingly important role in legal disputes, influencing not only the reality outside the court but also the judicial decision-making process itself. While it is clear why judges may generally benefit from technology as a tool for reducing effort costs or increasing accuracy, the presence of technology in the judicial process may also affect the public perception of the courts. In particular, if individuals are averse to adjudication that involves a high degree of automation, particularly given fairness concerns, then judicial technology may yield lower benefits than expected. However, the degree of aversion may well depend on how technology is used, i.e., on the timing and strength of judicial reliance on algorithms. Using an exploratory survey, we investigate whether the stage in which judges turn to algorithms for assistance matters for individual beliefs about the fairness of case outcomes. Specifically, we elicit beliefs about the use of algorithms in four different stages of adjudication: (i) information acquisition, (ii) information analysis, (iii) decision selection, and (iv) decision implementation. Our analysis indicates that individuals generally perceive the use of algorithms as fairer in the information acquisition stage than in other stages. However, individuals with a legal profession also perceive automation in the decision implementation stage as less fair compared to other individuals. Our findings, hence, suggest that individuals do care about how and when algorithms are used in the courts.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ced4/9826621/859da1f2041d/10506_2022_9343_Fig1_HTML.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ced4/9826621/d09e18933c8a/10506_2022_9343_Fig2_HTML.jpg

Similar articles

[1]
Algorithms in the court: does it matter which part of the judicial decision-making is automated?

Artif Intell Law (Dordr). 2023-1-8

[2]
Judicial breakfast as an external factor in judicial decision making in courts.

F1000Res. 2023

[3]
Inequality threat increases laypeople's, but not judges', acceptance of algorithmic decision making in court.

Law Hum Behav. 2024

[4]
Educators as Judges: Applying Judicial Decision-Making Principles to High-Stakes Education Assessment Decisions.

Teach Learn Med. 2023

[5]
Why do people choose courts to resolve disputes? A fuzzy-set analysis of Chinese citizens' judicial reliance.

Front Psychol. 2023-1-5

[6]
Automated Justice: Issues, Benefits and Risks in the Use of Artificial Intelligence and Its Algorithms in Access to Justice and Law Enforcement

2022

[7]
Baby doe redux? The Department of Health and Human Services and the Born-Alive Infants Protection Act of 2002: a cautionary note on normative neonatal practice.

Pediatrics. 2005-10

[8]
Humans versus machines: Who is perceived to decide fairer? Experimental evidence on attitudes toward automated decision-making.

Patterns (N Y). 2022-9-29

[9]
Perceptions of Justice By Algorithms.

Artif Intell Law (Dordr). 2023

[10]
The judicial role in life-sustaining medical treatment decisions.

Issues Law Med. 1991

Cited by

[1]
Judges versus artificial intelligence in juror decision-making in criminal trials: Evidence from two pre-registered experiments.

PLoS One. 2025-1-30

References

[1]
Perceptions of Justice By Algorithms.

Artif Intell Law (Dordr). 2023

[2]
Artificial fairness? Trust in algorithmic police decision-making.

J Exp Criminol. 2023

[3]
Judicial analytics and the great transformation of American Law.

Artif Intell Law (Dordr). 2019

[4]
Are algorithms good judges?

Science. 2018-1-19

[5]
A model for types and levels of human interaction with automation.

IEEE Trans Syst Man Cybern A Syst Hum. 2000-5
