Identification of Algorithmic Evidence in Administrative Punishment Cases

Keywords

Automated administration
Algorithm evidence
Identification of evidence
Judicial review

DOI

10.26689/ssr.v6i6.7412

Submitted : 2024-06-11
Accepted : 2024-06-26
Published : 2024-07-11

Abstract

In the field of administrative punishment, algorithmic evidence is the immediate output produced by an established algorithm in the course of the government's automated decision-making. As artificial intelligence technology develops, this kind of evidence is becoming progressively less intelligible. Compared with traditional evidence, algorithmic evidence is highly technical and complex, and it carries the endorsement of public authority. In judicial practice, such evidence is reviewed only for its legality; judges, as laypersons, often avoid reasoning about the technical issues, with the result that administrative disputes cannot be substantively resolved. In the face of off-site law enforcement, judicial decisions should move beyond the existing framework for reviewing evidence: at the evidence collection stage, ensure that evidence is not misidentified and implement the burden of proof borne by administrative subjects and technical subjects; at the cross-examination stage, adopt different identification standards according to the nature of the administrative act. In this way, the efficiency of judicial review can be balanced against the effective protection of the rights and interests of administrative counterparts.
