Momus Analytics accused of using biased juror selection algorithms
Occurred: March 2020
A US company that provides automated jury selection services has been accused of using algorithms that produce biased juror rankings.
Momus Analytics uses big data and machine learning to help attorneys identify and rank the best and worst potential jurors for their cases, and claimed its software did not use race, sex, religion, or country of origin as factors in determining juror ratings.
However, a patent application revealed that Momus Analytics' algorithm considered race, education level, and political affiliation to determine a juror's "leadership qualities" and biases toward personal or social responsibility.
Critics argued that such demographic information has no reliable correlation with juror disposition, may violate constitutional prohibitions against excluding jurors on the basis of race or sex, and may compromise the fairness of trials.
Momus Analytics was also taken to task for failing to provide details on the data used to develop its juror ranking system and how its machine learning model was trained.
Jury selection in the United States is the process of choosing members of grand juries and petit juries for the purpose of conducting trial by jury.
Source: Wikipedia
Momus Analytics
Operator:
Developer: Momus Analytics
Country: USA
Sector: Govt - justice
Purpose: Predict juror behaviour
Technology: Machine learning
Issue: Accuracy/reliability; Bias/discrimination; Human/civil rights
Page info
Type: Issue
Published: August 2024