AI and Bias in Legal Decisions
ay-eye and BY-uhs in LEE-guhl di-SIZH-uhnz

The potential for artificial intelligence (AI) systems used in legal contexts to produce decisions that are unfairly prejudiced against certain individuals or groups because of biases embedded in their training data or algorithms.

The use of AI in legal decisions raises concerns that biased systems may perpetuate existing inequalities in the justice system. In 2016, ProPublica analyzed COMPAS, a risk assessment tool used in criminal sentencing, and found that it produced substantially higher false positive rates for Black defendants than for white defendants, leading to calls for greater scrutiny of AI in legal settings.
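The disparity ProPublica measured can be illustrated with a small sketch: compute, per group, the false positive rate, i.e. the share of people who did not reoffend but were still flagged as high risk. The data below is entirely made up for illustration; it is not ProPublica's dataset, and the function name is a hypothetical helper.

```python
from collections import defaultdict

# Hypothetical records: (group, flagged_high_risk, reoffended).
# Illustrative data only -- not ProPublica's actual dataset.
records = [
    ("A", True, False), ("A", True, False), ("A", False, False),
    ("A", True, True),
    ("B", True, False), ("B", False, False), ("B", False, False),
    ("B", True, True),
]

def false_positive_rates(records):
    """False positive rate per group: P(flagged high risk | did not reoffend)."""
    fp = defaultdict(int)    # flagged as high risk but did not reoffend
    neg = defaultdict(int)   # did not reoffend at all
    for group, flagged, reoffended in records:
        if not reoffended:
            neg[group] += 1
            if flagged:
                fp[group] += 1
    return {g: fp[g] / neg[g] for g in neg}

print(false_positive_rates(records))  # group A's rate is double group B's
```

A large gap between groups on this metric means the tool makes costly errors (wrongly labeling someone high risk) much more often for one group, which is the kind of disparity that prompted scrutiny of such tools.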