Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say — ProPublica

Series: Machine Bias: Investigating Algorithmic Injustice

 The racial bias that ProPublica found in a formula used by courts and parole boards to forecast future criminal behavior arises inevitably from the test’s design, according to new research.

 The findings were described in scholarly papers published or circulated over the past several months. Taken together, they represent the most far-reaching critique to date of the fairness of algorithms that seek to provide an objective measure of the likelihood a defendant will commit further crimes.

 Increasingly, criminal justice officials are using similar risk prediction equations to inform their decisions about bail, sentencing and early release.

 The researchers found that the formula and others like it have been written in a way that guarantees black defendants will be inaccurately identified as future criminals more often than their white counterparts.

 The studies, by four groups of scholars working independently, suggest that the widely used algorithms could be revised to reduce the number of black defendants who are unfairly categorized, without sacrificing the ability to predict future crimes.
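 One way to make the trade-off these studies describe concrete is a standard identity from the fairness literature relating a score's calibration to its error rates. The symbols below are ours, added for illustration; they are not drawn from the papers themselves.

```latex
% For one group of defendants, let
%   p   = base rate (the share who actually go on to reoffend),
%   TPR = true positive rate (share of reoffenders labeled higher risk),
%   FPR = false positive rate (share of non-reoffenders labeled higher risk),
%   PPV = positive predictive value (share of higher-risk labels that are correct).
% Counting cases directly gives
%   PPV = p * TPR / (p * TPR + (1 - p) * FPR),
% which rearranges to:
\[
  \mathrm{FPR} \;=\; \frac{p}{1-p} \cdot \frac{1-\mathrm{PPV}}{\mathrm{PPV}} \cdot \mathrm{TPR}
\]
% If PPV and TPR are held equal across two groups whose base rates p differ,
% their false positive rates must differ: equal calibration and equal false
% positive rates cannot both hold when base rates differ.
```

 In other words, if the score is equally accurate for both groups in the sense Northpointe describes below, unequal base rates force unequal false positive rates, which is exactly the disparity ProPublica measured.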

 The author of one of the papers said her ongoing research suggests this result could be achieved through a modest change in the workings of the formula ProPublica studied, which is known as COMPAS.
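 For intuition about what such a change might look like, here is a minimal sketch of one family of adjustments from the fairness literature: setting the cutoff for a "higher risk" label separately per group so that false positive rates line up. This illustrates the general idea only; it is not the specific revision the researcher proposes, and the function and variable names are ours.

```python
# Minimal sketch: choose a per-group score cutoff so that false positive
# rates match a shared target. Illustrative only; not the actual change
# to COMPAS described above.
import numpy as np

def false_positive_rate(scores, reoffended, cutoff):
    """Share of people who did not reoffend but scored at or above cutoff."""
    non_reoffenders = scores[reoffended == 0]
    return float(np.mean(non_reoffenders >= cutoff))

def cutoff_for_fpr(scores, reoffended, target_fpr):
    """Smallest decile cutoff (COMPAS scores run 1 to 10) meeting the target."""
    for cutoff in range(1, 11):
        if false_positive_rate(scores, reoffended, cutoff) <= target_fpr:
            return cutoff
    return 10  # no cutoff meets the target; fall back to the strictest label
```

 Applied per group, an adjustment like this equalizes one error rate; the identity above implies the score can then no longer be equally calibrated for groups with different base rates, so some other property must give way.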

 An article published earlier this year by ProPublica focused attention on possible racial biases in the COMPAS algorithm. We collected the COMPAS scores for more than 10,000 people arrested for crimes in Florida's Broward County and checked to see how many were charged with further crimes within two years.

 
 Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks. Read the story.

 
 When we looked at the people who did not go on to be arrested for new crimes but were dubbed higher risk by the formula, we found a racial disparity: black defendants were twice as likely as white defendants to be incorrectly labeled higher risk. Conversely, white defendants labeled low risk were far more likely than black defendants with comparably low COMPAS risk scores to end up being charged with new offenses.
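 A minimal sketch of the comparison described here, in the shape of the data ProPublica collected. The column names follow ProPublica's published COMPAS dataset, but the file path and the decile cutoff for "higher risk" are assumptions of this sketch.

```python
# Compare false positive rates by race: among people NOT charged with a
# new crime within two years, what share did the formula label higher risk?
import pandas as pd

df = pd.read_csv("compas-scores-two-years.csv")  # assumed local copy of the data
df["high_risk"] = df["decile_score"] >= 5        # assumed "higher risk" cutoff

for race, group in df.groupby("race"):
    non_recidivists = group[group["two_year_recid"] == 0]
    fpr = non_recidivists["high_risk"].mean()
    print(f"{race}: incorrectly labeled higher risk {fpr:.0%} of the time")
```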

 Northpointe, the company that sells COMPAS, said in response that the test was racially neutral. To support that assertion, company officials pointed to another of our findings, which was that the rate of accuracy for COMPAS scores — about 60 percent — was the same for black and white defendants. The company said it had devised the algorithm to achieve this goal. A test that is correct in equal proportions for all 

... (truncated, 10 KB total)