Violent crime prediction algorithms are racially biased

A new study shows that software-driven risk assessment tools are hardly blind to race.

When a criminal defendant faces sentencing in the United States, a judge can use several factors to determine a punishment that fits the crime. Increasingly, one of those factors is what is known as a "risk assessment score" -- a number meant to predict whether or not the defendant will commit another crime in the future. According to a new report from ProPublica, however, the algorithms driving those scores are biased against African Americans.

The risk scores can influence everything from bail amounts to treatment plans or jail time. If a defendant has a higher risk of recidivism, the thinking goes, then they should receive a sentence that acts as a disincentive for committing some future crime. It was this sort of thinking that led U.S. Attorney General Eric Holder to warn in 2014 that these scores could "exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society."

To test Holder's hypothesis, ProPublica looked at data from over 7,000 defendants in Broward County, Florida, whose risk scores were generated by one of the most popular assessment tools in the country, designed by a company called Northpointe.

And ProPublica's study found that the scores were way off base when it came to predicting violent crime. "Only about 20 percent of those people predicted to commit violent crimes actually went on to do so," the ProPublica team writes. Even when accounting for all types of crimes -- including misdemeanors and moving violations -- the algorithm was only "somewhat more accurate than a coin flip" at determining whether or not someone would commit a second crime.
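That 20 percent figure is what statisticians call precision: of everyone the tool flagged as likely to commit a violent crime, the share who actually did. A minimal sketch of the idea, using invented numbers that only mirror the rough 20 percent result (this is not ProPublica's data or Northpointe's method):

```python
def precision(flagged_outcomes):
    """flagged_outcomes: one boolean per defendant flagged as high risk,
    True if that defendant actually went on to commit a violent crime."""
    return sum(flagged_outcomes) / len(flagged_outcomes)

# Hypothetical example: 10 defendants flagged as high risk,
# only 2 of whom actually committed another violent crime.
outcomes = [True, True] + [False] * 8
print(f"precision: {precision(outcomes):.0%}")  # prints "precision: 20%"
```

A tool can score well on overall accuracy while still having low precision like this, which is why the coin-flip comparison is so damning.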


More alarmingly, ProPublica was able to confirm Holder's concern that the algorithm's sense of justice was far from blind, especially when it came to race. From the report:

  • The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.
  • White defendants were mislabeled as low risk more often than black defendants.
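The first finding above is a disparity in false positive rates: among defendants who did not reoffend, how often was each group wrongly flagged as high risk? A rough sketch of that comparison, with toy records invented for illustration (not ProPublica's dataset):

```python
def false_positive_rate(records):
    """Among defendants who did NOT reoffend, the share the tool
    nonetheless labeled high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Hypothetical toy data: (labeled high risk?, actually reoffended?)
black = [{"high_risk": h, "reoffended": r}
         for h, r in [(True, False), (True, False), (True, True),
                      (False, False), (True, False), (False, True)]]
white = [{"high_risk": h, "reoffended": r}
         for h, r in [(True, False), (False, False), (False, False),
                      (False, True), (True, True), (False, False)]]

print(f"false positive rate, black defendants: {false_positive_rate(black):.2f}")
print(f"false positive rate, white defendants: {false_positive_rate(white):.2f}")
```

A gap between those two rates, over real data, is the kind of racial skew ProPublica reported, even though race itself never appears as an input.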

Northpointe disputes the report's findings and points out that race is not an explicit factor in its assessment algorithm. However, some of the factors that do inform the scores can be closely tied to race, like the defendant's education level, employment status and social circumstances such as family criminal history or whether or not their friends take illegal drugs. And the specific calculations necessary to arrive at the final score are proprietary -- meaning defendants and the general public have no way to see what might be influencing a harsh sentence.

While algorithms like these might be well-intentioned, the system's opacity is already seen as a problem. In Chicago, for example, police have had surprising accuracy using an algorithm to predict who will commit or be the target of gun violence, but the ACLU finds it troubling that community members can be singled out as criminals without any insight into what landed them on the CPD's list.
