A recent ProPublica story quotes a warning issued in 2014 by then-Attorney General Eric Holder to the U.S. Sentencing Commission. His warning concerned a fad spreading through the criminal justice system.
The fad that so concerned Holder is, of course, predictive analytics – the same fad now spreading through child welfare.
Now, ProPublica has found that Holder was right.
ProPublica looked at 7,000 cases in Broward County, Fla., which uses a secret algorithm created by a for-profit company to assign risk scores to people arrested in that county, much as Los Angeles County plans to use a secret algorithm from a for-profit company in its child abuse investigations.
According to the story, when it came to predicting violent crime, the algorithm did a lousy job in general – four times out of five, people the algorithm said would commit a violent crime within two years did not.
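That "four times out of five" figure is a statement about the algorithm's false alarms among the people it flagged. A minimal sketch of the arithmetic, using invented counts (not ProPublica's actual data) purely for illustration:

```python
# Hypothetical illustration of the "four times out of five" figure:
# among people the algorithm flagged as likely to commit a violent
# crime within two years, what fraction did not?
# The counts below are invented for illustration only.

flagged_violent = 1000       # people labeled high risk for violent crime
actually_reoffended = 200    # of those, how many actually did reoffend

false_alarms = flagged_violent - actually_reoffended
false_alarm_share = false_alarms / flagged_violent

print(f"{false_alarm_share:.0%} of flagged people did not reoffend")
# prints "80% of flagged people did not reoffend"
```

The point of the sketch is that this measure is computed only over the people the algorithm flagged, which is why a tool can sound impressive overall and still be wrong about most of the individuals it singles out.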
In addition, according to the story:
The company that came up with the algorithm disputes the findings, saying its own analysis of the data found no racial disparities.
Since the algorithm itself is secret, we can’t be sure why the results came out racially biased.
But Prof. Sonja Starr of the University of Michigan Law School notes that the factors used to create these sorts of algorithms typically include “unemployment, marital status, age, education, finances, neighborhood, and family background, including family members’ criminal history.”
Similarly, the algorithm for child abuse investigations includes risk factors such as whether the child has been taken often to an emergency room or whether the child often changes schools, both factors closely correlated with poverty. Perhaps that helps explain why, when tested, the Los Angeles model apparently produced false positives a staggering 95 percent of the time.
There is a similar problem when it comes to the use of “criminal history.”
The same, of course, is true when it comes to “reports” alleging child abuse – some communities are much more heavily “policed” by child protective services. If anything, broad, vague definitions of “neglect” that equate neglect with poverty itself make the problem even worse in child welfare. And, of course, the problem is compounded when those most loudly beating the drum for predictive analytics misunderstand what such reports really mean.
The parallels to child welfare don’t end there.
§ In criminal justice, the use of predictive analytics is far outrunning objective evaluation. ProPublica found that evaluations were rare and often done by the people who developed the software. ProPublica had to do its own test for racial bias because, it seems, no one else has bothered.
§ Predictive analytics originally was sold in criminal justice as a benevolent intervention – meant to help agencies custom tailor rehabilitation and supportive services to the needs of high-risk defendants and reduce incarceration.
But it’s quickly metastasizing into use at all stages of the criminal justice process, including, most ominously, sentencing.
So just as predictive analytics puts black defendants at greater risk of prolonged sentences, predictive analytics in child welfare puts black children at greater risk of being sentenced to needless foster care – with all the attendant harms of needless removal.
But wouldn’t I consider it OK to use predictive analytics just for “prevention”? asks the publisher of the Chronicle of Social Change (the Fox News of child welfare). The criminal justice experience makes clear that can’t be done – and there is no need. Instead of targeting individuals, you can simply bring genuine, voluntary help to poor neighborhoods, getting plenty of bang for limited bucks while limiting the risk of what amounts to computerized racial profiling.