Monday, April 25, 2016

What would a child welfare predictive analytics algorithm say about this?

Let’s take a trip into the near future. Just a couple of years.

Child Protective Services has just received a child maltreatment report concerning a father of five. With a few keystrokes, CPS workers find out the following about him:
He’s married, but the family lives in deep poverty. He has a criminal record: a misdemeanor conviction. He and his wife also once had their children taken away; the children were returned after six months.
These data are immediately entered into a computer running the latest predictive analytics software. And quicker than you can say “Danger, Will Robinson!” the computer warns CPS that this guy is high risk.
When the caseworker gets to the home, she knows the risk score is high, so if she leaves those children at home and something goes wrong, she’ll have even more than usual to answer for.
No matter what the actual report says – since in this new modern age of “pre-crime,” making determinations based on what actually may have happened is so passé – those children are likely to be taken away, again.
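To make concrete what that kind of risk score might look like, here is a minimal, purely illustrative sketch in Python. The factors, weights, and threshold are all assumptions invented for this post, not taken from any real CPS tool; the point is only that checklist-style scoring of poverty-correlated factors adds up to “high risk” no matter what the report actually alleges.

```python
# Hypothetical sketch of checklist-style risk scoring.
# The factors, weights, and threshold below are invented for illustration;
# no real child welfare tool is reproduced here.

RISK_WEIGHTS = {
    "deep_poverty": 2,      # poverty-correlated factor
    "prior_removal": 3,     # children removed before, even if later returned
    "criminal_record": 2,   # any conviction, however minor
    "unemployment": 1,
}

HIGH_RISK_THRESHOLD = 5  # arbitrary cutoff, chosen for this example


def risk_score(family_flags):
    """Sum the weights of every flag that applies to the family."""
    return sum(RISK_WEIGHTS[flag] for flag in family_flags if flag in RISK_WEIGHTS)


def classify(family_flags):
    score = risk_score(family_flags)
    label = "HIGH RISK" if score >= HIGH_RISK_THRESHOLD else "lower risk"
    return score, label


if __name__ == "__main__":
    # The hypothetical father: deep poverty, a misdemeanor, a prior removal.
    flags = ["deep_poverty", "criminal_record", "prior_removal"]
    score, label = classify(flags)
    print(f"score={score} -> {label}")  # score=7 -> HIGH RISK
    # Note that every factor scored here is a proxy for being poor,
    # not for what the current report actually alleges.
```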

So, now let’s return to the present and meet the family at the center of the actual case on which this hypothetical is based: (If the video below doesn’t play properly, follow this link instead.)
In the hypothetical, I changed two things about this story. First, the story mentions no criminal charges, and, in fact, panhandling is illegal only in some parts of Houston. But predictive analytics tends not to factor in Anatole France’s famous observation that “the law, in its majestic equality, forbids rich and poor alike to sleep under bridges, to beg in the streets, and to steal bread.” So had there been a criminal conviction, or even a charge, it almost certainly would have added to the risk score.
And second, I’m assuming Dennison and his wife actually will get their children back. In fact, there’s no telling what will happen, and the family is under the impression that CPS is pursuing termination of parental rights.
What we do know is that in the brave new world of predictive analytics, if Dennison’s children ever are returned, and if Dennison ever is reported again, the children are likely to be removed again. And, what with it then being the second time and all, they’re more likely to stay removed forever.
For now, the parents don’t know where their children are. But given that this is Texas foster care we’re talking about, odds are it’s nowhere good.

I can hear the predictive analytics evangelists now: “You don’t understand,” they’ll say. “We would just use this tool to help people like Dennison and his family.”
And yes, there are a very few enlightened child protective services agencies that would do that. But when Houston CPS encountered the Dennison family, that’s not what they did. They did not offer emergency cash assistance. They did not offer to help Dennison find another job, or train for a new one.
They took the children and ran. Just as Houston CPS did in another case, where they rushed to confuse poverty with neglect.


An algorithm won’t make these decisions any better. It’ll just make it easier to take the child and run.
For much more about the harm of predictive analytics in child welfare, see our publication, Big Data is Watching You.