Sunday, March 24, 2019

Does child welfare have a “Max 8” problem?



A story on NPR Friday about the crashes of two Boeing 737 Max 8 planes begins this way:

The investigation … points, for the moment, at software. There’s evidence that a program called MCAS pointed the planes’ noses down repeatedly without pilots even knowing, ultimately leading to the deaths of hundreds of people.

The story then discusses our increasing reliance, and perhaps overreliance, on software without even knowing how that software works.

The story concludes with a warning from Prof. Eben Moglen of Columbia University School of Law, described by NPR as an advocate for “software transparency.”  Says Prof. Moglen:

Even if you didn’t think this had anything to do with politics, it is still true that systems that don’t explain themselves to the human beings who interact with them are dangerous.

Child welfare, of course, has everything to do with politics.  And the latest fad in child welfare is “predictive analytics,” in which child welfare agencies use software that’s supposedly able to predict who is likely to abuse a child.  The many problems with this are summarized here.  But one of those many problems is the fact that the software is – secret.

In most cases, the algorithms are created by for-profit companies, so, obviously, they don’t want us to know what goes into them.  But even the wildly overhyped model in Pittsburgh, where they’re constantly bragging about transparency, isn’t nearly transparent enough.

In Pittsburgh, they’ll gladly tell you the list of items the algorithm considers.  That means we can at least see how often they are actually measuring nothing more than the likelihood of being poor – what Prof. Virginia Eubanks, author of Automating Inequality, calls “poverty profiling.”  But they won’t say how the various items are weighted.
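To see why the weights matter so much, here is a minimal, purely hypothetical sketch.  The item names and numbers below are invented for illustration; they are not drawn from the actual Allegheny County tool.  Two models can look at the exact same disclosed list of items, yet one is driven almost entirely by poverty-related factors.  Without the weights, the published list tells you almost nothing about what the score really measures.

```python
# Hypothetical illustration only: NOT the Allegheny County model.
# Shows why disclosing the list of inputs, but not the weights,
# hides what a risk score is actually measuring.

# Items an agency might disclose, with one family's values (all invented):
items = {
    "prior_referrals":     2,  # number of past hotline calls
    "public_benefits":     1,  # 1 if the family receives benefits, else 0
    "parent_age_under_25": 1,
    "housing_instability": 1,
}

# Two equally plausible sets of hidden weights.
weights_a = {"prior_referrals": 0.5, "public_benefits": 0.1,
             "parent_age_under_25": 0.1, "housing_instability": 0.3}
weights_b = {"prior_referrals": 0.1, "public_benefits": 0.5,
             "parent_age_under_25": 0.1, "housing_instability": 0.3}

def risk_score(values, weights):
    """Weighted sum: the core of a simple risk-scoring model."""
    return sum(weights[k] * v for k, v in values.items())

# Same family, same disclosed item list, very different meaning:
# under weights_b the score is driven mostly by receiving public
# benefits -- that is, by being poor.
print(risk_score(items, weights_a))  # 1.5
print(risk_score(items, weights_b))  # 1.1
```

Both versions could truthfully say they “consider” the same four items.  Only the hidden weights reveal whether the score is tracking risk or tracking poverty.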

This would be a bit like a hypothetical executive for a hypothetical aircraft manufacturer telling a pilot: “So, Captain, at certain times the software on this new plane will affect the position of the nose – but we won’t tell you when it kicks in, and we won’t tell you whether it pushes the nose down or pulls it up.”

Whatever one may think of what Boeing did or failed to do, no one is suggesting they did anything like that. It would be preposterous.

So why isn’t it considered preposterous in child welfare? 

It should be. Because systems that don’t explain themselves to the human beings who interact with them are dangerous.