Allegheny County's use of predictive analytics in child welfare generates the equivalent of a "scarlet number" that can mark a child, and even that child's children.
Imagine you’re on a baseball team. The game is about to start when you find out that the home plate umpire moonlights as a volunteer coach for the other team. He even co-authored a handbook with the manager of the other team.
That umpire still may be perfectly capable of objectively calling balls and strikes. But would you be comfortable relying on him to make those calls?
Anyone who would say no should be deeply uncomfortable about one of the key elements in how Allegheny County, Pa. (home to Pittsburgh) sold its use of “predictive analytics” to decide which families are investigated as alleged child abusers – and possibly, in the near future, to decide much more.
The Allegheny County model is known as the Allegheny Family Screening Tool (AFST). In effect, it stamps children who are the subject of reports alleging child maltreatment with an invisible “scarlet number” that supposedly measures how likely they are to be abused or neglected. I discussed the dangers of AFST in this column for Youth Today. Here I’d like to focus on one element that has been crucial to selling AFST. Over and over, stories proclaiming how wonderful it is cite a so-called “ethics review.” An op-ed in the Pittsburgh Post-Gazette declares:
The ethical assessment determined that the tool was so much more accurate than relying solely on human analysis that declining to use it would be unethical.
Similarly, a New York Times Magazine story, which I’ve discussed before on this blog, declares:
Marc Cherna, who as director of Allegheny County’s Department of Human Services … had an independent ethics review conducted of the predictive-analytics program before it began. It concluded not only that implementing the program was ethical, but also that not using it might be unethical.
But “independent” is a stretch.
The review was conducted by Prof. Eileen Gambrill of the University of California, Berkeley and Prof. Tim Dare of the University of Auckland in New Zealand. New Zealand? Seems like a long way to go to find an ethics reviewer. Unless, of course, his selection has something to do with this: One of the designers of the predictive analytics model used in Allegheny County is Prof. Rhema Vaithianathan of the University of Auckland in New Zealand.
And these two don’t just pass each other in the hallway. Much like that hypothetical manager and umpire, Prof. Dare and Prof. Vaithianathan co-authored papers.
That doesn’t mean Prof. Dare can’t be objective. But if Allegheny County is going to commission an ethics review and tout it as independent, surely among all the world’s academicians the county could have found two with no ties to the authors of the model being reviewed, thereby ensuring there would not be even the appearance of a conflict of interest.
Unless, of course, the county was afraid that such a review wouldn’t tell county officials what they wanted to hear.
That’s probably why, while looking for scholars in New Zealand, they decided not to ask Prof. Emily Keddell of the University of Otago. She’s already done a review of some of Prof. Vaithianathan’s work – and it’s not nearly as favorable as the one written by Vaithianathan’s co-author.
As for the review itself, it’s a startlingly superficial document, a nine-page once-over-lightly that starts on page 44 of this document. Citations are few and far between – and they are limited to papers written by either the designers of AFST or Prof. Dare himself.
A key caveat
But even so, as I noted in the Youth Today column, the review includes one important caveat:
AFST is said to be ethical in part because it is used only after a call to a child protective services hotline alleging abuse and neglect. The review specifically questioned whether AFST would be ethical if applied to all children at birth. As the review states:
[The issue of informed consent] is one of a number of points at which we think that it is ethically significant that the AFST will provide risk assessment in response to a call to the call center, rather than at the birth of every child. In the latter case there is no independent reason to think there are grounds to override default assumptions around consent. The fact there has been a call, however, provides at least some grounds to think that further inquiry is warranted in a particular case. [Emphasis added.]
But that might not be the case for long. As I noted in my column for Youth Today, in her brilliant book, Automating Inequality, Prof. Virginia Eubanks reports
that the county is, at a minimum, considering introducing “a second predictive model … [that] would be run on a daily or weekly basis on all babies born in Allegheny County the prior day or week,” according to a September 2017 email from [Cherna’s deputy, Erin] Dalton. Such a model already exists – indeed it’s one of the models the designers of AFST proposed to the county in the first place.
So if the county starts branding every infant with a scarlet number at birth, a number that will even affect the number assigned to their children and grandchildren, is that model inherently unethical?
I’ve known Marc Cherna for more than 20 years. When I lived in Allegheny County I served on a screening committee that unanimously recommended him, and one other candidate, as finalists for the job of running the county child welfare system. We were right. He has an outstanding record for safely keeping families together.
I’ve included Pittsburgh on NCCPR’s list of ways to do child welfare right, and referred journalists from outlets ranging from CNN, in 2002, to the Arizona Daily Star, just this year, to Pittsburgh to examine the county’s success. (Cherna still regularly quotes what I told CNN all those years ago.) When Cherna says he wants to use analytics in the right ways for the right reasons, I believe him.
But that’s not enough. And touting a stacked-deck ethics review is particularly disappointing. Again, as I wrote in Youth Today:
Cherna promises that the scarlet numbers under any such system will be used only to find the families most in need of help. … But what about Cherna’s successor, and his successor’s successor? Any system that depends for success on the benevolence of a single leader with near absolute power is too dangerous for a free society.
NCCPR’s full analysis of the role of predictive analytics in child welfare is available here.