Monday, February 18, 2019

Predictive analytics in Pittsburgh child welfare: No poverty, no profile?

Pittsburgh's predictive analytics algorithm labels parents - and children - with a "risk score" that amounts to an invisible "scarlet number" that may haunt them for life.

Pittsburgh is the home of the nation’s most advanced use of “predictive analytics” in child welfare.  In Pittsburgh, an algorithm is used to screen calls alleging child abuse and determine if a case is high risk.  The algorithm produces a “risk score” – a secret “scarlet number” between 1 and 20. The higher the number, the greater the supposed risk.

So now let us consider a recent incident in Pittsburgh – and what the algorithm might do.

A man storms into an office.  His 12-year-old daughter is with him. The man appears intoxicated.  He screams and yells and pounds his fists against a bulletin board. He demands to have his picture taken.

He forcefully grabs his daughter’s forearm, pulling her into the picture as she tries her best to pull away from him.  She screams “Please, please Daddy, no!” multiple times. And multiple times he yanks on her arm, trying to pull her to his side so a photo can be taken of both of them.  He yells at his daughter and repeatedly jabs his finger into her shoulder.

The daughter is crying hysterically and red-faced.  The father rips a cell phone out of her hand because he thinks she is trying to call her mother.

As one eyewitness said:

I was extremely concerned for his daughter’s safety, and I actually noticed that my heart was racing. …  Having to watch as [the father] terrorized his teenage daughter — with his hair disheveled and his face twisted — was something I’m never going to forget.

What would AFST do?


I don’t know if anyone called Pennsylvania’s child abuse hotline to report the incident.  But if anyone did, and if the call were then referred to Allegheny County Children and Youth Services (CYS), the name of the father would be run through the Allegheny Family Screening Tool (AFST), the county’s vastly overhyped predictive analytics algorithm.  And the odds are that this father’s risk score would be very, very – low.

Why?  Because the father in this case is John Robinson Block, publisher of the Pittsburgh Post-Gazette.  The alleged outburst occurred in the Post-Gazette newsroom.  The account above is taken directly from eyewitness accounts posted on the website of the Newspaper Guild of Pittsburgh.

Block Communications disputes these accounts. According to The New York Times:

In a statement on Thursday, Block Communications disputed the employees’ accounts, saying that Mr. Block had only “expressed his frustration” to employees “about several issues of concern to him.” The company said it provided a safe work environment. 
“We have conducted a review of all information available, and we disagree with the characterization of Saturday evening’s events as expressed by the Newspaper Guild,” the statement said. Mr. Block “expresses his sincere regrets over his conduct that evening and did not intend his actions to upset anyone,” it added.

But, of course, child protective services agencies urge people to report even their slightest suspicions that something they’ve seen might be child abuse or neglect. 

Were John Robinson Block reported, AFST would not cough up a message that says: “Hey, this guy’s a bigshot publisher, better leave him alone!” But as a practical matter, the algorithm may have a similar effect.


As Prof. Virginia Eubanks explains in her book Automating Inequality, 25 percent of the variables in AFST are direct measures of poverty. Another 25 percent measure interaction with the child welfare and juvenile justice systems themselves.

As Eubanks explains:

Because the model confuses parenting while poor with poor parenting, the AFST views parents who reach out to public programs as risks to their children.

Because these are public benefits, such as SNAP (formerly food stamps), TANF (Temporary Assistance for Needy Families) and Medicaid, the data are collected automatically by the county.

But odds are John Robinson Block has never applied for any of these programs, so his risk score is likely to be lower.

And if John Robinson Block has ever had any personal problems that might bring him to the attention of, say, health professionals, he would have been able to get the best private care – so nothing is going to go into a public database that could be scoured by AFST and further raise the risk score. 
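The mechanism Eubanks describes can be sketched as a toy model. To be clear, everything below (the variable names, the weights, the scoring rule) is invented purely for illustration; it is not the actual AFST. The sketch only shows why a score built largely on footprints in public databases stays at the floor for anyone whose life never touches those databases:

```python
# Purely illustrative toy model, NOT the actual AFST. Variable names and
# weights are hypothetical. It mimics one structural feature Eubanks
# describes: many inputs are poverty proxies from public-benefit records.

TOY_WEIGHTS = {
    "received_snap": 3,             # poverty proxies: present only if the
    "received_tanf": 3,             # family used public benefits
    "on_medicaid": 2,
    "public_mental_health_record": 2,  # public (not private) care leaves a record
    "prior_cys_referral": 4,        # prior contact with the system itself
    "juvenile_justice_contact": 3,
}

def toy_risk_score(indicators: dict) -> int:
    """Sum the weights of indicators found in public databases, clamped to 1-20."""
    raw = sum(w for key, w in TOY_WEIGHTS.items() if indicators.get(key))
    return max(1, min(20, raw))

# A poor family accumulates points simply by having used public programs,
# with no finding of maltreatment anywhere in the inputs...
poor_family = {"received_snap": 1, "on_medicaid": 1, "prior_cys_referral": 1}

# ...while a wealthy parent whose care is all private leaves nothing in
# the databases the tool can see, so the score sits at the floor.
wealthy_parent = {}

print(toy_risk_score(poor_family))     # 9
print(toy_risk_score(wealthy_parent))  # 1
```

In a model shaped like this, the score measures visibility to public systems, not behavior toward children, which is the point of the “no poverty, no profile” question.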

Eubanks calls AFST “poverty profiling.”  But is there a corollary: No poverty, no profile?

In her book, Eubanks documents impoverished families caught in the Allegheny County CYS net based on allegations far less serious than what those eyewitnesses say occurred in the Post-Gazette newsroom.

It’s possible that the whole incident wouldn’t be in Allegheny County’s jurisdiction anyway. I don’t know if Block even lives in the Pittsburgh area. The family media company is headquartered in Toledo, Ohio.

But if the allegations ever do reach screeners in Allegheny County, workers may be too busy checking out “high risk” families whose poverty has been confused with “neglect” to take the allegations seriously.

On one level that might be good for Block’s daughter.  As I noted in this column for Youth Today:

The algorithm visits the sins of the parents, real or imagined, upon the children. Eubanks cites a family that was victimized by repeated false reports. When the child the county supposedly was “protecting” grows up, if she herself becomes the target of a child abuse report she will bear a higher scarlet number — because her own parents supposedly were neglectful. So her children will be at greater risk of enduring the enormous harm of needless foster care.

In contrast, if the parent’s “scarlet number” is low, the child’s will be as well.

No, this does not mean we should do more spying on rich people


I can hear America’s latter-day “child savers” now: “This just proves we need even more spying!” they’ll say.  “Make butlers and chauffeurs mandatory reporters of child abuse!” In fact, Erin Dalton, deputy director of the Allegheny County Department of Human Services, has already said something like that, telling Eubanks: “We really hope to get private insurance data. We’d love to have it.”  

As I’ve noted before, Dalton also is the one who sent an email talking about stamping a scarlet number on every child born in the county – at birth. She’s also gone out of her way to minimize the harm of foster care.

The child savers also will look at a case such as this and say: “Just because we can’t find out nearly as much about rich people doesn’t mean we should let poor people get away with child abuse!”

That, of course, misses the point. The fact that, if the eyewitness accounts are correct and he were reported to Allegheny County CYS, John Robinson Block probably would wind up with a low AFST risk score does more than illustrate what doesn’t get noticed. It also illustrates what does get noticed.  It illustrates that AFST doesn’t predict child abuse – it predicts poverty, and then confuses that poverty with neglect.