Starting in 2020, Allegheny County, Pa. will attempt to, in effect, stamp every child born in the county with a "scarlet number" risk score that could haunt the child and her or his family for life.
It is perhaps the ultimate Orwellian nightmare: From the moment your child is born, the child and family are labeled with a “risk score” – a number that supposedly tells authorities how likely you are to abuse your newborn. The big government agency that slaps this invisible scarlet number on you and your newborn promises it will be used only to decide if you need extra help to raise your child, and the help will be voluntary.
But once you’re in the database, that score stays there forever. And if, someday, the same big government agency wants to use the score to help decide you’re too much of a risk to be allowed to keep your child, there is nothing to stop them. The scarlet number may haunt your family for generations. The fact that your child was supposedly born into a “high risk” family may be used against the child when s/he has children.
Welcome to the dystopian future of child welfare – and childbirth – in metropolitan Pittsburgh, Pa.
For a couple of years now, Allegheny County, which includes Pittsburgh and surrounding suburbs, has been using something called the Allegheny Family Screening Tool (AFST), a predictive analytics algorithm, to help decide which families should be investigated as alleged child abusers.
The algorithm is weighted heavily toward punishing parents for being poor. In her brilliant book, Automating Inequality, Prof. Virginia Eubanks calls it “poverty profiling.” In her review of Automating Inequality, Prof. Dorothy Roberts (a member of NCCPR’s Board of Directors) extends the analysis to show how predictive analytics reinforces racial bias.
To justify all this, the county submitted its plans to a couple of scholars for an “ethics review.” But one of the reviewers is a faculty colleague and co-author of papers with one of the creators of the algorithm. Even this ethically-challenged ethics review gave a seal of approval to AFST in part based on the premise that it would not be applied to every child at birth.
But getting the chance to slap a scarlet number on every child at birth is the Holy Grail for some predictive analytics proponents. And now it appears that was the goal of the Allegheny County Department of Human Services all along.
The birth of “Hello Baby”
In her book, Eubanks reports that the county was, at a minimum, considering introducing “‘a second predictive model … [that] would be run on a daily or weekly basis on all babies born in Allegheny County the prior day or week,’ according to a September 2017 email” from a deputy director of Allegheny County DHS, Erin Dalton. (Dalton is also disturbingly sanguine about the harm of foster care.) As I noted in a 2018 column for Youth Today, such a model already exists — indeed it’s one of the models the designers of AFST proposed to the county in the first place.
The county apparently turned it down initially because they didn’t think they could sell it politically. But clearly, with a couple of tweaks to the algorithm, now they think they can – and, sadly, they may be right.
And so, starting in January 2020, the county plans to phase in a “prevention” program it calls “Hello Baby.”
Here’s how the county says it will work.
During some of the most chaotic hours of a family’s life, those hours in the hospital after a baby is born, when one medical professional, volunteer or other hospital-affiliated person after another is traipsing in and out of the room, the family will be handed a packet of information about the help available through “Hello Baby.” A nurse may also discuss the program with the family.
The program offers three tiers of services. Tier 1 is automatically available to everyone without having to surrender their data. That tier is simply information about help that’s already out there. Tiers 2 and 3 provide more intensive help to individual families. But to get that help you must accept having the child labeled by an algorithm as at moderate or high risk of abuse.
You have to opt out
The program automatically assumes you have given permission for this massive invasion of family privacy – it’s the equivalent of a “default setting” on an app you may download without realizing how much data you surrender in return. (Or just think of all the data you may have given to Facebook to share at will because you didn’t find the right button among the settings.)
The “Hello Baby” document is vague about the whole opt-out process. But it appears you get very little chance to actually opt out. You get one notice – in the form of a postcard mailed to your home a few days after the child is born. Along with a reminder of the benefits of “Hello Baby,” somewhere on that postcard will be a notification that you must specifically opt out of being run through the database – otherwise you and your child are slapped with that risk score whether you really wanted to participate or not.
The material made available by Allegheny County does not mention how much time you have to opt out before your name is run through the database. Nor does it say anything about expunging a risk score if you choose to opt out after the county has already done it.
And what, exactly, are you deemed at risk of doing?
According to the county:
The model was built to stratify families based on the likelihood that there may be future safety issues so significant that the courts require the County to remove the child from the home before the child has reached their 5th birthday.
Think about that. From the moment your child is born, you risk having that child labeled at high risk for being taken away and consigned to foster care. From the moment you say “hello, baby” you may be at greater risk of someday having to say “goodbye, baby.” In effect, “Hello Baby” creates a ticking time bomb in the form of an electronic record that might go off if, say, an angry neighbor calls a child abuse hotline, or if you’re caught smoking pot while Black.
To avoid that risk you have to be alert to the chance to opt out, and if you opt out you risk losing out on what might be genuinely useful assistance.
We’ll never, ever misuse all that data we have on you – we promise!
County officials solemnly promise not to use the data that way – they say they’ll use it only to target help, and won’t make it a part of child abuse investigations. But even the promise has a loophole:
As the county’s “Hello Baby” overview puts it:
The County pledges that this Hello Baby analytic model will only be used to provide voluntary supportive services as described here and updated over time. [Emphasis added.]
Indeed, they will issue a signed document to that effect. What could possibly go wrong?
I think Allegheny County really means it when they say they won’t pull away the football – sorry, misuse the algorithm – for now. But there is no institutional safeguard in place. There is nothing to stop the leaders of the agency that created “Hello Baby” – leaders who crave having data on every child from birth – from changing their minds whenever they damn well feel like it.
When might that be? How about the first time there’s a child abuse tragedy and word leaks out that the family had been labeled “high risk” at the time of the child’s birth? That’s when the demands will come to make this information available immediately to child protective services and to use it to immediately trigger a CPS investigation – or worse.
That’s not the only problem. The extra help families will get is likely to be provided by people who are “mandated reporters” of alleged child abuse and neglect. There are penalties for failing to report and no penalty for mistakenly calling in a false report. So mandated reporters always are under pressure to make “CYA” referrals. Now, these mandated reporters will enter the home already knowing that a “scientific” algorithm has determined the family is “high risk” for abusing and/or neglecting their child. That’s bound to color the judgment of the helpers when deciding whether or not to phone in a report alleging child abuse or neglect.
It’s still poverty profiling
In order to counter the charge of poverty profiling, the county has tweaked the algorithm – slightly. But their claims are disingenuous at best. Thus, they claim: “Unlike the Allegheny Family Screening Tool model, the Hello Baby model only relies on data where the County has the potential to have records for every family – it only uses universal (rather than means tested) data sources.”
But the key weasel word there is potential.
Because right before making this claim, the county acknowledges that they probably will use “child protective services, homeless services and justice system data.”
And, of course, they include data from any previous encounters with child protective services – and CPS intervenes to a vastly disproportionate degree in the lives of poor people. (As noted in many previous posts, CPS agencies often confuse poverty with neglect. So if you use a previous “substantiated” allegation of child neglect to raise a risk score you are not countering bias, you are simply automating it.)
And, of course, both the justice system and the child welfare system are notorious for their racial bias – raising the risk that “Hello Baby” amounts to racial profiling as well.
Another ethically-challenged ethics review
As noted earlier, even the “ethics review” for AFST commissioned by the county itself – the one co-authored by a faculty colleague of one of the designers of AFST – emphasized that one reason AFST was ethical is that it was not triggered until someone actually phoned in a call alleging child abuse and neglect. It was deemed ethical in part precisely because it did not seek to slap a risk score onto every child at birth.
How do you get around this little detail? Simple. Commission another ethics review from someone who is likely to tell you what you want to hear.
So Allegheny County turned to Deborah Daro. Like most people in child welfare, Daro really wants to help children, and she’s devoted her life to the cause. But Daro spent much of her time at the group that now calls itself Prevent Child Abuse America – and she did so at a time when PCAA was fomenting hype and hysteria about child abuse, and taking data out of context. They were particularly keen on minimizing the role of poverty in what we label abuse and neglect. I discuss this in detail in the section of this 2010 blog post called “PCAA’s record of extremism.” But don’t take my word for it – back in 2003, PCAA came startlingly close to admitting as much, declaring:
While the establishment of a certain degree of public horror relative to the issue of child abuse and neglect was probably necessary in the early years to create public awareness of the issue, the resulting conceptual model adopted by the public has almost certainly become one of the largest barriers to advancing the issue further in terms of individual behavior change, societal solutions and policy priorities.
Then Daro moved to Chapin Hall at the University of Chicago. The same 2010 blog post documents Chapin Hall’s bias, and some of Daro’s work there.
More recently, Chapin Hall has been a leader in minimizing the role of racial bias in child welfare, and in fueling foster-care panic in Illinois.
And nearly a decade ago, Daro herself wrote a paper advocating for something very much like “Hello Baby.” She called for:
Universal assessments of all new parents that carry the dual mission of assessing parental capacity to provide for a child's safety, and linking families with services commensurate with their needs.
So, in effect, Allegheny County asked Deborah Daro to offer an opinion as to whether using an algorithm for the kind of intervention she herself has been promoting for decades is ethical. Apparently, she said yes.
A second ethics review, by Michael Veale, a “Digital Center Fellow” at the Alan Turing Institute in London, is no more reassuring. In fairness, I am aware of no biases on Veale’s part concerning child welfare. But his biography reveals no knowledge of or experience in the field. So when it came to understanding how child protective services agencies really work, he was at the mercy of those who commissioned him.
An intellectually honest ethics review would require bringing together a panel of experts who have strongly divergent views on child welfare and predictive analytics and seeing if they could formulate an ethical framework for using such an algorithm in child welfare. But of course if Allegheny County tried that they would risk getting answers they don’t want to hear.
You don’t need an algorithm to target help
A crucial false premise behind efforts such as “Hello Baby” goes like this: Funds are limited, so we need this kind of algorithm to target help to the families who need it most. But no such algorithm is necessary. That’s because the families that need the most help have one thing in common: They’re poor. So all you have to do is offer the high-end “Hello Baby” services to families of infants born in hospitals that serve the county’s poorest communities. And, while you’re at it, make sure the help addresses concrete needs of poor families instead of just forcing them to run a gauntlet of counseling sessions and parent education classes.
The “Hello Baby” overview paper claims this won’t work because it’s “based on the incorrect assumption that poverty is the singular driver for abuse.” But that is setting up a straw man. No one says poverty is the singular driver for abuse. But poverty is, by far, the most important driver of what we deem to be abuse and, especially, neglect.
The “Hello Baby” document goes on to claim that other causes are “untreated mental illness, substance use disorder and intimate partner violence.” But if you’re middle class your mental illness probably won’t go untreated – because you have the money to treat it. Your substance use won’t be deemed a disorder, because middle-class parents can use substances pretty much with impunity. And an algorithm that checks criminal justice and homelessness records to determine risk isn’t likely to catch wealthy drug users, now is it?
Most important, there is now a wealth of research documenting the simple fact that what we deem to be child maltreatment can be fixed primarily by transferring just a little more wealth to poor people.
So why do we need a giant Orwellian child welfare surveillance state to “help” these families? We don’t. We only need it to target them, control them, and quite possibly, take away their children.