● Apparently, a 95
percent false positive rate was considered a tad high
● Head of county’s
Office of Child Protection urges slow, cautious approach to any use of
predictive analytics
In Los Angeles County, they called it Project AURA (Approach
to Understanding Risk Assessment).
It was among the most highly touted experiments in the
burgeoning fad for using predictive
analytics in child welfare – that dystopian sci-fi nightmare-come-true in
which computer algorithms predict who will abuse a child (but, we are assured,
child protective services agencies would never ever actually use that information to tear apart families).
Project AURA was the subject of gushy
news stories, and even
gushier stories promoting the gushy news stories. It was an experiment particularly beloved by
those who are most extreme in their desire to see more children taken from
their parents.
And now, thankfully, it is reportedly dead.
Buried on page 10 of a
report to the Los Angeles County Board of Supervisors by Michael Nash,
executive director of the county’s Office of Child Protection, is word that the
county Department of Children and Family Services (DCFS) “is no longer pursuing
Project AURA.”
It was developed by the software firm SAS. Exactly what's in it
is a secret; no one outside SAS knows how the algorithm works.
AURA was never used on any actual cases. Rather, it was
tested on past reports alleging child abuse or neglect. Then SAS looked to see
what actually happened to those families.
As Nash’s report revealing the death of Project AURA
explains:
While the tool correctly detected a high number of children (171 cases) at the highest risk for abuse, it also incorrectly identified an extremely high number (3,829 cases) of false positives (i.e., children who received high risk scores who were not at risk for a negative outcome). [Emphasis added.]
In other words, AURA falsely flagged a staggering number of
innocent families. Had AURA actually been in use, thousands of children
would have been placed at risk of needlessly being torn from their homes and
consigned to the chaos of foster care.
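To see where the "95 percent" in the bullet at the top comes from, here is the
back-of-the-envelope arithmetic implied by the figures Nash reports – a minimal
sketch, assuming the rate refers to the share of high-risk flags that turned
out to be false:

    # Figures quoted in Nash's report (see above).
    true_positives = 171     # children correctly flagged at highest risk
    false_positives = 3829   # children flagged high-risk who were not at risk

    # Share of all high-risk flags that were wrong.
    total_flagged = true_positives + false_positives   # 4,000 flags in all
    false_positive_share = false_positives / total_flagged
    print(f"{false_positive_share:.1%}")   # prints 95.7%

Put another way: of every 20 families the tool flagged as highest risk,
roughly 19 were flagged in error.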
What finally killed AURA?
The results of the AURA experiment – including the false
positive rate – have been known for
nearly two years. But that didn’t stop the county from pushing ahead – and it
didn’t stop the gushy news coverage. It’s not clear what finally prompted DCFS
to pull the plug.
Perhaps it’s because, as Nash points out, all those false
positives would further overload the system. More likely, the impetus was an
initiative by the State of California to come up with a "better" predictive
analytics model.
Unlike with AURA, the developers of the new model are promising a
completely open process, including consultation with various "stakeholders" and
transparency about exactly which risk factors are used and how they are weighted
– allowing anyone to "interrogate the algorithm."
Also encouraging: Nash's report, commissioned by the Supervisors
themselves, is filled with warnings about the need to proceed "cautiously and
responsibly." He says a set of strict standards "to address the important
operational, legal and ethical considerations…" should be adopted "before
considering the use of predictive-analytics models."
Those standards should include “understanding how racism and other
biases may be embedded in systemic data and addressing these within the model.”
Nash even noted that the independent journalism nonprofit ProPublica found
exactly that bias in predictive analytics tools already in use in criminal
justice.
All this means that, if nothing else, the nightmare of "Minority
Report"-style policing in Los Angeles child welfare is at least another year
or two away.
The bad news is that Nash’s report accepts the naïve view
that once a good algorithm is created
it can be properly controlled and limited.
He writes:
Determining [predictive analytics’] “right” use – to identify families most in need of supports, rather than to trigger any negative consequences for them – will be fundamental.
But Nash, himself a former juvenile court judge, must know that’s
not how child welfare works in the real world.
Whatever controls are in place at the outset will
disappear the moment a child “known to the system” dies and the caseworker
handling the case says “DCFS had all this information about the family, and
they knew it was ‘high risk’ but they didn’t tell me.”
Then, all bets – and all restrictions – are off, and it will
be take-the-child-and-run in every family where the computer spits out a high "risk score."
One more bit of bad news: One of the strongest boosters of predictive
analytics in Los Angeles, former DCFS director Philip Browning, has been hired
as a consultant to “help” New York City’s child welfare agency.
SDM is let off the hook
The other bad news concerns the other model of risk and
safety assessment that the Supervisors asked Nash to study – the one currently
used in Los Angeles, Structured Decision-Making (SDM).
Like predictive analytics, SDM has been found to raise issues
of racial
and class bias. Nash acknowledges those issues in passing:
Users of the tool, in particular, fault it for not incorporating into its assessments the entire story of what is happening within a family, but instead focusing on a few broad strokes without giving weight to important nuances. Users additionally state that the tool is too narrowly focused on the caregiver and does not take into account the strengths of the family as a whole.
But immediately he adds this parenthetical aside:
(The latest version of SDM has been revised to try to be more strength-based in its approach.)
In my own experience, however, some version of "Yes, but the new version is
different" is what developers of SDM have said for more than a decade, every
time similar concerns have been raised. That can only leave one
wondering about all the “risk assessments” and “safety assessments” performed
with old, unimproved versions of SDM.
The defeat of AURA shows that, contrary to what some
predictive analytics proponents say in their worst moments of hubris, it is not
inevitable that every legislative body and child welfare agency will embrace this latest fad.
At a minimum, opponents in Los Angeles have more time to
organize. And using predictive analytics in child welfare no longer has an AURA
of inevitability.