Tuesday, May 9, 2017

Los Angeles County quietly drops its first child welfare predictive analytics experiment

● Apparently, a 95 percent false positive rate was considered a tad high

● Head of county’s Office of Child Protection urges slow, cautious approach to any use of predictive analytics


In Los Angeles County, they called it Project AURA (Approach to Understanding Risk Assessment).

It was among the most highly touted experiments in the burgeoning fad for using predictive analytics in child welfare – that dystopian sci-fi nightmare-come-true in which computer algorithms predict who will abuse a child (but, we are assured, child protective services agencies would never ever actually use that information to tear apart families).

Project AURA was the subject of gushy news stories, and even gushier stories promoting the gushy news stories.  It was an experiment particularly beloved by those who are most extreme in their desire to see more children taken from their parents.

And now, thankfully, it is reportedly dead.

Buried on page 10 of a report to the Los Angeles County Board of Supervisors by Michael Nash, executive director of the county’s Office of Child Protection, is word that the county Department of Children and Family Services (DCFS) “is no longer pursuing Project AURA.”

AURA was developed by the software firm SAS.  Exactly what’s in it is a secret: no one outside SAS knows exactly how the algorithm works.

AURA was never used on any actual cases. Rather it was tested on past reports alleging child abuse or neglect. Then SAS looked to see what actually happened to those families.

As Nash’s report revealing the death of Project AURA explains:

While the tool correctly detected a high number of children (171 cases) at the highest risk for abuse, it also incorrectly identified an extremely high number (3,829 cases) of false positives (i.e., children who received high risk scores who were not at risk for a negative outcome). [Emphasis added.]

In other words, AURA identified a staggering number of innocent families. Had AURA actually been in use, an astounding number of children would have been placed at risk of needlessly being torn from their homes and consigned to the chaos of foster care.

What finally killed AURA?


The results of the AURA experiment – including the false positive rate – have been known for nearly two years. But that didn’t stop the county from pushing ahead – and it didn’t stop the gushy news coverage. It’s not clear what finally prompted DCFS to pull the plug.

Perhaps it’s because, as Nash points out, all those false positives would further overload the system. More likely, it was an initiative by the State of California to try to come up with a “better” predictive analytics model.

Unlike AURA’s developers, those building the new model are promising a completely open process, including consultation with various “stakeholders” and transparency about exactly what risk factors are used and how they are weighed – allowing anyone to “interrogate the algorithm.”

Also encouraging, Nash’s report, commissioned by the Supervisors themselves, is filled with warnings about the need to proceed “cautiously and responsibly.” He says a set of strict standards “to address the important operational legal and ethical considerations…” should be adopted “before considering the use of predictive-analytics models.”  Those standards should include “understanding how racism and other biases may be embedded in systemic data and addressing these within the model.”

Nash even noted that the independent journalism nonprofit ProPublica found exactly that bias in predictive analytics tools already in use in criminal justice.

All this means that, if nothing else, the nightmare of “Minority Report”-style policing in Los Angeles child welfare is at least another year or two away.

The bad news is that Nash’s report accepts the naïve view that once a good algorithm is created it can be properly controlled and limited.

He writes:

Determining [predictive analytics’] “right” use – to identify families most in need of supports, rather than to trigger any negative consequences for them – will be fundamental.

But Nash, himself a former juvenile court judge, must know that’s not how child welfare works in the real world.

Whatever controls are in place at the outset will disappear the moment a child “known to the system” dies and the caseworker handling the case says “DCFS had all this information about the family, and they knew it was ‘high risk’ but they didn’t tell me.” 

Then, all bets - and all restrictions - are off, and it will be take-the-child-and-run in every family where the computer spits out a high "risk score."

One more bit of bad news: One of the strongest boosters of predictive analytics in Los Angeles, former DCFS director Philip Browning, has been hired as a consultant to “help” New York City’s child welfare agency.


SDM is let off the hook


The other bad news concerns the other model of risk and safety assessment that the Supervisors asked Nash to study – the one currently used in Los Angeles - Structured Decision-Making.

Like predictive analytics, SDM also has been found to raise issues of racial and class bias. Nash acknowledges those issues in passing:

Users of the tool, in particular, fault it for not incorporating into its assessments the entire story of what is happening within a family, but instead focusing on a few broad strokes without giving weight to important nuances. Users additionally state that the tool is too narrowly focused on the caregiver and does not take into account the strengths of the family as a whole.

But immediately he adds this parenthetical aside:

(The latest version of SDM has been revised to try to be more strength-based in its approach.)

But in my own experience, some version of  “Yes, but the new version is different” is what developers of SDM have said for more than a decade, each time similar concerns are raised.  That can only leave one wondering about all the “risk assessments” and “safety assessments” performed with old, unimproved versions of SDM.

The defeat of AURA shows that, contrary to what some predictive analytics proponents say in their worst moments of hubris, it is not inevitable that every legislative body and child welfare agency will embrace this latest fad in child welfare.

At a minimum, opponents in Los Angeles have more time to organize. And using predictive analytics in child welfare no longer has an AURA of inevitability.

Sunday, May 7, 2017

New Column: You can’t fix child welfare spending with distorted data and doublethink

Listen closely. That giant sucking sound you hear is the foster care-industrial complex grasping for every dollar it can swipe from every possible “funding stream.”

George Orwell gave us the concept of doublethink.
Foster care advocates perfected it.

In 1984, George Orwell defined “doublethink” as holding two contradictory beliefs in one’s mind simultaneously, and accepting them both.

In child welfare, for example, we have been told for decades that child welfare systems don’t take away children because their families are poor. ... But now we also are told, in a column by two advocates of taking away more children, that every single federal program designed to ease poverty – including housing assistance, food stamps, even the Supplemental Security Income program for the aged, blind and disabled – is a foster care prevention program, and every dime from every one of them should be counted as child welfare spending.

In other words, great gobs of money are going to prevent something – removal of children from their parents because they are poor – that child welfare agencies say they don’t do anyway.

Orwell would have recognized the technique. 

Thursday, May 4, 2017

Philadelphia RTC is the latest in a long line of rotten barrels

It wasn’t the repeated rapes that finally forced the state of Pennsylvania to shut down the  Wordsworth “residential treatment center” in Philadelphia.  It wasn’t the assaults by staff against children and children against each other.  It wasn’t the fact that over ten years, police were summoned to the place more than 800 times.

It wasn’t even the enormous cost to taxpayers - $119,000 per year per child for all this tender loving care – that prompted the state finally to act.

No. As the Philadelphia Inquirer and Philadelphia Daily News report, a 17-year-old, David Hess, had to die first, during a struggle with staff. Authorities ruled the death a homicide.

Through all of this, year after year after year, the Philadelphia Department of Human Services kept warehousing children at Wordsworth – children as young as age 10.  Some were delinquent, others were said to have been abused or neglected.

It’s not as if nobody knew what was going on.  As the newspapers report:

“Interviews, court records, state inspection reports, and police records reveal a trail of injuries to children, from broken bones to assaults to the suffocation death of Hess. Along the way, lawyers, licensing inspectors, and others found conditions there appalling and sounded the alarm with little success.”

Why wouldn’t the City or the State do more? They didn’t dare.  In Philadelphia, substitute “care,” in all its forms, is a sellers’ market. As Joan Erney, director of Community Behavioral Health, the agency that oversees publicly funded mental-health services for Philadelphia, told the newspapers:

“Our approach to agencies generally is that we need them, and if there are opportunities to improve, we work with them. … We did rely on Wordsworth extensively. Places outside of Philadelphia don’t want to take our kids. They tell us our kids are too complicated. They tell us our kids are too hard. We have kids with some really difficult problems.”

In other words, they were begging for beds, and beggars can’t be choosers.

But that tells only part of the story. The real reason Philadelphia turned a blind eye to the horrors at Wordsworth is because of Philadelphia’s long, ugly history of embracing worst practice in child welfare.

● Among America’s ten largest cities and their surrounding counties, Philadelphia tears apart families at the second highest rate when rates of child poverty are factored in. (When you don’t factor in poverty, Philadelphia is #1.) The rate of removal in Philadelphia is 60 percent above the big-city average, more than triple the rate in New York City and more than quadruple the rate in Chicago.

Were Philadelphia taking children at the rate of New York or Chicago, it would have plenty of room in good therapeutic foster homes for children who really needed them – and no need to warehouse children in places like Wordsworth.

● Philadelphia needs something else, too: The guts and imagination to embrace safe, proven alternatives to residential treatment.  One of the striking revelations in the Inquirer / Daily News story is the fact that the RTC at Wordsworth wasn’t some hundred-year-old orphanage that rebranded itself to stay in business and then deteriorated. This facility was brand new in 2006 – and apparently it was abusive almost from day one.

In other words, at a time when most of the rest of the country was trying to shut down institutions, city officials in Philadelphia and their state counterparts in Harrisburg thought it would be a great idea to send children to a brand new one.

● And no, the almost universal cry of those who institutionalize children and their apologists – the claim that the children are just too difficult to handle in families – is not true.  There is nothing a “residential treatment center” can do that can’t be done better (and at lower cost) through Wraparound programs.

As the name implies, such programs do whatever it takes – bringing the help a child needs into her or his own home or a foster home.  In this video, Wraparound pioneer Karl Dennis describes how it worked on the kind of case that usually lands a child in a place like Wordsworth.

Not only does Philadelphia overuse institutionalization; it institutionalizes children for whom the harm is greatest: younger children.  This is such horrific practice that in his original version of the proposed Family First Act, Sen. Orrin Hatch (R-Utah) proposed to simply eliminate all federal aid for any placement in any institution for any child under age 13.

That never passed, of course.  So all American taxpayers continue to subsidize places like Wordsworth.

● Worst of all, there’s no guarantee that the children are any better off now that Wordsworth is closed. The children were simply shipped to other institutions, often out of state – so it will be even harder to keep track of what happens to them.

Even when institutions don’t become hellholes, rife with physical and sexual abuse, a mountain of research shows that they are inherently bad for children, and there are better alternatives.  And there is nothing unusual about the kind of abuse that was rife at Wordsworth.   The Wordsworth story is repeated in America over and over, year after year. When the topic is institutionalization, we’re not talking rotten apples. We’re talking rotten barrels.



Monday, May 1, 2017

New columns on race and class bias in child welfare from the 19th Century to today

NCCPR has two new columns on racial bias in child welfare.  One deals with how the same newspaper can expose racial bias in policing while remaining blind to it in child welfare.
Read the column here.

Another deals with the desperate lengths to which some will go to deny there's a problem.
Read the column here.

This column deals with another example of how the biases in child welfare are magnified by the latest fad in the field, "predictive analytics."
Read the column here.

None of this is new. In fact, American child welfare has its very roots not in benevolence but in bigotry. That's the topic of this column for The Daily Progress in Charlottesville, Va. It sets the record straight about Charles Loring Brace and his "orphan trains."
Read the column here.