Thursday, October 31, 2019

NCCPR news and commentary round-up, week ending October 30, 2019


● Back in 2004, a huge, comprehensive study of that most sacred cow in child welfare, Court-Appointed Special Advocates (CASA), found that CASAs' only accomplishments were to prolong foster care and make it less likely that foster children would be placed with relatives instead of strangers.  Now there's an even bigger, more comprehensive study – and the results are even worse.

Compared to children who were not assigned CASAs, children with CASAs were less likely to wind up in a permanent home and more likely to endure the worst outcome of all: “aging out” of foster care with no home at all.  I have a blog post about it, with a link to the full study.  I’ve also reprinted our annual Halloween reminder to CASA: It’s not a good idea to have a blackface act at your fundraiser.

● And now for something completely different: One of the best recent takes on the child welfare system came last week from Full Frontal with Samantha Bee.  Full Frontal examined the racial and class biases that permeate the system and do enormous harm to children needlessly taken from their homes.

The program didn’t get everything right – highlighting a so-called foster parent “shortage” that is, in fact, artificial, created by the very problems Full Frontal examined.  But the segment still is notably better than what we get from many “serious” news organizations.  That’s probably because the writing staff for Full Frontal is more diverse than the staff of most “serious” news organizations.  Have a look:


● Last week, I wrote a blog post about a story in The Correspondent which noted that IBM's supercomputer, Watson, was doing a poor job of telling doctors how to treat illnesses.  In fact, some of Watson's recommendations were dangerous.  It was one more example of how the hype about using big data to solve our problems doesn't match the reality.  But at least Watson wasn't guilty of racial bias.  The Washington Post reports that is more than can be said for another widely-used health care predictive analytics algorithm, which significantly underestimated the health needs of Black patients.  So if predictive analytics is racially biased when used in law enforcement – which it is – and racially biased when used in medicine – which it is – what does that say about using predictive analytics in child welfare?

● One should never take current or former child welfare agency leaders at their word when they brag about their success.  But when it comes to the claims in this column by the former Commissioner of the Connecticut Department of Children and Families, Joette Katz, the numbers back up the words.  Connecticut did, indeed, dramatically reduce institutionalization and increase the use of kinship foster care during Katz’s tenure.  This column also explains the right way to respond to cases involving substance abuse.

● Nebraska long has removed children at one of the highest rates in the country.  Officially, those numbers are finally going down.  But is Nebraska reducing foster care, or just hiding it?