
Jilly_in_VA

(9,962 posts)
Fri Apr 29, 2022, 10:37 AM Apr 2022

An algorithm that screens for child neglect raises concerns

Inside a cavernous stone fortress in downtown Pittsburgh, attorney Robin Frank defends parents at one of their lowest points – when they are at risk of losing their children.

The job is never easy, but in the past she knew what she was up against when squaring off against child protective services in family court. Now, she worries she’s fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families will have to endure the rigors of the child welfare system, and which will not.

“A lot of people don’t know that it’s even being used,” Frank said. “Families should have the right to have all of the information in their file.”

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

https://apnews.com/article/child-welfare-algorithm-investigation-9497ee937e0053ad4144a86c68241ef1
____________________________________________________________________________________
PLEASE read the linked article. The whole thing. To paraphrase "Shane", an algorithm is a tool; it's as good or as bad as the person using it. It also should not be kept secret, nor should the "results".


no_hypocrisy

(46,067 posts)
1. I do child protection defense work.
Fri Apr 29, 2022, 10:43 AM
Apr 2022

I had a client almost lose her children solely for being poor. Children were dazzled by the affluence of their foster parents and wanted their mother to let them be adopted. It was a CF but we got the kids back.

Phoenix61

(17,000 posts)
2. A wonderful way to make underfunding CPS seem logical.
Fri Apr 29, 2022, 11:44 AM
Apr 2022

I’m reminded of the saying, “If you can’t dazzle them with brilliance, baffle them with bullshit.” I’ve worked as a CPS investigator, and there’s a huge difference between a family that is struggling to feed their children and one that chooses not to. There’s no way to “algorithm” your way to knowing which is which. They say money can’t solve everything, but it sure can solve some things.
