Algorithm for identifying child neglect cases raises concerns

Cindrella Kashyap
May 1, 2022, updated 8:11 AM
An algorithm that screens for child neglect (illustration by Peter Hamlin, Associated Press)

Family law attorney Robin Frank has always found it difficult to defend parents at one of their lowest points: when they are at risk of losing their children. In the past, however, she knew what she was up against in family court. Times are changing, and now she is fighting something she cannot see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated for child neglect.
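The county's actual screening model is proprietary and its inner workings are not described in this article, which is precisely Frank's complaint. As a purely hypothetical sketch, the Python below shows how such a tool might combine case features into a risk score and flag high-scoring referrals for mandatory investigation. Every feature name, weight, and threshold here is invented for illustration and does not come from the Allegheny County tool.

```python
# Purely illustrative sketch: the real screening model is not public, and none
# of these features, weights, or thresholds are taken from it.

from dataclasses import dataclass


@dataclass
class Referral:
    """Hypothetical features a screening model might draw from county records."""
    prior_referrals: int          # earlier calls about the same family
    prior_placements: int         # children previously placed in foster care
    parent_age: int
    receives_public_benefits: bool


# Hypothetical weights; a real tool would learn these from historical data.
WEIGHTS = {
    "prior_referrals": 1.5,
    "prior_placements": 2.0,
    "young_parent": 1.0,
    "public_benefits": 0.5,
}

MANDATORY_THRESHOLD = 18.0  # hypothetical cutoff on a 1-20 style scale


def risk_score(r: Referral) -> float:
    """Combine features into one score (a stand-in for a statistical model)."""
    score = 0.0
    score += WEIGHTS["prior_referrals"] * r.prior_referrals
    score += WEIGHTS["prior_placements"] * r.prior_placements
    if r.parent_age < 21:
        score += WEIGHTS["young_parent"]
    if r.receives_public_benefits:
        score += WEIGHTS["public_benefits"]
    return min(score, 20.0)  # clamp to the display range


def screen(r: Referral) -> str:
    """Return a screening recommendation; a human worker makes the final call."""
    s = risk_score(r)
    if s >= MANDATORY_THRESHOLD:
        return f"score {s:.1f}: flagged for mandatory investigation"
    return f"score {s:.1f}: left to worker discretion"


if __name__ == "__main__":
    print(screen(Referral(prior_referrals=6, prior_placements=4,
                          parent_age=19, receives_public_benefits=True)))
    print(screen(Referral(prior_referrals=1, prior_placements=0,
                          parent_age=34, receives_public_benefits=False)))
```

The sketch also hints at why such systems draw criticism: the choice of input features and their weights, invisible to the families being scored, determines who gets flagged.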

Frank said many people do not even know the tool is being used, although they should have a right to that information. Child welfare agencies from Los Angeles to Colorado and throughout Oregon use, or have considered using, tools similar to the one in Allegheny County, Pennsylvania. An Associated Press review, however, has raised a number of concerns about the technology and its reliability in the child welfare system.

Research by a Carnegie Mellon University team shows that a disproportionate number of Black children, compared with white children, were flagged for "mandatory" neglect investigations. Independent researchers have also found that social workers disagreed with the risk scores produced by the algorithm about one third of the time.

An algorithm that screens for child neglect (source: medicalxpress.com)

The data produced by this tool is used to help protect children from neglect, covering conditions ranging from inadequate housing to poor hygiene. Cases involving physical or sexual abuse, however, are not subject to the algorithm.

Advocates worry that if similar tools are deployed in other child welfare systems with negligible human intervention, they could reinforce existing racial disparities in the field.

When social workers in the Mojave Desert city of Lancaster began using the tool, nearly half of all investigations flagged for extra scrutiny involved Black children. The county has not explained the disparity, but it is expected to announce later this year whether the tool will be expanded.

Meanwhile, Frank is still trying to work out how the algorithm affects each of her clients and how its scores have influenced outcomes for families.
