The artificial intelligence tool used to screen child abuse calls allegedly targets parents with disabilities

Since 2016, social workers in Allegheny County, Pennsylvania, have relied on an algorithm to help them decide which child welfare calls warrant further investigation. Now the Department of Justice is reportedly scrutinizing the county’s controversial Family Screening Tool over concerns that its use could violate the Americans with Disabilities Act by allegedly discriminating against families with disabilities, including families with mental health issues, The Associated Press reported.

Three anonymous sources breached their confidentiality agreements with the Justice Department, confirming to the AP that civil rights attorneys have been fielding complaints since last fall and have grown increasingly concerned about alleged biases built into the Allegheny County Family Screening Tool. While the full extent of the Justice Department’s alleged scrutiny is currently unknown, the Civil Rights Division appears interested in learning how use of the data-driven tool could end up reinforcing historical, systemic biases against people with disabilities.

The county describes its predictive risk modeling tool as a preferred resource for minimizing human error, with social workers benefiting from the algorithm’s rapid analysis of “hundreds of data items for each person involved in a child abuse allegation.” That includes “data points associated with disabilities in children, parents, and other members of local households,” Allegheny County told the AP. These data points feed into an overall risk score that helps determine whether a child should be removed from their home.

Although the county told the AP that social workers can override the tool’s recommendations and that the algorithm has been updated “multiple times” to remove disability-related data points, critics worry that the screening tool may still be automating discrimination. That is especially concerning because the Pennsylvania algorithm has inspired similar tools now used in California and Colorado, the AP reports, and Oregon stopped using its own version of the screening tool over similar concerns that its algorithm could exacerbate racial bias in its child welfare data.

The Justice Department has not yet commented on its alleged interest in the tool, but the AP reported that the department’s scrutiny could turn an ethical argument against the use of child welfare algorithms into a legal one.

Traci LaLiberte, a University of Minnesota expert on child welfare and disabilities, told the AP that it is unusual for the Department of Justice to get involved in child welfare matters; an issue really has to rise to the level of significant interest for the department to dedicate time and get involved, she said.

Ars could not immediately reach the algorithm’s developers or the Allegheny County Department of Human Services for comment, but a county spokesperson, Marc Bertolet, told The Associated Press that the agency was not aware of the Justice Department’s interest in its screening tool.

Problems predicting child maltreatment

Allegheny County said on its website that the Family Screening Tool was developed in 2016 to enhance child welfare call-screening decision-making, with the sole goal of improving child safety. That year, the county reported that before the algorithm was adopted, human error led Child Protective Services to investigate 48 percent of the least serious cases while ignoring 27 percent of the most serious ones. A 2016 external ethics analysis supported the county’s use of the algorithm, describing it as imperfect but a relatively more accurate and transparent method of risk assessment than relying on clinical judgment alone.

“We concluded that by using technology to collect and evaluate all relevant information available, we can improve the basis for these critical decisions and reduce variance in employee decision-making,” the county said on its website, promising to continue improving the model as further analysis of the tool is performed.

Although the county told the AP that risk scores alone do not trigger investigations, the county’s website still says that when a score is at the highest levels and meets the “mandatory screen” threshold, the allegations in the call must be investigated. Because data points tied to disability contribute to that score, critics suggest that families with disabilities are more likely to be targeted for investigation.
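
To make the mechanism critics are describing concrete, here is a minimal, purely illustrative sketch in Python of a weighted risk score feeding a “mandatory screen” threshold. The county has not published the actual model, so every field name, weight, and cutoff below is a hypothetical assumption, not the county’s implementation.

```python
# Hypothetical illustration only: the real Family Screening Tool's features,
# weights, and "mandatory screen" cutoff are not public.

# Invented data points for the people named in a call.
household_data_points = {
    "prior_referrals": 3,
    "public_benefits_history": 1,
    "behavioral_health_record": 1,  # the kind of disability-linked field critics cite
}

# Invented weights standing in for whatever a predictive model might learn.
weights = {
    "prior_referrals": 2.0,
    "public_benefits_history": 1.5,
    "behavioral_health_record": 2.5,
}

MANDATORY_SCREEN_THRESHOLD = 9.0  # assumed cutoff, not the county's


def risk_score(data_points: dict) -> float:
    """Combine weighted data points into a single overall risk score."""
    return sum(weights[name] * value for name, value in data_points.items())


score = risk_score(household_data_points)
if score >= MANDATORY_SCREEN_THRESHOLD:
    print(f"score={score:.1f}: meets the 'mandatory screen' threshold; call is investigated")
else:
    print(f"score={score:.1f}: below threshold; screener discretion applies")
```

The sketch only shows why the debate centers on which data points are included: any disability-linked field that carries weight pushes a family’s score closer to the mandatory threshold.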

The same year that the Family Screening Tool was introduced, the Christopher and Dana Reeve Foundation and the National Council on Disability released a toolkit to help parents with disabilities know their rights when fighting child welfare concerns in court.

“For many of the 4.1 million parents with disabilities in the United States, courts have determined that they are not good parents simply because they have disabilities,” the organizations wrote in the introduction to the toolkit. In fact, as of 2016, 35 states still allowed a disability to be grounds for losing parental rights, even if the parent had not harmed or neglected their child.

Allegheny County told the AP that “it should come as no surprise that parents with disabilities … may also need additional supports and services.” But neither the county’s ethics analysis nor its published guidance directly discusses how the tool could harm these families.

Ars couldn’t reach LaLiberte for additional comment, but she told The Associated Press that her research has also shown that parents with disabilities are already disproportionately targeted by the child welfare system. She suggested that incorporating disability-related data points into the algorithm seems inappropriate because it directs social workers to look at “characteristics that people cannot change” rather than exclusively assessing problematic behavior.
