Child Welfare Algorithm May Unfairly Target Disabled Parents, Complaints to DOJ Allege

The Justice Department has reportedly been examining an algorithm used by one Pennsylvania county’s child welfare agency to help determine which allegations of child neglect deserve a formal investigation, following a series of complaints that the algorithm is unfairly targeting parents with disabilities. While the county claims that the algorithm is intended to reduce human error in child welfare investigations, critics argue that the tool places disabled parents—who are already disproportionately investigated by child welfare agencies—at risk of unnecessary government intervention.

According to the Associated Press, in 2016, Allegheny County—where Pittsburgh is located—began using “The Allegheny Family Screening Tool,” an algorithm designed to help social workers better identify which families needed to be investigated for child neglect—a broad term encompassing everything from leaving children unsupervised, to not having enough food, to frequent school absences.

The tool compiles data from “Medicaid, substance abuse, mental health, jail and probation records, among other government data sets,” and generates a “Family Screening Score.” According to the county’s website, a high score indicates a high likelihood that the child will be seized by state authorities in the future. “When the score is at the highest levels, meeting the threshold for ‘mandatory screen in,’ the allegations in a call must be investigated,” the county’s website reads.

According to the A.P., the Justice Department has been receiving complaints about the algorithm since at least last fall. The complaints primarily focus on the algorithm’s inclusion of disability-related data in its Family Screening Score, a practice that could be unfairly punishing to disabled parents—and possibly violate the Americans with Disabilities Act.

The county seems to support claims that its algorithm singles out disabled parents, telling the A.P. that when data related to disabilities is included, it “is predictive of the outcomes,” adding that “it should come as no surprise that parents with disabilities … may also have a need for additional supports and services.”

The full extent of the Justice Department’s involvement is unknown. However, two anonymous sources told the A.P. that attorneys from the Justice Department’s Civil Rights Division “[urged] them to submit formal complaints detailing their concerns about how the algorithm could harden bias against people with disabilities, including families with mental health issues.”

Allegheny County claims its algorithm is simply a tool used to make it easier to screen families for possible child welfare investigations, insisting that the tool was responsibly designed. “The design and implementation of the AFST was a multi-year process that included careful procurement, community meetings, a validation study, and independent and rigorous process and impact evaluations,” the county’s website reads. “In addition, the resultant model was subjected to an ethical review prior to implementation.”

But critics argue that these kinds of algorithms frequently end up unfairly targeting families due to their race, income, or disabilities. “When you have technology designed by humans, the bias is going to show up in the algorithms,” Nico’Lee Biddle, a former Allegheny County child welfare worker, told the A.P. in an earlier investigation into the Family Screening Tool last year. “If they designed a perfect tool, it really doesn’t matter, because it’s designed from very imperfect data systems.” In June of last year, a similar algorithm in use in Oregon was discontinued over concerns that it was racially biased.

Parents with disabilities are already at heightened risk of losing their children to state custody. While Allegheny County’s algorithm may be intended to help social workers make better decisions, it could end up further ingraining biases against disabled parents.

“I think it’s important for people to be aware of what their rights are,” Robin Frank, a family law attorney representing an intellectually disabled man whose daughter was seized into state custody, told the A.P. “And to the extent that we don’t have a lot of information when there seemingly are valid questions about the algorithm, it’s important to have some oversight.”
