A Look at Child Welfare in the Digital Age

By Gia Scotti,
Smith Business Law Fellow
J.D. Candidate, Class of 2025

“If it isn’t part of the behavior, then having it in the algorithm biases it.” – Traci LaLiberte

The child welfare system is a comprehensive framework.1 It encompasses a multitude of services, such as child protection, family preservation, kinship care, foster care placement, and adoption services.2 The primary purpose of child welfare is to maintain a safe and secure environment and to protect vulnerable children from harm.3 Interwoven throughout this system is a series of decisions made during the screening and investigation of allegations of abuse or neglect.4 Its second purpose is to connect families with services that will improve conditions in their homes, creating a safer environment for at-risk children.5 Unfortunately, there are a substantial number of cases of maltreatment, and most of the system's resources are dedicated to its "back end," meaning the children who have already been removed and placed in foster care.6 This focus has led to neglect of the operational side of child welfare, particularly the development of decision-making skills for call screeners, supervisors, caseworkers, and other front-end workers.7
As artificial intelligence ("AI") has grown in popularity, it was predictable that it would find its way into the child welfare field. Studies have examined the incorporation of AI-based decision support ("ADS") tools in this area. One study centered on the Allegheny Family Screening Tool ("AFST"), which assesses the risk of child maltreatment.8 The process unfolds in stages: an external caller contacts the hotline to make a report; a call screener, tasked with recommending whether to proceed with an investigation, gathers information and runs the AFST; and the tool assigns a score from 1 (minimal risk of future placement) to 20 (substantial risk of future placement).9 Cases that receive higher scores are given greater priority, leading the social worker to proceed with further observation, investigation, or intervention in the particular case.10
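To make the triage flow concrete, the sketch below traces those screening steps in Python. It is purely illustrative: the AFST itself is a proprietary predictive-risk model, so the feature names, the scoring arithmetic, the priority cutoff of 15, and every identifier here (Referral, afst_like_score, triage) are assumptions for demonstration, not the county's actual algorithm or policy.

```python
# Hypothetical sketch of the AFST-style screening workflow described above.
# The real AFST is a proprietary predictive-risk model; the feature names,
# scoring arithmetic, and threshold below are illustrative placeholders.

from dataclasses import dataclass, field


@dataclass
class Referral:
    """Information a call screener gathers after a hotline report."""
    case_id: str
    features: dict[str, float] = field(default_factory=dict)


def afst_like_score(referral: Referral) -> int:
    """Stand-in for the model: returns a risk score from 1 (minimal
    risk of future placement) to 20 (substantial risk)."""
    raw = sum(referral.features.values())  # illustrative arithmetic only
    return max(1, min(20, round(raw)))     # clamp to the 1-20 scale


def triage(referral: Referral, priority_threshold: int = 15) -> str:
    """Higher scores receive greater priority; the cutoff of 15 is an
    assumption, not a published AFST rule."""
    score = afst_like_score(referral)
    if score >= priority_threshold:
        return f"{referral.case_id}: score {score} -> prioritize for investigation"
    return f"{referral.case_id}: score {score} -> screener discretion"


if __name__ == "__main__":
    report = Referral("R-001", {"prior_referrals": 9.0, "system_contacts": 7.5})
    print(triage(report))  # R-001: score 16 -> prioritize for investigation
```

Even in this toy version, the design choice the researchers scrutinize is visible: the numeric score, not the screener's contextual judgment, is what drives the priority recommendation.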
Despite the integration of AI into child welfare services, the study's findings indicated that it had minimal influence in this area. To judge whether a child is at heightened risk of neglect, workers emphasized that they must rely on their own professional experience, cultural background, knowledge of familial history, and the potential motives of the caller, factors the AI does not consider.11 Additionally, most workers had no prior knowledge of or training on the AFST, including the data it relies on or how to work with the tool effectively.12 The study also highlighted important design implications for agencies using or implementing ADS tools: (1) leveraging workers' experience to improve an ADS tool's performance; (2) designing training tools that help workers understand the boundaries of an ADS tool's capabilities; (3) supporting open, critical discussion around the tools; (4) providing workers with balanced and contextualized feedback on their decisions; (5) co-designing measures of decision quality with workers; (6) communicating how decision-making power should be distributed between workers and the ADS tool; and (7) supporting diverse stakeholder involvement in shaping ADS tool design.13
Recently, the Allegheny Family Screening Tool has faced serious scrutiny over how it aids social workers in deciding which families to investigate. Complaints have been filed alleging that the algorithm may be biased against people with disabilities and mental health issues.14 For instance, Robin Frank, a family law attorney and critic of the AFST, filed a complaint on behalf of a client with an intellectual disability who was fighting to regain custody of her daughter from foster care.15 More broadly, critics caution against entrusting AI with crucial decision-making, because doing so can result in discrimination against families based on race, income, disability, or other external characteristics.16 For example, child welfare officials in Oregon stopped using their own algorithm, which was influenced by the AFST, to help decide which families social workers should investigate, after the data flagged a disproportionate number of Black children for mandatory neglect investigations.17 The stakes could not be higher: failing to address an allegation can leave a child to suffer lasting harm, while wrongful interference in a family's life can cause a parent to lose a child forever.
Overall, AI has influenced everyday life by working its way into a multitude of areas, so it is imperative that such a powerful tool be handled in a manner that produces positive outcomes while mitigating the downsides.18 Child welfare is a comprehensive framework with a mission to keep children physically, emotionally, and mentally secure. As technology advances, there must be a foundation of trust between this powerful tool and those who may, or must, rely on it. Hopefully, in the coming years, social workers will be able to depend on AI with confidence, ensuring that the mission of child welfare is fulfilled without any child being overlooked or neglected. Although technological advancements can yield positive impacts, it is imperative that in this area AI never leads social workers to remove a child from a nurturing home. These innovations should serve as tools; they must not replace the judgment and empathy of social workers. Justice and equality are the fundamental objectives of any endeavor, and the prudent integration of artificial intelligence is essential in child welfare, particularly if social workers are going to rely on it when making decisions about a child's well-being.

References:

1 Kawakami et al., Improving Human-AI Partnerships in Child Welfare: Understanding Worker Practices, Challenges, and Desires for Algorithmic Decision Support, CHI Conference on Human Factors in Computing Systems (2022).
2 Id.
3 Id.
4 Id.
5 Id.
6 Id.
7 Id.
8 Id.
9 Id. at 4.
10 Id. at 4.
11 Id. at 6.
12 Id. at 13.
13 Use of Artificial Intelligence-Based Decision Tools in Child Welfare, Children’s Bureau Express (November 2022), https://cbexpress.acf.hhs.gov/article/2022/november/use-of-artificial-intelligence-based-decision-tools-in-child-welfare/1b543cec1b4291909e7ba97ae54bcb7c.
14 Sally Ho & Garance Burke, Child Welfare Algorithm Faces Justice Department Scrutiny, AP News (January 31, 2023, 10:00 PM), https://apnews.com/article/justice-scrutinizes-pittsburgh-child-welfare-ai-tool-4f61f45bfc3245fd2556e886c2da988b.
15 Id.
16 Id.
17 The Associated Press, Oregon is Dropping an Artificial Intelligence Tool Used in Child Welfare System, NPR (June 2, 2022, 1:12 PM), https://www.npr.org/2022/06/02/1102661376/oregon-drops-artificial-intelligence-child-abuse-cases.
18 Kawakami et al., "Why Do I Care What's Similar?" Probing Challenges in AI-Assisted Child Welfare Decision-Making through Worker-AI Interface Design Ideation, Designing Interactive Systems Conference (2022).
