Predictive analytics identify families in need of child services

In 2018, British local authorities implemented predictive analytics software designed to identify families in need of children’s social services. The systems were intended to improve the allocation of public resources and to prevent child abuse (McIntyre and Pegg 2018). To build the predictive systems, data on 377,000 people were incorporated into a database managed by several private companies. The councils of Hackney and Thurrock both hired a private company, Xantura, to develop a predictive model for their children’s services teams; two other councils, Newham and Bristol, developed their own systems in-house.

Advocates of the predictive software argue that it enables councils to target limited resources more effectively, so that they can act before tragedies happen. Richard Selwyn, a civil servant at the Ministry of Housing, Communities and Local Government, argues that “It’s not beyond the realm of possibility that one day we’ll know exactly what services or interventions work, who needs our help and how to support them earlier” (ibid.). The predictive models draw on a range of data, including school attendance, housing association repairs, police records, and indicators of antisocial behavior and domestic violence (ibid.).
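To make concrete how such a model might operate, the sketch below trains a toy risk classifier over features like those listed above. Everything in it is hypothetical: the feature names, the synthetic data and labels, and the flagging threshold are invented for illustration, and nothing here reflects Xantura’s or any council’s actual system.

```python
# Illustrative sketch only: a toy risk-scoring model over the kinds of
# administrative features named in the article. All feature names, data,
# labels, and thresholds are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-family features, one row per family, each indicator
# normalized to [0, 1]:
# [school_absence_rate, housing_repair_requests, police_contacts,
#  antisocial_behavior_reports, domestic_violence_reports]
X = rng.random((500, 5))

# Synthetic labels standing in for "family was referred to children's
# services" in some past period; here they are generated from an
# invented weighting of the features plus noise.
y = (X @ np.array([1.5, 0.3, 1.0, 0.8, 2.0])
     + rng.normal(0, 0.5, 500) > 2.5).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new (hypothetical) family and flag it if the predicted
# probability crosses an operator-chosen threshold.
new_family = np.array([[0.9, 0.2, 0.6, 0.4, 1.0]])
risk = model.predict_proba(new_family)[0, 1]
print(f"estimated risk score: {risk:.2f}, flagged: {risk > 0.7}")
```

Even this toy version makes visible where human judgment enters an ostensibly objective pipeline: someone decides which features count, how past cases are labeled, and where the flagging threshold sits, which is precisely the concern raised below.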

Controversy has arisen over the systems’ potential to violate informational privacy and over the bias that profiling may create against the families involved (ibid.). Virginia Eubanks, a professor at the University at Albany, warns that such automated systems incorporate the biases of their designers as well as structural societal biases, and risk perpetuating discrimination while operating without any public scrutiny. She comments:

“We talk about them like no human decisions have been made, and it’s purely objective, but it’s human all the way through, with assumptions about what a safe family looks like.”

Moreover, the software operates within a public-private nexus that should raise concerns about the ethics of data use in the presence of a clear profit motive. Xantura’s website advertises how its products can help local authorities “maximise PBR [payment-by-results] payments” from the government as well as reduce child safeguarding costs (ibid.). This revenue-generation model deserves critical scrutiny.