Video analytics software promises to algorithmically process footage from police body cameras in order to classify behaviors and identify suspects in real time (Díaz 2019). Artificial intelligence and object recognition systems implemented by the NYPD have been promoted as solutions to systemic problems of accountability for police brutality. However, they carry their own set of risks, including false positives, the chilling of free speech and privacy rights, and racial bias (ibid.).
According to a New York University School of Law report, survey respondents were largely favorable toward the deployment of body cameras, though with certain restrictions. Citizens have called for police officers to notify people when cameras are recording and for a standardized way for the public to access the video files (ibid.).
The rollout of this technology appears to be accelerating. The U.S. Department of Justice’s Bureau of Justice Assistance has found evidence that body cameras reduce the number of community member complaints and correlate with a reduction in officers’ use of force. In 2018 the NYPD announced that all officers on the force would be equipped with body cameras (Cherelus 2018). The trend seems likely to continue despite a recent incident in which a body camera burst into flames, prompting the NYPD to recall around 3,000 devices (Grayce 2018).