A.I. And Its Impact on Facial Recognition Software

By Paolo Vilbon,
The Gavel, Contributor
J.D. Candidate, Class of 2024

Artificial intelligence is the future; there is no denying that. But with great advancements come potential dangers. One of law enforcement's biggest technological advancements in the last two decades has been the use of facial recognition technology. Coupled with modern artificial intelligence, this would lead some to think that the system would completely revolutionize law enforcement investigations and their standard operating procedures. Unfortunately, this is not the case. According to researchers, facial recognition technologies falsely identified Black and Asian faces 10 to 100 times more often than they did White faces. The technologies also falsely identified women more often than men, making Black women particularly vulnerable to algorithmic bias.1
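To make that disparity concrete: the statistic above compares false match rates computed separately for each demographic group. A minimal sketch of that comparison, using entirely invented numbers rather than any agency's or vendor's real evaluation data, might look like this:

```python
# Hypothetical illustration: computing a per-group false match rate.
# The counts below are invented for demonstration; they are not drawn
# from any real benchmark or audit.

# For each group: (false matches, total non-matching comparisons)
results = {
    "Group A": (5, 100_000),    # e.g., 5 false matches in 100,000 trials
    "Group B": (250, 100_000),  # a 50x higher false match rate
}

baseline = results["Group A"][0] / results["Group A"][1]

for group, (false_matches, trials) in results.items():
    fmr = false_matches / trials
    print(f"{group}: false match rate = {fmr:.5f} "
          f"({fmr / baseline:.0f}x the baseline)")
```

A system can post an impressive overall accuracy figure while its false match rate for one group is an order of magnitude higher than for another, which is why aggregate statistics alone are misleading.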

These algorithms currently help national agencies identify potential flight risks and protect borders.2 National agencies have an advantage over local law enforcement agencies because they possess the resources to cross-check any information they receive; local agencies do not have that kind of bandwidth. Further, it is no secret that law enforcement recruitment has declined in recent years.3 That shortfall will lead police departments to rely more heavily on these technologies to fight crime, and as the use of these systems increases, so will the errors associated with them. Therefore, if these technologies are inaccurate or contain identifiable biases, they may do more harm than good.

One of the issues identified with artificial intelligence and facial detection is that AI face recognition tools "rely on machine learning algorithms that are trained with labeled data."4 Further, "[i]t has recently been shown that algorithms trained with biased data have resulted in algorithmic discrimination."5 The potential dangers associated with erroneous identification range from "missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests, or worse."6 None of these accounts for the financial impact that a false identification will have on the individual. Society must hold companies that put face recognition tools into the marketplace accountable, so that the next generation of these technologies is more accurate and future algorithms do not harm the individuals against whom today's systems are biased.
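The quoted mechanism, a model trained on skewed labeled data, can be shown in a few lines. The sketch below is a toy simulation, not any vendor's actual system: it trains a simple classifier on synthetic data in which one group is heavily underrepresented, then measures how the model's error rate differs between the groups.

```python
# Toy simulation of training-data bias (assumes numpy and scikit-learn
# are installed). All data here is synthetic; no real system is modeled.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic two-class data; `shift` moves the group's feature distribution."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Group A dominates the training set; Group B is underrepresented
# and drawn from a shifted feature distribution.
Xa, ya = make_group(5000, shift=0.0)
Xb, yb = make_group(100, shift=1.5)

model = LogisticRegression().fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Evaluate on fresh samples from each group: the boundary the model
# learned fits Group A well and Group B poorly.
for name, shift in [("Group A", 0.0), ("Group B", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(f"{name}: error rate = {1.0 - model.score(X_test, y_test):.1%}")
```

The classifier is not malicious; it simply learned a decision boundary that fits the majority of its training data. That is the sense in which "biased data" produces "algorithmic discrimination" without anyone intending it.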

Artificial intelligence is far too embedded in daily life to slow its progress, but claims that the datasets used for its baselines are biased should not be ignored. These biases should be brought to the forefront so that the necessary changes can be made now, before artificial intelligence needlessly overburdens the criminal justice system. A yearlong research investigation across 100 police departments revealed that African American individuals are more likely to be stopped by law enforcement and subjected to face recognition searches than individuals of other ethnicities.7 Part of the problem is measurement: without a dataset labeled for skin characteristics such as color, thickness, and amount of hair, one cannot even measure the accuracy of these automated detection systems across demographic groups. We are at a turning point with this technology. If it continues to be used with its current biases, it may remain useful, but it will also lead to the mass incarceration of the wrong suspects. That outcome would harm both the government and the affected individuals economically while carrying a serious social cost. It is imperative that we recognize that these biases exist so they can be corrected now.
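The measurement point is worth making concrete. If an audit dataset carries phenotype labels, accuracy can be broken out per group; without those labels, a single aggregate number can hide exactly the disparities described above. A minimal sketch, with fabricated labels and predictions standing in for a real audit set:

```python
# Disaggregated accuracy check: a toy stand-in for a phenotype-labeled
# audit dataset. The records below are fabricated for illustration only.
from collections import defaultdict

# Each record: (phenotype label, ground truth, model's prediction)
records = [
    ("lighter-skin", "match", "match"),
    ("lighter-skin", "no-match", "no-match"),
    ("lighter-skin", "match", "match"),
    ("darker-skin", "no-match", "match"),   # false match
    ("darker-skin", "match", "no-match"),   # false non-match
    ("darker-skin", "match", "match"),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, truth, pred in records:
    total[group] += 1
    correct[group] += (truth == pred)

# The aggregate figure hides the gap that per-group accuracy reveals.
overall = sum(correct.values()) / sum(total.values())
print(f"overall accuracy: {overall:.0%}")
for group in total:
    print(f"{group}: accuracy = {correct[group] / total[group]:.0%}")
```

This is the core of the audit methodology behind the disparity findings cited above: the per-group breakdown is only possible because someone labeled the data by phenotype first.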

References:

1 The Regulatory Review, "Saturday Seminar: Facing Bias in Facial Recognition Technology."

2 Id. 

3 "U.S. Experiencing Police Hiring Crisis."

4 Joy Buolamwini & Timnit Gebru, "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification," Proceedings of Machine Learning Research 81 (2018).

5 Id.

6 Id.

7 Id.
