
Opinion

Facial Recognition Technology Desperately Needs a Moratorium


On April 22, 2019, teenager Ousmane Bah filed a lawsuit against Apple claiming that he had been falsely arrested for the theft of $1,200 worth of Apple products in Boston. In 2015, the Sheriff's Office in Jacksonville, Florida arrested Willie Allen Lynch for supposedly selling $50 worth of cocaine to an undercover officer, for which Lynch was sentenced to eight years in prison. These cases may seem unconnected, but in fact they have a great deal in common. In both cases the defendants denied committing the crime, and both discovered only later that facial recognition technology had been used to identify them. Not coincidentally, both men are Black. Not only is it unconstitutional to pursue a criminal conviction without informing the accused that facial recognition software was used, but the technology is imperfect to say the least, and can easily lead to false identifications.

According to Joy Buolamwini, a computer scientist at the Massachusetts Institute of Technology, facial recognition algorithms are written by predominantly white software engineers, and the databases of faces on which the systems are built also consist overwhelmingly of white faces. Many algorithms work by identifying landmarks on the face, such as the distance between the eyes or the shape of the nose, and human intervention is needed to instruct the software about what to notice and measure. When the code is written by mostly white software engineers, they — and the code they create — may be less sensitive to subtle differences between faces of people of other races. As a result, these systems have become much better at recognizing white faces than Black ones. In fact, in 2018, the American Civil Liberties Union ran photos of members of the US Congress through Amazon's Rekognition software, and the system falsely matched 28 of them to criminal mugshots. The misidentified lawmakers were disproportionately people of color.
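To make the landmark idea concrete, here is a minimal sketch of how such a system might reduce a face to a handful of measurements and compare two faces. The landmark names, coordinates, and the simple distance-based score are all illustrative assumptions, not any vendor's actual algorithm — real systems use far richer features, which is exactly where racial bias in training data creeps in.

```python
import math

def landmark_vector(landmarks):
    """Reduce a face to a few inter-landmark distances (illustrative only)."""
    return [
        math.dist(landmarks["left_eye"], landmarks["right_eye"]),     # eye spacing
        math.dist(landmarks["nose_tip"], landmarks["chin"]),          # nose-to-chin length
        math.dist(landmarks["left_mouth"], landmarks["right_mouth"]), # mouth width
    ]

def similarity(face_a, face_b):
    """Lower score = more similar; a fixed threshold turns this into a 'match'."""
    return math.dist(landmark_vector(face_a), landmark_vector(face_b))

# Hypothetical (x, y) landmark coordinates, e.g. from a face detector.
probe = {"left_eye": (30, 40), "right_eye": (70, 40),
         "nose_tip": (50, 60), "chin": (50, 95),
         "left_mouth": (38, 78), "right_mouth": (62, 78)}
candidate = {"left_eye": (31, 41), "right_eye": (69, 40),
             "nose_tip": (50, 61), "chin": (51, 96),
             "left_mouth": (39, 78), "right_mouth": (61, 79)}

score = similarity(probe, candidate)
print(round(score, 2))
```

Note that the "match" is just a score against a threshold: the system always returns its closest candidates, even when none of them is actually the person in the photo. That gap between "closest match" and "correct match" is where false identifications come from.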

This software has been adopted by police departments in ways that often violate privacy and, at times, basic human rights. Officers who rely on facial recognition frequently do not inform the person they arrest that the software was used, or that the suspect was only one of many possible matches it returned. The US Supreme Court ruled years ago that the constitutional right to due process requires that anyone accused of a crime be informed about the evidence being presented against them, so that defendants can present the best case and judges and jurors can make fully informed decisions. Using an algorithm doesn't change these fundamental principles.

Over the summer, Georgetown University's Law Center on Privacy and Technology disclosed that ICE had mined state databases of driver's licenses and used facial recognition technology to analyze the photos without permission. The agency searched photographs in states where undocumented immigrants are allowed to obtain driver's licenses and, with the help of facial recognition software, used those photos to find and deport people. This does not only affect people who are undocumented; anyone who may appear undocumented to this biased system may be targeted.

The solution to this is a nationwide moratorium on the use of facial recognition technology. San Francisco was the first city to ban this software in May of 2019, and Oakland was a close second, announcing its ban in June. These bans seemed extreme to many because bans are virtually permanent; a moratorium, by contrast, would pause the technology's use while leaving room to bring it back once its flaws are addressed.

If perfected, facial recognition could lead to a much more effective and efficient police system, in which police officers would not need to spend valuable time trying to find a criminal based on only one photograph. Ideally, this would lead to a safer society overall.

To make this technology more accurate and useful to law enforcement, a few changes must be made. The databases of faces fed into these systems when they are built need to be far more diverse, and so do the engineers who write the code. Furthermore, many more laws need to be put in place to create a just system for using facial recognition, one that does not violate any rights.

It may seem as though this problem only affects people with criminal records, but in fact facial recognition surrounds us in ways we don’t even notice. It is in the filters we use on Snapchat and the way we unlock our phones, but one place it absolutely cannot be, at least for now, is in our criminal justice system.