First County In U.S. Bans Facial Recognition Technology Use By Government & Police
The ACLU is calling for a federal ban on facial recognition technology, which has disproportionately misidentified people of color.
Washington’s largest county has become the first in the U.S. to prohibit government agencies and law enforcement from using facial recognition technology, according to officials. The move is an effort to preserve civil liberties and follows similar bans in cities including San Francisco, Boston, and Portland.
King County Council members voted unanimously Tuesday to ban government agencies from using facial recognition software in most cases. The council says King County, which encompasses Seattle and has more than 2.3 million residents, is the largest jurisdiction in the U.S. to pass such a ban.
Under the ban, authorities will not be allowed to use the software to track someone unless they obtain a warrant, are searching for a missing person, or face “exigent circumstances,” a legal term generally referring to situations that justify a search without a warrant.
Facial recognition software analyzes photos of faces “for the purpose of identifying them” and has previously been criticized by civil liberties groups for enabling law enforcement and other institutions to invade privacy. According to the ACLU, public security cameras using the technology can capture images of a person without consent. Law enforcement agencies have used this technology in recent years to identify potential criminal suspects.
“Now it's time for a federal ban on government use of facial recognition to ensure that no one's civil liberties and civil rights are violated by a pervasive and often inaccurate technology that disproportionately misidentifies people of color and heightens the risk of surveillance and deadly encounters with law enforcement in already marginalized and overpoliced communities,” Jennifer Lee of the ACLU of Washington said in a statement.
According to the King County Council’s announcement, the ban is designed to “protect our residents’ civil liberties and freedom from government surveillance and demographic biases.” Multiple studies have found that facial recognition software often misidentifies people of color, particularly Black women, “in part because of a lack of diversity in the images used to develop the underlying databases,” the New York Times reports.
A 2019 study from the National Institute of Standards and Technology evaluated 189 facial recognition algorithms and found overwhelming evidence that the technology was racially biased. The algorithms analyzed in the study were 10 to 100 times more likely to mistake two non-white people for each other compared to the rate of false positives for white people.
“The use of facial recognition technology by government agencies poses distinct threats to our residents, including potential misidentification, bias, and the erosion of our civil liberties,” King County Councilmember Jeanne Kohl-Welles said in a statement. “The use or misuse of these technologies has potentially devastating consequences which the new ordinance will help to prevent.”