From BLM to the Belarus Revolution - why is facial recognition so scary?
Police in the US have reportedly used facial recognition technology (FRT) to arrest protesters from the Black Lives Matter movement. In Belarus, law enforcement is running two different FRT projects. It’s not clear at the moment whether they actively deploy it to track protesters, but as dozens of people have already disappeared during the revolt against Alexander Lukashenko, it’s definitely not a laughing matter. Human rights activists across the globe are urging governments to ban FRT.
“Face recognition gives much greater power to the police, much like pistols give them an advantage over wooden sticks,” Nicolas Kayser-Bril from Algorithm Watch told CyberNews.
The revolution in Belarus and the Black Lives Matter movement in the US are just two examples of recent protests. As people around the world rise to fight injustice, inequality, authoritarian regimes, and censorship, governments are deploying cutting-edge technologies to crack down on demonstrations. That is why human rights activists are worried and are calling for a ban on FRT.
Read more: How encrypted messaging changed the way we protest
Big Russian Brother in Belarus
In Belarus, people have been protesting for two weeks against the firm rule of Alexander Lukashenko. Every day, they gather in the streets to make a peaceful call to "Stop the Cockroach." The authoritarian regime has used violence to crack down on the protests: thousands of people have been detained, and dozens have disappeared. At this moment, it’s not clear whether Belarus is using technology for street surveillance, but FRT is already in place. Unsurprisingly, it’s Russian-made.
Police in Belarus have been using Kipod face recognition cameras installed by Russian IT giant Synesis. Under a law signed by Lukashenko in 2017, every business controlled by the state must install cameras and join the surveillance system. The rollout has been rather slow, as the fee for using the system is quite high for businesses. The government can tap into the system in the name of national security. Belarus has an ambitious goal of installing 360,000 surveillance cameras across the country in the next five years.
In Minsk, the government has just started using another surveillance system that can read people’s faces and cars’ license plates. For now it’s a pilot project, installed and run by the Russian company Macroscop.
In June, The New York Times published a story, “Wrongfully accused by an algorithm,” about Robert Williams, who was arrested after facial recognition software falsely matched him to a crime he didn’t commit.
In the United States, around one-quarter of law enforcement agencies can use facial recognition. According to the Atlas of Surveillance, a project by the Electronic Frontier Foundation, Florida is the state with the most FR cameras in place.
Law enforcement has been using these cameras to crack down on Black Lives Matter protests. In May, Miami police arrested a woman for allegedly throwing a rock at a police officer, NBC Miami reported. Police used Clearview AI to identify the woman.
In Philadelphia, law enforcement has used facial recognition technology to identify people from Instagram photos. There have been more arrests based on FRT scanning results, Ars Technica reported.
Detroit police have admitted that FRT misidentifies suspects about 96% of the time. Human rights organizations around the globe are therefore calling for a ban on FRT, arguing that biometric surveillance violates civil rights.
FRT systems work very poorly
“Face recognition gives much greater power to the police, much like pistols give them an advantage over wooden sticks. Any government that requires power over demonstrators can probably make use of face recognition to its advantage, thereby raising the bar needed for demonstrators to effectuate change,” Nicolas Kayser-Bril from Algorithm Watch told CyberNews.
Algorithm Watch, the Electronic Frontier Foundation, and Transparency International are among organizations calling to ban FRT systems. They are inaccurate, and governments using them often violate civil liberties, such as the right to protest.
“Face recognition can be dangerous in that it fundamentally alters the power balance between the police and citizens. It enables the police to track people in time and space at very little cost, so that, if the police want to arrest someone, they can review records and look for a minor offense (e.g. jaywalking) to start prosecution,” said Nicolas Kayser-Bril.
More importantly, FRT lets the police identify participants in public demonstrations, thereby nullifying one of the most important elements of mass demonstrations: anonymity and protection through numbers.
"As such, it could dramatically alter the power of demonstrators in any context, as all participants in an unauthorized demonstration could be liable for taking part in it,” said the Algorithm Watch spokesperson.
He also stressed that FR systems work very poorly, and therefore are likely to engender an even bigger chilling effect among the general population.
Another problem with FRT is that it’s hard to say how often it’s used to track people’s movements. In some cases, governments have used it even when it was prohibited by law.
“Police forces can and do apply face recognition technology to video data that has been recorded previously. Also, police forces can rely on live face recognition technology built and operated by private actors, as is the case in Madrid,” explained Nicolas Kayser-Bril.
Would a mask help a protestor?
“Wearing a mask has been shown to reduce the accuracy of face recognition systems. However, other systems, which are available in Belarus (such as BriefCam, which is distributed by the company Microinform), can use clothes and gait recognition to track people,” explained Nicolas Kayser-Bril.
FRT prone to errors
Karen Gullo from the Electronic Frontier Foundation explained that law enforcement uses facial recognition technologies to analyze large image sets of people in public spaces. As a result, some protesters prefer to cover their faces and tattoos, limit other identifiers, or change clothing.
“Facial recognition is a privacy-invasive technology, and government use of facial recognition chills people's rights to engage in activities like protests and demonstrations. Facial recognition technologies are flawed and prone to errors, especially when it comes to people of color and women, and they have an unfair and disparate impact against people of color, immigrants, and other vulnerable communities,” Karen Gullo told CyberNews.
It’s well documented that FR cameras are less accurate when it comes to people of color. Research fellows Claire Galligan, Hannah Rosenfeld, and Molly Kleinman, with lead researcher Shobita Parthasarathy, of the University of Michigan’s Ford School of Public Policy concluded that FR cameras erode privacy, define the notion of an “acceptable” student, incite racism, commodify data, and institutionalize inaccuracy. The researchers therefore strongly recommend banning FR technology in schools.