Biometric mass surveillance, including facial and emotion recognition, is widespread throughout Europe. Activists call it a major human rights violation, and the Reclaim Your Face movement is calling for an EU-wide ban on biometric surveillance. The campaign needs to gather 1 million signatures within the next 14 months for the European Commission to put the issue on the EU’s agenda.
Biometric mass surveillance amounts to the unlawful practice of treating everyone like a suspect, the activists behind the Reclaim Your Face campaign claim.
According to them, biometric mass surveillance is by its very definition a human rights violation because it treats every person as a criminal suspect, which undermines the right to the presumption of innocence.
Activists are calling for an EU-wide biometric surveillance ban. Reclaim Your Face has 14 months to gather 1 million signatures, which would oblige the European Commission to issue a formal Communication (a form of soft law) and start discussing the matter with the campaigners.
The rise in emotion recognition technologies
The European Digital Rights advocacy group has found evidence of more than 15 EU countries breaking their own human rights rules through uses of biometrics that lead to mass surveillance. Activists believe the problem is even more widespread than this.
There are many examples of the use of biometric surveillance in the EU. Local authorities and schools in France are deploying facial recognition on young people. In Germany, transport companies working with police are using it on commuters, and in Denmark, event organizers working with police are using it on football supporters. In Slovenia, it is used against protestors; in Spain and the Netherlands, against shoppers; in Italy, against migrants; in Poland, under the guise of COVID-19 measures. EU research agencies are using it against travelers and asylum seekers, activists claim.
“It is not just people’s faces – we are also seeing other biometric characteristics like the rhythm with which someone types or how they walk used to identify them in ways that can be just as harmful. Ear shapes, vein patterns, irises, the way people smell, the way they distribute their weight on a chair and even their brainwaves are all ideas (either real or experimental) that we have heard about,” Ella Jakubowska, Policy & Campaigns Officer at European Digital Rights, told Cybernews.
The rise in so-called emotion recognition technologies is the latest trend that human rights advocates are witnessing.
“We are starting to see them being exported into Europe. For example, in the Netherlands, police and local authorities are widely claiming to be able to predict aggression through biometric mass surveillance technologies,” Jakubowska said.
How is it harming us?
Authorities that use surveillance technologies claim they do so for the public’s safety. Yet, human rights advocates argue that people who are being watched actually become less safe.
“The argument of efficiency is a thinly-veiled attempt to divert from the fact that authorities are spending vast sums of public money on technologies that are being aggressively pushed by private companies. Often, they are doing this instead of investing in welfare, social provisioning, education, and other strategies that are much more effective at tackling social issues,” Jakubowska said.
Biometric surveillance, she explained, is an issue for us all, not only for minorities, because it makes it much harder for us to stay anonymous and free in public.
“This, in turn, can make any of us less confident to access healthcare, to join a protest, to go into certain areas, and can even make our normal behaviors like waiting for a friend or putting our hood up appear suspicious,” Jakubowska said.
At the same time, she claims that the technology runs along the same discriminatory fault lines that are embedded in our societies, which means that the tools of biometric surveillance can amplify discriminatory practices against people of color, religious groups, migrants, people with disabilities, LGBTQI+ people, and others.
“Already over-surveilled and over-policed groups will face even more targeting as a result – and whilst the inaccuracies in many facial recognition technologies can cause harm, if they are 100% accurate, they will become even more powerful weapons of discrimination,” Jakubowska said.
According to her, biometric mass surveillance is “inherently an unnecessary and disproportionate action for states or companies to take.”
The Reclaim Your Face campaign is made up of 47 organizations spanning 16 European countries.
Campaigners are attempting to gather 1 million signatures. If they manage to do this by May 1, 2022, the European Commission will be obliged to release a formal Communication (soft law) about their demand, to meet with the campaigners to discuss it, and potentially to ask Parliament to hold a debate on the issue.
“Essentially, it means that the issue will be put firmly on the EU’s agenda, and we will be able to show that Europeans do not accept biometric mass surveillance in our streets, our public spaces, our lives,” Jakubowska said.
The EU already prohibits biometric mass surveillance through the GDPR, the Data Protection Law Enforcement Directive, ePrivacy rules, and the Charter of Fundamental Rights. Yet, according to human rights activists, this framework contains loopholes that permit the processing of biometric data at a national level in ways that violate the essence of the prohibition.
“This is why we need a new law at an EU level that will close the loopholes and remove the grey areas of which the Member States, EU bodies, and corporations are all taking advantage, leading to practices that violate fundamental rights,” Jakubowska told Cybernews.