A proposed EU law to regulate artificial intelligence (AI) does not go nearly far enough towards protecting the public from biometric surveillance (BMS), a human rights watchdog has warned.
“Despite the lawmakers’ bravado, the AI Act will not ban the vast majority of dangerous BMS practices,” said digital rights campaign group Reclaim Your Face. “Instead, it will introduce – for the first time in the EU – conditions on how to use these systems.”
EU parliamentarians are due to vote on the final version of the bill this spring; if approved, it will pass into law across the 27-member bloc.
“The EU is making history – for the wrong reasons,” said Reclaim Your Face, adding that of the suggested legal clauses to ban BMS, “very little survived the AI Act negotiations.”
“Restrictions on the use of live and retrospective facial recognition in the AI Act will be minimal, and will not apply to private companies or administrative authorities,” it said.
Describing BMS practices as “error-prone and risky by design,” as well as inherently anti-democratic, Reclaim Your Face added: “Police and public authorities already have so much information about each of us at their fingertips – they do not need to be able to identify and profile us all of the time, objectifying our faces and bodies at the push of a button.”
Yesterday’s statement from Reclaim Your Face comes hard on the heels of this week’s commentary by Amnesty International, which said that misuse of AI-driven technologies such as BMS had already been shown to have “devastating” consequences for ethnic minorities.
“Whilst AI developments do present new opportunities and benefits, we must not ignore the documented dangers posed by AI tools when they are used as a means of societal control, mass surveillance, and discrimination,” said Amnesty.
Citing predictive policing tools, automated systems used in public sector decision-making over healthcare access, and monitoring the movement of immigrants and refugees, it added: “All too often, AI systems are trained on massive amounts of private and public data – data which reflects societal injustices, often leading to biased outcomes and exacerbating inequalities.”
“Based on what we have seen of the final text, the AI Act is set to be a missed opportunity to protect civil liberties,” said Reclaim Your Face. “Our rights to attend a protest, to access reproductive healthcare, or even to sit on a bench could still be jeopardized by pervasive biometric surveillance.”
“AI has flagrantly and consistently undermined the human rights of the most marginalized in society,” said Amnesty. “Other forms of AI, such as fraud detection algorithms, have also disproportionately impacted ethnic minorities, who have endured devastating financial problems.”