Clearview AI: what has the legal battle taught us about the nature of facial recognition?
A yearslong legal complaint over the data harvesting practices of American facial recognition company Clearview AI has ended in a settlement that both sides say vindicates their position. The company, which has faced fines and regulatory action in France, Italy, the United Kingdom, the United States, and Australia, sits at the center of an ongoing debate over security versus privacy.
In May 2020, the American Civil Liberties Union (ACLU) filed a lawsuit against Clearview AI in an Illinois courtroom on behalf of groups representing survivors of domestic violence and sexual assault, undocumented immigrants, current and former sex workers, and other vulnerable communities uniquely harmed by face recognition surveillance.
The ACLU claimed in its suit that Clearview AI was in violation of Illinois’ Biometric Information Privacy Act (BIPA) by allowing access to state residents’ images and biometrics data on a national database used by law enforcement and government.
An enormous database of faces
Clearview AI is known for its huge, searchable face databases, which it sells to organizations seeking to identify individuals in connection with criminal investigations. The company says its database holds 20 billion publicly available images, which it claims makes it the largest of its kind in the world.
The ACLU’s suit claimed this practice was illegal, and under the settlement reached in an Illinois court, the ACLU says Clearview AI accepted restrictions on selling its faceprint database not just in Illinois but across the United States. In addition, “Clearview is permanently banned, nationwide, from making its faceprint database available to most businesses and other private entities.”
Clearview AI has previously been subject to similar complaints in multiple European countries.
Both sides claim victory
“By requiring Clearview to comply with Illinois’ pathbreaking biometric privacy law not just in the state, but across the country, this settlement demonstrates that strong privacy laws can provide real protections against abuse,” says Nathan Freed Wessler, a deputy director of the ACLU Speech, Privacy, and Technology Project. “Clearview can no longer treat people’s unique biometric identifiers as an unrestricted source of profit. Other companies would be wise to take note, and other states should follow Illinois’ lead in enacting strong biometric privacy laws.”
For its part, Clearview AI claims that the settlement “confirms the company’s compliance” with Illinois’ privacy laws and won’t change its business practices, which involve giving access to search engines to law enforcement and government agencies. “The court’s endorsement of the BIPA settlement is an achievement for Clearview AI’s customers and our mission of providing justice to victims of crime across the country,” says Hoan Ton-That, Clearview AI’s CEO, in a statement. “Clearview AI intends to serve private sector clients with product offerings that are not affected by this agreement, focused on our core mission of enhancing security.”
In this way, Clearview AI positions itself as a guardian of personal security, one willing to overstep privacy boundaries in favor of law enforcement. In 2020, it announced that it would stop working with private organizations and agencies outside law enforcement.
No liability admitted
The settlement did not require Clearview AI to admit liability for the claims made by the ACLU. The company paid the ACLU $250,000 in attorney fees but has not paid damages. “Today, facial recognition is used to unlock your phone, verify your identity, board an airplane, access a building, and even for payments. This settlement does not preclude Clearview AI selling its bias-free algorithm, without its database, to commercial entities, which is fully compliant with BIPA,” Ton-That adds.
Indeed, Clearview AI plans to grow its database to more than 100 billion faceprints within 12 months, enough, it claims, to ensure that “almost everyone in the world will be identifiable.” But one place where you might not be identifiable in the future? Illinois.
As part of the settlement, Clearview AI agreed to continue, for the next five years, its current measures to filter out photographs taken in or uploaded from the state. The ACLU hopes the effect will spread beyond the state’s borders. “BIPA was intended to curb exactly the kind of broad-based surveillance that Clearview’s app enables,” says Rebecca Glenberg, staff attorney for the ACLU of Illinois. “Today’s agreement begins to ensure that Clearview complies with the law. This should be a strong signal to other state legislatures to adopt similar statutes.”