Biometrics will create digital poorhouse, says study


Biometric technology is being used to deny people living with disabilities access to state support, research from tech think tank Access Now suggests.

Biometric technologies, which are used to track people based on their speech patterns, facial expressions, or even how they walk, are predicated on false assumptions, equality campaigners say. Instead of being “neutral, fair, and scientifically irreproachable,” they’re actually excluding some disadvantaged people from vital services.

“Every biometric technology has pre-envisioned a normative, or ‘correct’ idea of a body, and subsequently attempts to fit all bodies that interact with it into this frame,” asserts the study commissioned by Access Now and co-authored by Xiaowei Wang and Shazeda Ahmed of the University of California, Los Angeles.

“For all of their promise of opening up new possibilities for humans to flourish, what these technologies do instead is the reverse: collapse the varied experiences of human bodies into one single template, and predicate access to resources on whether or not one matches that template,” it adds.

The consequences can be severe, as biometric profiling is increasingly used to screen applicants for welfare benefits and, the report strongly implies, to deny them access for economic or political reasons. This is giving rise to a regime of tech-optimized austerity that can best be described as a “digital poorhouse.”

Fake panic, real suffering

“Biometric identification systems are implemented by governments in state-supported welfare for the intended purposes of countering purported welfare fraud,” says the report, despite “many studies demonstrating that true ‘welfare fraud’ is rare and, more often than not, a morally constructed panic.”

With biometric technology increasingly being used to “gatekeep” access to benefits, its advocates and developers, as well as governments, must take action to reverse the trend, for example by consulting people living with disabilities earlier in the development process to make the technology more compatible with their needs.

“Advocates, but also tech companies and regulators, must be aware of the use of biometrics not just within settings that claim to extend experience, but also in settings where biometric technologies are used to gate-keep, contain inequity, or potentially deepen asymmetrical power dynamics between states and communities,” it says.

“This awareness is crucial particularly under fiscal austerity, as states use technologies to replace or augment labor shortages in government services, or to cut costs within state programs – a form of what [Goldsmiths University academic] Dan McQuillan terms ‘optimizing austerity.’ Automation and automated decision-making systems used to determine allocation of social benefits have been shown to create a ‘digital poorhouse,’ leaving people trapped in cycles of inequity.”

Access Now cites as an example the so-called digital fraud detection systems deployed in recent years within welfare assessment systems in the Netherlands and Denmark, “at times with devastating results.”

In the Netherlands, this led to “false positives [that] impacted tens of thousands of families, leading to unpayable tax bills, children being removed from their families, people losing their homes, and even suicides.”

Access Now concludes that the deployment of biometric technologies is underpinned by a “societal question around who we see as ‘deserving’ of state benefits” that often leads to a “criminalization of poverty or migrant status.”

Tech companies, on the other hand, stand to benefit from this socio-political landscape, with such firms competing for funding not only from private-sector investors but also from governments that view such “upswell in research, development, and investment [...] as a net positive for the economy.”

Enter function creep

The lucrative nature of this flourishing market could further give rise to what Access Now calls “function creep.” This is where biometric technology, despite having dubious credentials, is applied to areas not originally intended for its use, as the companies behind it seek ever greater profits.

“Researchers have shown that the biometric technology market is highly lucrative, with fierce competition and pressure to constantly patent new technologies – resulting in outsize industry claims around expertise and what technology can do,” it says. “These incentives can drive function creep, particularly given claims around veracity and expertise that go unscrutinized.”

This insidious phenomenon could entail a scanner initially intended to help diagnose, say, sufferers of Parkinson’s disease being adapted for use in lie-detector tests, or one designed to detect anxiety disorders for medical purposes being deployed in a border-control setting.

“It is worth restating that biometric technologies in the field are consistently shown to require human interpretation, despite claims that the technology is objective, neutral, or will require no human interpretation,” says Access Now. “Among our interviewees, some expressed concerns around how technology developed in one setting would, given claims around veracity, be used in another context.”

It adds: “The lack of clarity around the boundaries of what is considered biometric data, as well as what is considered a biometric system, has not only created ambiguous regulations, but also potential for irreversible harm embedded in the market incentives set up for biometric technologies. Of enormous concern is the continued, clear lack of regulation around governance of funding and technological benefits, as well as the adjudication of harms.”

