Facial recognition tech misidentifies woman as shoplifter

A woman recently went to a UK Home Bargains store to buy chocolate, but shortly after she entered, a store worker approached her and accused her of being a thief.

This was because facial recognition technology (FRT) software called Facewatch had labeled her a criminal. As a result, the woman was banned from all stores using the software.

The incident left her shaken.

"I thought, 'Oh, will my life be the same? I'm going to be looked at as a shoplifter when I've never stolen,'" the woman told the BBC.

Later, it turned out that she had been mistakenly identified as a shoplifter. Facewatch contacted the woman and acknowledged that it had made an error.

This and similar incidents cast a shadow over the technology, which has also raised concerns about mass surveillance.

However, advocates of FRT point to its benefits. For police, it can help identify suspects in a matter of seconds.

This year, police in London made 192 arrests with the help of the technology, the BBC reported.

Police mount cameras on the roof of a modified van and capture thousands of images of people's faces. The technology creates a biometric template of each face, compares it against a watchlist, and deletes it if there is no match.
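The matching step described above can be sketched in simplified form. This is a minimal illustration, not how Facewatch or police systems actually work: it assumes each face has already been converted into a numeric embedding vector by some face-recognition model, and the threshold value is a made-up placeholder.

```python
import math

# Hypothetical similarity cutoff; real systems tune this carefully,
# since it controls the trade-off between misses and false matches.
MATCH_THRESHOLD = 0.9

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face_template, watchlist):
    """Return (name, score) of the best watchlist match above the
    threshold, or None. A None result corresponds to the
    delete-if-no-match policy: the captured template is discarded."""
    best_name, best_score = None, 0.0
    for name, template in watchlist.items():
        score = cosine_similarity(face_template, template)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= MATCH_THRESHOLD:
        return best_name, best_score
    return None  # no match: template would be deleted
```

The misidentification cases in this article correspond to a captured template landing above the threshold against the wrong person, which is why a family resemblance can trigger a false match.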

However, on the day a journalist accompanied police on patrol, the system misidentified one person as a criminal, possibly due to a family resemblance.

German police have also recently deployed real-time facial recognition in the eastern state of Saxony and in the city of Berlin.

In the EU, real-time facial recognition in public places is banned except when law enforcement is dealing with serious crimes or searching for missing people.

Facial recognition is also increasingly used in the air travel industry. For airports and airlines, it speeds up security and helps identify potential threats.

By next year, the technology will be deployed at about 200 airports, nearly half of the airports in the United States, the Transportation Security Administration estimates.

Privacy advocates, such as the organization Fight for the Future, warn that, beyond the risk of FRT being used as a tool for mass surveillance, there is also a risk of data being mishandled. In airports, FRT is deployed by different entities with differing policies on how the data is stored and shared.

Nevertheless, industry experts say these issues will be addressed.