Portland recently passed unprecedented restrictions on facial recognition technology. The landmark move prohibits both public agencies and private businesses from using the surveillance tech. As the techlash gathers pace, Portland has taken a much stricter approach than other US cities such as San Francisco, Boston, and Oakland.
Portland Mayor Ted Wheeler proudly stated that “all Portlanders are entitled to a city government that will not use technology with demonstrated racial and gender biases that endanger personal privacy.” Despite the promises of software vendors, facial recognition technology is still in its infancy and has severe limitations when it comes to identifying gender and ethnicity. By misidentifying people and producing false positives, the technology is further fuelling unrest and division in communities.
Why tech companies need to stop moving fast and breaking things
Silicon Valley was built on a culture of proudly breaking things and not playing by the rules. But in doing so, its companies have irresponsibly created future problems that need fixing today. Tech giants are beginning to look nervous as their creations spin out of control and are put to nefarious purposes.
Regulators and lawmakers cannot keep up with the pace of technological change. Sundar Pichai, chief executive of Alphabet and Google, wrote in the Financial Times that AI needs to be regulated while stating that “Companies cannot just build new technology and let market forces decide how it will be used.”
“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.”
IBM CEO Arvind Krishna
Facial recognition software is increasingly being used to identify individuals in photos or videos by cross-referencing their faces against a database of known subjects. From the moment that you walk into a store, you could be subject to surveillance that determines whether you are a known shoplifter or old enough to buy the alcohol in your basket.
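Under the hood, most modern systems reduce each face to a numeric embedding vector and compare distances against a database of enrolled faces. The following is a minimal sketch of that matching step only, with toy hand-made embeddings and a hypothetical distance threshold (real systems derive 128-plus-dimensional embeddings from a neural network):

```python
import numpy as np

def match_face(probe, database, threshold=0.6):
    """Return the name of the closest enrolled face, or None if
    nothing in the database is within the (hypothetical) threshold."""
    best_name, best_dist = None, float("inf")
    for name, enrolled in database.items():
        dist = np.linalg.norm(probe - enrolled)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

# Toy 4-dimensional embeddings for illustration only.
db = {
    "alice": np.array([0.1, 0.9, 0.3, 0.5]),
    "bob":   np.array([0.8, 0.2, 0.7, 0.1]),
}
probe = np.array([0.12, 0.88, 0.31, 0.52])  # very close to "alice"
print(match_face(probe, db))  # → alice
```

Note the threshold: set it too loosely and unrelated faces will “match” — the false positives that make misidentification such a serious problem.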
The relentless encroachment of technology in almost every aspect of our lives is beginning to set off a few alarm bells. How many photos have you uploaded to social channels or been tagged in online? There is increasing evidence to suggest that your selfies could be used to train facial recognition AI without your consent.
New face search engines such as PimEyes already enable users to search for other photos of a person online. As algorithms improve, things quickly enter creepy territory. Clearview AI is one company that has hit the headlines for all the wrong reasons. The service scrapes your social media photos and enables law enforcement to search and match images using facial recognition technology.
The motto, if you’ve got nothing to hide, you’ve got nothing to fear, is beginning to look outdated as technology stands accused of working against users rather than empowering them.
“The premise [is] that privacy is about hiding a wrong. It’s not. Privacy is an inherent human right and a requirement for maintaining the human condition with dignity and respect.”
Bruce Schneier, computer security and privacy specialist
Even if you’ve got nothing to hide, you’ve still got everything to fear
China infamously uses facial recognition to control human behavior. A database leak from a vast network of cameras revealed that 6.8 million records were captured and recorded in just one day. When your every move and the friends you hang out with are under scrutiny, it’s easy to see why many believe we are sleepwalking into an Orwellian nightmare, where face-mask recognition and encouraging citizens to report their neighbors are branded as the new normal.
In the beginning, facial recognition was sold to society as a way to curb crime and keep citizens safe. It has since been used to stop the rise of protest movements and crush dissent. Surveillance tech is now even being used to discourage “uncivilized” behavior, such as shaming those who wear pajamas outside or are caught jaywalking. Welcome to the world of behavioral engineering.
How clean is your permanent record?
Most people reading this would admit to having ridden a bicycle in the improper lane, accidentally put the recyclables in the trash, or run out of the house without their face mask. If facial recognition algorithms were able to enforce every law, we would all risk becoming criminals.
Big Tech often says the right things about retreating from facial recognition. The reality, however, is that many companies are still scraping your selfies from social media platforms to improve their facial recognition algorithms, and will predictably sell the results to the highest bidder.
In an ideal world, Portland’s facial recognition ban would set a new standard for others to follow.
But I fear that Pandora’s box has been opened and there is no going back. Regulators and lawmakers cannot keep pace in a tech-fuelled world where temperature checks, mask detection, and surveillance have become the norm within a few months.
Facial recognition technology is an excellent example of how COVID-19 has accelerated digital trends. The social media platforms that keep us endlessly scrolling have been building a permanent record of every click, swipe, and digital interaction across every device. Facial recognition now brings a human face to your digital file.
The cruel irony is that our selfies have been unwittingly training algorithms and helping to power the technology that could eventually be used to monitor our every move.