Facial Recognition: A Matter of Life or Death?

An Article by Polina Hristova, journalist at ROCCO.

It’s time to dust off that fake moustache and those Clark Kent glasses, because facial recognition will no longer serve merely as a method to unlock your phone or computer – it is becoming a widespread surveillance method even in the West. China has been using AI-fuelled facial recognition technology for a while now: police officers wear sunglasses that scan faces, and street cameras are equipped with the same technology, all under the pretence of tighter security measures and crime control – but the reality of this invasive tech is a lot grimmer. China is also known to use facial recognition to track millions of Muslim citizens, who may disappear under inexplicable circumstances and are suspected of being held in hidden camps against their will.

Many would reject the idea of something similar happening in the European Union or the US in the 21st century, but the truth is that you do not need to be locked up to have your privacy invaded and your data abused right under your nose. Social media has turned us into numbers, raw data, statistics to be sold and resold without our knowledge or approval. Everything ‘free’ requires a data transaction, a piece of our identity absorbed into the robust world of advertising and beyond, until we reach a state of total dehumanisation. Cue the age of mass surveillance and the “detention centres” that will hold the suspects – and oh, there will be plenty of them.

Let me explain why.

Researchers at the American Civil Liberties Union (ACLU), backed by prominent academic AI experts, identified a major problem with Amazon’s Rekognition software after it falsely matched 28 members of Congress against a pool of 25,000 mugshot photos – and the errors disproportionately affected people of colour. Amazon has sold its facial recognition technology to law enforcement, and the higher error rates for dark-skinned and female individuals point to a serious racial bias. The ACLU has since publicly urged Amazon to stop selling Rekognition to the authorities. Joy Buolamwini, a researcher at the Massachusetts Institute of Technology (MIT), was the first to study the problem earlier this year. Amazon dismissed the findings, arguing that the researchers had failed to use Rekognition properly:

“While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a higher threshold of at least 95% or higher.”

But it turns out that the police also didn’t get Amazon’s user guide.
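Stripped of marketing language, Amazon’s guidance amounts to a simple filter: discard any candidate match below a chosen confidence score. A minimal, purely illustrative sketch of that logic (the match data and the function are hypothetical, not the actual Rekognition API, which returns similarity scores that a client is then expected to threshold):

```python
# Illustrative only: how a confidence threshold filters face-match
# candidates, in the spirit of Amazon's 80% vs. 95% guidance.
# The candidate list and function below are hypothetical examples.

def filter_matches(matches, threshold=95.0):
    """Keep only candidate matches at or above the confidence threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

candidates = [
    {"name": "candidate_a", "similarity": 81.2},
    {"name": "candidate_b", "similarity": 96.7},
    {"name": "candidate_c", "similarity": 88.4},
]

# At an 80% threshold all three faces are flagged; at 95%, only one is.
low_bar = filter_matches(candidates, threshold=80.0)
high_bar = filter_matches(candidates, threshold=95.0)
print(len(low_bar), len(high_bar))  # prints: 3 1
```

The point is that the threshold is the operator’s choice – which is exactly why guidance buried in a user guide is a weak safeguard.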

“There needs to be a choice,” said Buolamwini. “Right now, what’s happening is these technologies are being deployed widely without oversight, oftentimes covertly, so that by the time we wake up, it’s almost too late.”

Amazon’s Rekognition system is sold as part of the Amazon Web Services cloud offering and is extremely inexpensive, with a price tag of less than $12 per month for an entire department.

“Serious concerns have been raised about the dangers facial recognition can pose to privacy and civil rights,” an open letter from Sen. Markey (D-MA), Rep. Gutiérrez (D-IL) and Rep. DeSaulnier (D-CA) to Amazon CEO Jeff Bezos reads, “especially when it is used as a tool of government surveillance.”

This isn’t the only case of a visual system failing to recognise people of colour: one recent study of vision systems in self-driving cars demonstrates a struggle when it comes to identifying pedestrians with darker skin tones.

And while facial recognition helped the British police identify two Russian suspects in the Novichok poisoning, surveillance-led misidentification has also wrongly cast a young Brazilian man living in London as a suicide bomber, resulting in an innocent man’s death. It’s a double-edged sword.

Misidentification and possible wrongful arrest stem from the premature use of young technology and our blind reliance on it; without proper scrutiny or public debate, every public space could be turned into a biometric checkpoint. Facial recognition performs well in controlled environments (airports, borders), but it fails to accurately recognise people out and about, bringing forth a plethora of issues that would ultimately stifle an ordinary citizen’s sense of security in their own city or country. According to this data, 98% of the matches made by the Metropolitan Police’s automated facial recognition (AFR) system are mistakes.
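A failure rate that high is less surprising than it sounds once base rates are considered: when almost nobody in a scanned crowd is actually on a watchlist, even a fairly accurate system produces mostly false alarms. A rough, hypothetical calculation (the crowd size, watchlist size and accuracy figures below are illustrative assumptions, not the Met’s actual numbers):

```python
# Hypothetical base-rate arithmetic: why scanning large crowds yields
# mostly false matches. All figures below are illustrative assumptions.

crowd = 100_000           # faces scanned in a day
on_watchlist = 10         # genuine targets actually present in the crowd
hit_rate = 0.90           # chance the system flags a genuine target
false_match_rate = 0.001  # chance the system flags an innocent face

true_alerts = on_watchlist * hit_rate                      # 9 correct alerts
false_alerts = (crowd - on_watchlist) * false_match_rate   # ~100 false alerts

share_wrong = false_alerts / (true_alerts + false_alerts)
print(f"{share_wrong:.0%} of alerts are mistakes")  # prints: 92% of alerts are mistakes
```

Even with an error rate of just one in a thousand, the sheer volume of innocent faces swamps the handful of genuine targets – which is how a deployed system can end up with the vast majority of its matches being wrong.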

Fingerprint sensors in phones, behavioural biometrics used by banks, ear biometrics – where is this data stored, and what is it being used for when we’re not unlocking our devices or verifying our identity?