Buying a coffee, crossing a street: mundane, everyday activities during which your face can be scanned without your knowledge or consent in the UK, where police forces are using facial recognition technology (FRT) to identify suspects. Police may argue ‘nothing to hide, nothing to fear’, but how can it be claimed that any surveillance technology is only reactive to crime, and not also a Panopticon-esque method of controlling behaviour[1]? Shouldn’t this itself be controlled, or even banned?
FRT is an insidious menace, looming over our individual privacy, free expression, information security, and social justice. Let's rewind to 2015, when the Leicestershire Police scanned the faces of 90,000 unsuspecting festival-goers in the UK, checking them against a European watchlist of criminal suspects[2]. This marked the first known deployment of Live Facial Recognition (LFR) at an outdoor public event in the UK[2]. Since then, this invasive surveillance technology has been stealthily creeping into our daily lives, with little government oversight or electoral mandate.
Chillingly, six police forces across the UK have now employed Retroactive Facial Recognition (RFR)[3], scanning faces in still images or previously captured video footage and comparing them to watchlists. This means any photo or video accessible to the police - including your social media posts - can be weaponised as surveillance[3].
Under GDPR, we are asked for consent when a website wants to collect our personal data or track our activity. If our habits are protected from Amazon's prying eyes, why are our faces nonconsensually up for grabs? The lack of accountability is outrageous.
This flagrant violation of our privacy rights is only exacerbated by unchecked data sharing between police forces. Take the disturbing account[4] of how the Met and British Transport Police secretly shared images of seven individuals for facial recognition use between 2016 and 2018. Despite a series of intrusive privacy violations, the entire operation was apparently futile; the Met conceded[5] they had not kept records of whether the FRT successfully identified any of the seven individuals whose images were shared, let alone if any police action followed a match.
Matches themselves have been thin on the ground. After six years of using FRT, with data from 179 countries, INTERPOL identified only 1,500 individuals of note[6]. Alongside criminals this included ‘missing persons’ and ‘persons of interest’ who by no means can be assumed to pose a threat[6].
Amidst all this, citizens cannot resist or prevent FRT. A troubling case from 2019 saw a man handed a £90 fine for disorderly behaviour after he covered his face while walking past a surveillance van involved in a trial[7]. Why did the police stop him? Apparently, choosing to conceal your identity is now a valid reason for police intervention.
But wait a minute - weren't the UK's infamous 'sus laws', which let police stop and search people on mere suspicion, disproportionately targeted young black men, and helped spark riots, repealed back in 1981[8]? Are we witnessing history repeating itself? Is the widespread, unchecked use of FRT dragging us back to those dark days?
Undeniably, an individual with a clean criminal record is not guaranteed to keep one. FRT could, in theory, catch those at risk of harming others before behaviour escalates, particularly if used as a deterrent. While FRT erodes freedom in many senses, for the wider public, feeling safe from threat may be a form of freedom in itself - one that some would argue takes precedence. A poll earlier this year on community safety put crime and anti-social behaviour at the top of public concerns, with one respondent citing ‘being able to walk the streets in safety…’ - which certainly sounds like freedom[9].
Indeed, the counter-argument from the police, backed by the government, is that facial recognition helps to tackle crime[2]. However, this argument holds only if FRT cameras can accurately identify individuals - a feat they are currently far from achieving. The technology has an appalling track record of misidentifying people of colour and women, and has even led to wrongful arrests[10]. Imagine being mistaken for a criminal: your freedom to work, travel, and access credit could be jeopardised. The consequences are serious and far-reaching.
FRT also fails to correctly identify trans and nonbinary individuals[11]. When researchers put facial recognition systems from major tech players like IBM, Amazon, Microsoft, and Clarifai to the test, they discovered a staggering 38% misidentification rate of trans men as women[12]. Astoundingly, the software failed to recognise nonbinary, agender, or genderqueer individuals 100% of the time[12]. This disproportionate rate of misidentification not only hints at discrimination, but also risks fuelling unjust interrogations and arrests among these very populations. Regrettably, police forces face little accountability for these injustices[10]. Not a very safe environment by any stretch.
In addition, the disparity between these findings and police claims about reliability is suspicious. While the Met said its algorithm had improved hugely in accuracy, claiming a success rate of 70%, an external auditor put it at just 19%[13]. In a landmark 2020 case, the Court of Appeal ruled South Wales Police’s use of facial recognition unlawful, citing the European Convention on Human Rights[14]. If the courts have found its use unlawful, you have to wonder why police forces are still deploying it. Yet there is currently no UK law specifically regulating the use of facial recognition cameras[15].
Unfortunately, unchecked proliferation of this perilous technology has bypassed critical oversight from Parliament. Police forces continue to make unilateral decisions, from choosing to adopt the technology to determining what safeguards to put in place. In 2022, the UK Government dismissed a House of Lords report urging the introduction of regulations and mandatory training to mitigate the adverse effects of surveillance technologies on human rights and the rule of law[16].
This lack of action could be attributed to a mismatch between the pace of technological advance and that of bureaucracy[17], to competing stakeholder concerns or corporate influence[18], or to the difficulty of reconciling any use of FRT with basic data protection standards[19]. However, other countries face the same conundrums: the draft EU AI Act would ban law enforcement’s use of real-time FRT in public spaces[20], and the European Parliament recently voted for a non-binding resolution banning police from using FRT in public places[21].
In 2019, San Francisco became the first US city to ban police use of FRT[22], citing oppressive risks that outweigh the benefits[23]. This bold move is a testament to a city prioritising the protection of its citizens from potential government misuse. Critics and civil liberties groups such as Liberty agree: extensive surveillance is incompatible with a healthy democracy[24].
This suggests the problem in the UK may be one of political will rather than practicality[25]. That is not entirely surprising, considering London is the world's most surveilled city outside China[26]. Compounding this, researchers warn that FRT's ‘chilling effect’ may discourage individuals from exercising basic rights, such as protesting, out of fear of repercussions[10].
In the end, it is clear that, rather than protecting people's security, FRT raises serious concerns about privacy intrusion, abuse of power, and discrimination against minority groups. Even if its accuracy were improved, the trade-off would remain abysmal; how could these deep and fundamental problems be reconciled? Perhaps only by diluting our notion of human rights.
Given the risks, the only viable solution is to ban the police use of FRT in the UK. Now is the time for us to rally against this Orwellian menace, and gather unequivocal public opposition.
Your local MP represents you and is duty bound to advocate for your voice. Contact them, interact with advocacy groups, attend public consultations, and sign petitions. Only by taking action together can we put an end to this invasive practice and protect our society’s fundamental rights.
Take action: Contact your MP; Liberty Human Rights petition; Stop the Met Police using facial recognition surveillance.
(1320 words)
Reference list:
1. Galič, Maša, Tjerk Timan, and Bert-Jaap Koops. “Bentham, Deleuze and beyond: An Overview of Surveillance Theories from the Panopticon to Participation.” Philosophy & Technology 30, no. 1 (2016): 9–37. https://doi.org/10.1007/s13347-016-0219-1.
2. Collings, Paige, and Matthew Guariglia. “Ban Government Use of Face Recognition in the UK.” Electronic Frontier Foundation, September 26, 2022. https://www.eff.org/deeplinks/2022/09/ban-government-use-face-recognition-uk.
3. Skelton, Sebastian Klovig. “Met Police Purchase New Retrospective Facial-Recognition System: Computer Weekly.” ComputerWeekly.com. ComputerWeekly.com, October 1, 2021. https://www.computerweekly.com/news/252507569/Met-Police-purchase-new-retrospective-facial-recognition-system.
4. Sabbagh, Dan. “Facial Recognition Row: Police Gave King's Cross Owner Images of Seven People.” The Guardian. Guardian News and Media, October 4, 2019. https://www.theguardian.com/technology/2019/oct/04/facial-recognition-row-police-gave-kings-cross-owner-images-seven-people.
5. “Report to the Mayor of London.” Accessed May 7, 2023. https://www.london.gov.uk/sites/default/files/040910_letter_to_unmesh_desai_am_report_re_kings_cross_data_sharing.pdf.
6. “Facial Recognition.” INTERPOL. Accessed May 7, 2023. https://www.interpol.int/en/How-we-work/Forensics/Facial-Recognition.
7. Dearden, Lizzie. “Man Fined £90 after Covering Face during Facial Recognition Trial in London.” The Independent. Independent Digital News and Media, February 1, 2019. https://www.independent.co.uk/news/uk/crime/facial-recognition-cameras-technology-london-trial-met-police-face-cover-man-fined-a8756936.html.
8. Thomas, Leslie. “The Brixton Riots: Policing the Black Community in the Last 40 Years ...,” February 3, 2022. https://www.gresham.ac.uk/sites/default/files/transcript/2022-02-03-1800_THOMAS-T.pdf.
9. “Public Polling on Community Safety.” GOV.UK. Accessed May 7, 2023. https://www.gov.uk/government/publications/public-polling-on-community-safety/public-polling-on-community-safety.
10. “UK Police Fail to Meet 'Legal and Ethical Standards' in Use of Facial Recognition.” University of Cambridge, October 27, 2022. https://www.cam.ac.uk/research/news/uk-police-fail-to-meet-legal-and-ethical-standards-in-use-of-facial-recognition.
11. Schwartz, Adam. “Resisting the Menace of Face Recognition.” Electronic Frontier Foundation, November 17, 2021. https://www.eff.org/deeplinks/2021/10/resisting-menace-face-recognition.
12. Millar, Molly. “Facial Recognition Technology Struggles to See Past Gender Binary.” Reuters. Thomson Reuters, October 30, 2019. https://www.reuters.com/article/us-usa-lgbt-facial-recognition/facial-recognition-technology-struggles-to-see-past-gender-binary-idUSKBN1X92OD.
13. “UK Police Use of Live Facial Recognition Unlawful and Unethical, Report Finds.” The Guardian. Guardian News and Media, October 27, 2022. https://www.theguardian.com/technology/2022/oct/27/live-facial-recognition-police-study-uk#:~:text=Facial%20recognition%20technology%20measures%20dozens%20of%20distinguishable%20features%20on%20the%20face&text=Inside%20UK%20law%20enforcement%20LFR,an%20individual%20and%20track%20them.
14. Dearden, Lizzie. “Facial Recognition Has Been Used Unlawfully and Violated Human Rights, Court of Appeal Rules in Landmark Case.” The Independent. Independent Digital News and Media, August 13, 2020. https://www.independent.co.uk/news/uk/home-news/facial-recognition-unlawful-violation-human-rights-court-of-appeal-a9664441.html.
15. Porter, Tony. “What's Right and Wrong with Facial Recognition.” Biometric Technology Today 2021, no. 4 (2021): 5–7. https://doi.org/10.1016/s0969-4765(21)00047-3.
16. Loe, Molly. “Digital Surveillance Might Not Be the Answer for Smarter Cities.” TechHQ, February 9, 2023. https://techhq.com/2023/02/the-dangers-of-digital-surveillance/.
17. Faraldo Cabana, Patricia. “Technical and Legal Challenges of the Use of Automated Facial Recognition Technologies for Law Enforcement and Forensic Purposes.” Artificial Intelligence, Social Harms and Human Rights, 2023, 35–54. https://doi.org/10.1007/978-3-031-19149-7_2.
18. Sarabdeen, Jawahitha. “Protection of the Rights of the Individual When Using Facial Recognition Technology.” Heliyon 8, no. 3 (2022). https://doi.org/10.1016/j.heliyon.2022.e09086.
19. “What Is Police Facial Recognition and How Do We Stop It?” Liberty, August 11, 2022. https://www.libertyhumanrights.org.uk/issue/what-is-police-facial-recognition-and-how-do-we-stop-it/.
20. Madiega, Tambiama, and Hendrik Mildebrath. “Regulating Facial Recognition in the EU - European Parliament,” September 2021. https://www.europarl.europa.eu/RegData/etudes/IDAN/2021/698021/EPRS_IDA(2021)698021_EN.pdf.
21. Heikkilä, Melissa. “European Parliament Calls for a Ban on Facial Recognition.” POLITICO. POLITICO, October 8, 2021. https://www.politico.eu/article/european-parliament-ban-facial-recognition-brussels/.
22. Conger, Kate, Richard Fausset, and Serge F. Kovaleski. “San Francisco Bans Facial Recognition Technology.” The New York Times. The New York Times, May 14, 2019. https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html#:~:text=San%20Francisco%20banned%20the%20use,on%20the%20streets%2C%20in%20parks.
23. “Why San Francisco Banned the Use of Facial-Recognition Technology.” The Economist. The Economist Newspaper, May 16, 2019. https://www.economist.com/democracy-in-america/2019/05/16/why-san-francisco-banned-the-use-of-facial-recognition-technology.
24. “Facial Recognition.” Liberty, February 2, 2023. https://www.libertyhumanrights.org.uk/fundamental/facial-recognition/.
25. Jakubowska, Ella. “Remote Biometric Identification: A Technical & Legal Guide.” European Digital Rights (EDRi), January 24, 2023. https://edri.org/our-work/remote-biometric-identification-a-technical-legal-guide/.
26. Ingham, Lucy. “Outside China, London Is the Most Surveilled City in the World.” Verdict, August 19, 2019. https://www.verdict.co.uk/most-surveilled-city/.