ARTICLE 19 has joined 26 human rights, racial justice, migrants’ rights and civil liberties organisations in calling on the United Kingdom’s Prime Minister Keir Starmer to reconsider plans to expand the use of facial recognition surveillance in response to recent public disorder across the country.
In a letter dated 9 August 2024 and sent by email, the groups argue that live facial recognition surveillance would not make the UK safer and that, on the contrary, it puts rights and the democratic health of the country further at risk.
The letter also asks Starmer to meet with the organisations to ‘discuss the rights and equalities impacts of this AI mass surveillance’.
The letter follows.
9 August 2024
The Prime Minister Rt Hon Sir Keir Starmer MP
10 Downing Street
Westminster
London
SW1A 2AA
cc: Rt Hon Yvette Cooper MP
Secretary of State for the Home Department
By email only
Dear Prime Minister,
We are writing to you as a coalition of human rights, racial justice, migrants’ rights and civil liberties groups following your statement on the recent scenes of public disorder across the country and plans to introduce a ‘wider deployment of facial recognition technology’.[1] We are alarmed by the horrendous scenes of violence and chilling Islamophobic and racist attacks that have been taking place, which jeopardise the safety not only of those directly attacked but of marginalised communities across the country. Whilst we urge you to take robust action to stop the violence, protect our communities and bring those responsible for this criminal behaviour to justice, we have serious concerns regarding the use of facial recognition surveillance and urge you to drop any plans to expand police use of live facial recognition surveillance in particular.
When used in the context of policing, facial recognition technology (FRT) is a highly controversial form of biometric surveillance that currently faces restrictions and blanket bans in cities, states and countries in the United States and Europe,[2] due to the serious threat it poses to privacy, freedom of expression and freedom of assembly. Live facial recognition (LFR) cameras subject thousands of passers-by to unwarranted biometric identity checks and invert the democratic principle of the presumption of innocence by scanning and comparing the faces of members of the public en masse against a police watchlist. Despite the significant police resources expended on each deployment, LFR continues to have issues with accuracy,[3] bias and discrimination.[4] On the day of your announcement to expand the police’s use of facial recognition surveillance, the EU’s AI Act came into force, which broadly prohibits live facial recognition.[5] Should UK police forces expand the use of live facial recognition under your leadership, it would make our country an outlier in the democratic world.
It is notable that there is no explicit legal basis for FRT use by the police and it has never been debated by Parliament. The law governing authorities’ uses of facial biometrics is wholly inadequate, as identified by Matthew Ryder KC’s review (the Ryder Review),[6] and individuals’ rights to privacy, free expression and freedom of assembly are threatened by the use of LFR in particular. As you outlined in your address on 1st August, upholding the rule of law is paramount in times of crisis. Live facial recognition, however, operates in a legal and democratic vacuum, and it is our view that its use for public surveillance is not compatible with the European Convention on Human Rights. As a public body, the police are under a duty to uphold human rights, as outlined in Section 6 of the Human Rights Act. The legality of the Metropolitan Police Service’s use of LFR is currently subject to a legal challenge by an anti-knife crime community worker in London, Shaun Thompson, after he was misidentified by the technology and subjected to wrongful police questioning and threats in the London Bridge area earlier this year.[7] In Mr Thompson’s words, ‘instead of working to get knives off the streets like I do, police were wasting their time with technology (that) had made a mistake’. In the wake of the shocking stabbings in Southport, we believe these words bear very strong significance. In 2020, the Court of Appeal ruled that South Wales Police had unlawfully deployed LFR surveillance.[8]
We join you in condemning the racist, violent and disorderly scenes across the country. However, rushing in the use of a technology that has a seriously negative bearing on our rights and freedoms would not only fail to address the causes of this dangerous violence, but would set a chilling precedent, threaten the democratic rights of the very communities you are seeking to protect, and undermine Labour’s commitment to protecting human rights[9] and the UK’s legal obligation to protect and uphold human rights under international law.
Live facial recognition surveillance would not make us, or the communities we represent, feel any safer. On the contrary, it puts our rights and the democratic health of the country more at risk. We urge you to rethink your plans to expand the use of facial recognition surveillance in the UK and would ask that you meet with us, as you have with police chiefs, to discuss the rights and equalities impacts of this AI mass surveillance. We look forward to hearing from you.
Yours sincerely,
Silkie Carlo, Director, Big Brother Watch
Antonia Lee, Stop the Scan Project Co-ordinator, Racial Justice Network
Chantal Joris, Senior Legal Officer, ARTICLE 19
Christina Tanti, Head of Research, Race Equality First
Deborah Coles, Executive Director, INQUEST
Fizza Qureshi, Chief Executive Officer, Migrants’ Rights Network
Gus Hosein, Executive Director, Privacy International
Habib Kadiri, Executive Director, StopWatch
Ilyas Nagdee, Racial Justice Director, Amnesty International
Jen Persson, Director, Defend Digital Me
Sara Chitseko, Pre-Crime Programme Manager, Open Rights Group
Kevin Blowe, Campaigns Coordinator, Network for Police Monitoring (Netpol)
Liz Fekete, Director, Institute of Race Relations
Minnie Rahman, Chief Executive, Praxis, for Migrants and Refugees
Nik Williams, Policy and Campaigns Officer, Index on Censorship
Romain Lanneau, Researcher, Statewatch
Shameem Ahmad, Chief Executive Officer, Public Law Project
Stephanie Needleman, Legal Director, JUSTICE
Tracey Bignall, Director of Policy and Engagement, The Race Equality Foundation
Yasmin Halima, Executive Director, The Joint Council for the Welfare of Immigrants
Access Now
European Network Against Racism
Faz Amnesty
Northern Police Monitoring Project
No Tech for Tyrants
Revolving Doors
Street Fathers
1. https://www.gov.uk/government/speeches/prime-minister-keir-starmers-statement-in-downing-street-1-august
2. See, for example, the outright bans in Belgium and Luxembourg and US state legislation such as the Illinois Biometric Information Privacy Act and the Texas Biometric Privacy Law. ‘Montana law restricting facial recognition use by police, public agencies takes effect’ – Chris Burt, Biometrics Update, 5th July 2023; ‘San Francisco is first US city to ban facial recognition’ – BBC News, 15th May 2019: https://www.bbc.co.uk/news/technology-48276660
3. To date, South Wales Police’s and the Metropolitan Police’s use of LFR has produced 75% incorrect matches since the technology was first introduced.
4. The Biometrics and Forensics Ethics Group warned that UK police’s use of LFR technology has the “potential for biased outputs and biased decision-making on the part of system operators.” See Biometrics and Forensics Ethics Group, Interim report, February 2019.
5. ‘Artificial Intelligence Act: deal on comprehensive rules for trustworthy AI’ – European Parliament, 9th December 2023: https://www.europarl.europa.eu/news/en/press-room/20231206IPR15699/artificial-intelligence-act-deal-on-comprehensive-rules-for-trustworthy-ai
6. https://www.adalovelaceinstitute.org/project/ryder-review-biometrics/
7. BBC, ‘I was misidentified as a shoplifter by facial recognition tech’: https://www.bbc.co.uk/news/technology-69055945
8. Liberty, Legal Challenge: Ed Bridges v South Wales Police: https://www.libertyhumanrights.org.uk/issue/legal-challenge-ed-bridges-v-south-wales-police/
9. https://labour.org.uk/change/britain-reconnected/