On 4th September, the Divisional Court refused the application for judicial review brought by Cardiff resident Ed Bridges.
Bridges challenged the legality of the use of AFR Locate, an application of facial recognition technology, by South Wales Police, on the basis that its use was contrary to the requirements of the Human Rights Act and the Data Protection Act, and that the decision to deploy it failed to take the Equality Act into account.
Article 8 of the Human Rights Act 1998 protects the right to respect for your private and family life, your home and your correspondence. However, Article 8 is a qualified right, which means that a “public authority can sometimes interfere with your right to respect for private and family life if it’s in the interest of the wider community or to protect other people’s rights.”
Since April 2017, South Wales Police has been piloting automatic facial recognition technology with the pilot made up of two projects, AFR Locate and AFR Identify.
Bridges challenged the legality of AFR Locate, which takes digital images of the faces of members of the public from live CCTV feeds and processes them in real time to extract facial biometric information. “That information is then compared with facial biometric information of persons on a watchlist,” according to a press summary of the case. Watchlists are compiled from images on a database maintained by South Wales Police, which has deployed AFR Locate 50 times to date.
The Court concluded that the use of AFR Locate by the South Wales Police met the requirements of the Human Rights Act.
In relation to Data Protection claims, the Court concluded that when the South Wales Police collected and processed images of members of the public, it was collecting and processing their personal data and that this processing of personal data was lawful and met the conditions set out in the legislation.
In response to the ruling, Bridges said he will appeal and vowed to keep fighting against “unlawful use” of “this sinister technology.”
In a statement issued by civil liberties organisation Liberty, which instructed Dan Squires and Aidan Wills of Matrix Chambers on his behalf, Bridges brought up the issue, well known to people in the data world, of informed consent. He said: “South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent.”
Megan Goulding, a lawyer at Liberty, used similarly strong language, saying that the technology is invasive and doesn’t belong in the UK. She stated that the judgement was disappointing and that facial recognition is a very serious threat to our rights and freedoms, and called on the government to recognise the “danger this dystopian technology presents to our democratic values.”
South Wales Police took a somewhat conciliatory stance on the ruling, welcoming the decision while also calling for wider debate on the issue of AI and “face-matching technologies”. While it was revealed that other bodies, including the Ethics Committee, were involved in the decision to deploy the technology, academics and non-partisan think tanks could perhaps also have some input in the future.
Chief constable Matt Jukes said: “With the benefit of this judgement, we will continue to explore how to ensure the ongoing fairness and transparency of our approach. There is and should be a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”
Essentially, Jukes is saying that the police, in South Wales at least, do not want to write their own rulebook. However, this highlights the fact that, as yet, there are few if any rules on the use of automated facial recognition.
The Metropolitan Police’s trial of automated facial recognition technology has come under increased scrutiny since January 2019. During a trial in Romford, east London, earlier this year, a member of the public who covered his face as he passed by was made by the police to uncover it, photographed and issued with a £90 fine.
When London Mayor Sadiq Khan was asked about this at Questions to the Mayor in March, he said: “No-one was stopped for avoiding or refusing to be scanned. Refusing to be viewed by the cameras on its own is not grounds to stop someone.” This is a curious response considering the publicly accessible video footage.
In response to the ruling, the Metropolitan Police said it has trialled AFR on 10 occasions since 2016 “with some notable successes” and will consider all available information before coming to a decision on how the technology may be used in the future.
In a statement, it said: “The implications of this ruling for the MPS will now be carefully considered before a decision is taken on any future use of live facial recognition technology.”
There was no indication of when this decision will take place. Privacy organisation Big Brother Watch is awaiting an announcement on this matter. Director Silkie Carlo said: “We are now waiting for the Metropolitan Police to decide whether they intend to use live facial recognition surveillance again. If they do, we’ll take them to court.”
In regard to the judgement, she called it profoundly disappointing. She referred to an independent review of the use of surveillance by the Met Police, calling it “utterly damning” and “staggeringly inaccurate.”
Carlo also referred to the lack of legislation on the topic, stating: “There has not been a single debate in the House of Commons on live facial recognition nor a single British law that contains the words facial recognition, yet we now have an epidemic of surveillance across the country.” Big Brother Watch is calling for a total ban on the use of the technology.
South Wales Police and Crime Commissioner Alun Michael took a similarly conciliatory tone to that of Jukes, saying that while he is pleased with the ruling, public debate should continue. He said: “The conclusion of the Judicial Review process will not mark the end of the public debate and nor should it.”
An important question to ask is who will be included in the debate, and whether all voices will carry equal weight or some will be heard more than others.