The Metropolitan Police have been getting too up close and personal with the citizens of London for my liking. Residents and commuters in Romford were subjected to a sting operation by the Met, which trialled biometric facial recognition technology on Thursday 31 January, three days after international Data Privacy Day. As a result there were eight arrests: three of people wanted for violent offences, while the remaining five “were proactive arrests as part of the wider operation.”
One man was fined £90 after covering his face in the vicinity of the cameras. The police claimed he was acting suspiciously and stopped him, then stated that he became “aggressive and made threats towards officers.”
I am not a fan. I don’t think it should be happening. This is because facial recognition technology is renowned for its inaccuracy.
It has been well documented that facial recognition technologies are woefully wrong when it comes to identifying Black women and have much higher accuracy rates with males of European descent.
There was also the famous example of the Chinese woman whose iPhone X could not differentiate her from her colleague, duly unlocking for the person who was not the owner. This creates two problems.
White males are more likely to be (correctly) identified and apprehended with this kind of technology. Furthermore, males of other ethnicities and women of all ethnicities are more likely to come up as a false positive. These are both bad outcomes.
I don’t know how the Met would deal with a false positive, but I imagine it would involve the misidentified person having much closer contact with the police than they otherwise would in order to clear their name. If the person were to go to prison, it could have devastating consequences for their physical and mental health, as well as a serious impact on their loved ones. If they were later exonerated, their career would be in tatters; they would need therapy and a sizeable chunk of compensation. Any conviction or criminal record resulting from misidentification is a heinous miscarriage of justice.
Supporters would say that this type of technology is just an extension of the CCTV that we Brits have accepted over the last 15 years without much fuss. It could mean that police would be freed from the laborious task of tracking down and identifying suspects, making apprehension easier. They would then be given the time to focus on more intelligence-based work such as infiltrating drug and human trafficking gangs and bringing down corrupt officials.
I would love for these things to happen but not at the expense of the freedom of innocent people who would have their names and reputations tarnished because all people of colour look alike to the machines. People could be sent to prison erroneously, lives and livelihoods could be wrecked. I don’t think that is a price worth paying. The Innocence Project Network is an affiliation of 69 non-profit organisations around the world that work to exonerate wrongfully convicted people. Their caseload would surely increase if flawed facial recognition technology were deployed on a massive scale.
Ultimately the problem stems from the training data used to teach facial recognition algorithms having an over-representation of white male faces. Until the faces in the training data become more reflective of the real world, facial recognition algorithms will continue to work well for only a small segment of the population. While that is the case, the risk of misidentification and injustice is too great to deploy this technology in law enforcement.
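To make the point concrete, here is a deliberately simple toy model of how under-representation in training data translates into unequal false positive rates at deployment. All the numbers and group names are hypothetical, chosen only to illustrate the shape of the problem, not measured from any real system.

```python
# Toy model (illustrative only): assume a matcher's false positive rate
# scales inversely with a group's share of the training data.
# The groups, shares, and base error rate below are all hypothetical.
training_share = {"group_a": 0.80, "group_b": 0.20}

def false_positive_rate(share, base_error=0.10):
    # Groups with a smaller training share get a proportionally
    # higher error rate in this toy model.
    return base_error * (max(training_share.values()) / share)

rates = {g: false_positive_rate(s) for g, s in training_share.items()}
print(rates)  # group_a: 0.10, group_b: 0.40

# At scale the gap compounds: over 1,000 scans per group, the
# under-represented group would be wrongly flagged 400 times
# versus 100 for the well-represented group.
false_flags = {g: round(r * 1000) for g, r in rates.items()}
print(false_flags)  # {'group_a': 100, 'group_b': 400}
```

Even this crude sketch shows why skewed training data is not a minor calibration issue: the error burden falls disproportionately on exactly the people the data failed to represent.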