Ousmane Bah, a young black man from New York, was arrested in November in connection with a theft from Apple’s Manhattan store. After Bah was ruled out as the perpetrator in New York, an NYPD detective offered him a theory as to why the young man had been arrested in the first place.
The actual thief, who is also black, had found a photo-less driver’s permit and presented it to authorities when he was caught. Apple then linked the real thief’s actions to Bah’s face and name, which led to him being accused of thefts in Delaware, Massachusetts, New Jersey and New York.
Charges in three jurisdictions have been dropped; however, the case in New Jersey is still pending. The lawsuit states that Bah was “forced to respond to multiple false allegations which led to severe stress and hardship.”
This seems to be a case of putting the wrong name to the face. However, Apple spokespersons have told news outlets that facial recognition systems are not used in its stores.
I’m sceptical. In my view, it is possible that the company had someone manually trawl through hours of footage recorded in its stores around the time the thefts took place to find the person purporting to be Bah. However, considering the high-profile roll-out of Face ID in the iPhone X, I think it is far more likely there was some cross-pollination of its consumer-focused facial recognition technology.
I also think it was naïve, almost gullible, for the corporation to take a criminal caught red-handed at his word about his identity; some fact checking should have taken place. Multi-factor authentication, if you will.
This story raises some important questions about transparency. Bah’s lawyer, Subhan Tariq, said: “The onus is on Apple to answer how did they identify my client as the perpetrator of a crime if they weren’t using facial recognition.”
I have some questions too. Did the tech giant, in fact, use facial recognition technology on its store security footage? What was the level of involvement and collaboration between law enforcement and the iPhone maker when it came to tracking down Bah to his home address? Was any of the information about Bah that was used against him obtained via his use of Apple devices, or even social media? Did a person or a machine work on the basis that all black people look alike?
The issue I see is that a single instance of theft and misidentification had a negative effect on Bah. But as a result of facial recognition, whether manual or automated, Bah was put on the hook for multiple thefts, compounding the harm done to him.
It is certainly possible to see why the teenager might assume that facial recognition systems are not working in his favour.
We all know how artificial intelligence and machine learning, especially when applied to facial recognition, disadvantage certain groups. Facial recognition is most accurate at identifying white men, due to the underrepresentation of the faces of people of colour and women in the data used to train the algorithms. Researcher Joy Buolamwini refers to these people as the ‘undersampled majority’.
Perhaps this lawsuit will serve as a wake-up call for technology companies to improve the accuracy of facial recognition and ensure that all demographic groups are treated fairly and equally. If not, a precedent has been set, and more eye-watering lawsuits will be headed their way.
DataIQ is a trading name of IQ Data Group Limited
Copyright © IQ Data Group Limited 2024