Anyone in any location where their biometric facial data is being recorded and stored, and possibly passed on and processed, needs to be alerted to this fact. They also need to give their fully informed consent for this to go ahead, as well as be given the option to opt out. But this didn’t happen at a new development in London’s King’s Cross.
In mid-August, the Financial Times reported that Argent, the property developer of the King’s Cross site, is using facial recognition technology. A few days later, on 15th August, the Information Commissioner’s Office launched an investigation. In a statement, Information Commissioner Elizabeth Denham said that she remains deeply concerned about the use of facial recognition technology in public spaces.
She said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.”
In a comment that seems to pre-empt calls for the use of facial recognition for security purposes, the statement added: “We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”
What data could be more sensitive than one’s physical person? Like Denham, I too am deeply concerned. It is a gross invasion of privacy. The UK is already a country under heavy surveillance, with the number of CCTV security cameras estimated at between four and six million.
Law enforcement in the UK has trialled facial recognition technology in South Wales at shopping centres, outside football stadiums and on high streets. The Metropolitan Police has trialled automated facial recognition technology (AFR) at Notting Hill Carnival, Romford High Street and at the Cenotaph memorial. I attended Notting Hill Carnival over the weekend and saw a wooden tower with a police officer in it. I later heard whispers that it was there to facilitate the use of facial recognition.
It is immensely troubling when this technology is used by the police, but it is almost as troubling when it is carried out by a private, commercial organisation. People go in and out of the adjacent King’s Cross and St Pancras stations approximately 67,000 times a year. That is a lot of faces to process.
How much would it cost in terms of manpower, computing power and the licensing of the technology to gather and store this vast amount of information? I’m guessing that it is not cheap. And so how would Argent make a return on that investment? I imagine the only way would be to sell it on to third parties.
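To get a rough sense of the scale involved, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption of mine, not a number reported by Argent, the FT or the ICO; it simply shows how quickly even modest per-face records add up once they are captured and retained.

```python
# Back-of-the-envelope estimate of the data volume involved in storing face
# detections at a busy site. ALL figures below are illustrative assumptions,
# not reported numbers from Argent, the FT or the ICO.

DETECTIONS_PER_DAY = 67_000   # assumed daily footfall captured by the cameras
TEMPLATE_BYTES = 2_048        # assumed size of one stored biometric template
THUMBNAIL_BYTES = 50_000      # assumed size of one cropped face image
RETENTION_DAYS = 365          # assumed retention period

bytes_per_day = DETECTIONS_PER_DAY * (TEMPLATE_BYTES + THUMBNAIL_BYTES)
total_bytes = bytes_per_day * RETENTION_DAYS

print(f"Per day:  {bytes_per_day / 1e9:.2f} GB")
print(f"Per year: {total_bytes / 1e12:.2f} TB")
```

On these assumptions the system would accumulate terabytes of biometric data within a year, before counting the compute needed to match faces in real time, which is why the question of how that investment is recouped matters.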
Perhaps organisations that want to use this type of technology on the general public, or on private property if it is large enough, should get authorisation from an overseeing body such as the ICO. The organisations would then be obliged to operate under certain conditions, such as clearly notifying users of the space, obtaining consent and explaining the opt-out process. The authorising body could then keep a register of organisations using the technology, keep tabs on them, require them to show what the data is being used for and compel them to delete it if necessary.
An Argent spokesperson did not provide any details about the number of cameras used or the legal basis for the use of this technology, but did say that it was compliant with GDPR. I fully support the ICO’s investigation to find out whether this is really the case.