The perils of Facial Recognition Cameras

Priyanka Mehta Tuesday 18th February 2020 15:32 EST

Is the UK moving towards authoritarian government with the deployment of a China-style model of facial recognition cameras?

On 11th February, the Metropolitan Police began the operational use of facial recognition CCTV cameras at London's Stratford Centre retail complex. Thousands of shoppers were scanned by van-mounted cameras pointed at the doors of the complex, in what was reportedly an "intelligence-led" deployment. This comes despite a scathing assessment of the technology's effectiveness from the very expert who was hired to scrutinise its trials. But what are facial recognition cameras, and why are they being deployed by the government across the country?

Facial recognition cameras (FRCs) catalogue and recognise human faces, typically by recording the unique ratios between an individual's facial features, such as the eyes, nose and mouth. The most controversial strand of the technology involves using facial features as biometric identifiers, that is, identifying individuals from just a photo or video of their face. The government claims that FRCs can be instrumental in curbing the growing number of knife attacks and stabbing-related incidents by helping to identify offenders.

A discriminatory approach akin to stop and search

Saqib Deshmukh is an activist and a youth worker at Voice4Change England, involved with campaigns around policing and stop and search policies. Speaking about his concerns around FRCs, he says,

“I am concerned about the nature of this rollout, especially in terms of how little accountability there is with regards to privacy and data protection. This has been trialled at places like Romford. Previous research has indicated that 93% of those who were stopped during these trials were wrongly identified. Even the independent review commissioned by the Met Police states that this system was accurate in only 1 in 5 cases.

“This means that the actual technology is faulty. It is a policing system similar to stop and search, which has not been tested properly and has been found to be inaccurate. The AI software behind this is also faulty in that it wrongly identifies BAME and ethnic minority people and further risks their isolation.”

Invasion of privacy and data protection

The government, however, believes that this system, fuelled by a large database of labelled data, can enable police to pinpoint a person of interest across a city of networked cameras. This, it argues, would help curb recent terror-related attacks, much as the stop and search approach undertaken by the Met Police is intended to. But the question of an individual's privacy remains unaddressed. Race equality campaigners, including the family of Stephen Lawrence, have previously been spied upon by police, whilst climate activists from Extinction Rebellion were put on an extremism watchlist. It is also notable that while the European Union is considering a temporary ban on the use of facial recognition to safeguard individuals' rights, the UK appears to be moving in the opposite direction.

Rabina Khan is a Liberal Democrat councillor on Tower Hamlets council. Speaking about her concerns around the rollout of FRCs, she says,

“Generally speaking, any technology that is used responsibly to help identify a potential terrorist, a suspect of any crime, or a missing person should be a welcome move. However, introducing this highly sophisticated technology into public spaces without prior public debate is bound to cause concern. When applying this technology, it should be intelligence-led and ethically implemented.

“The cameras can only identify faces, not prevent knife attacks or terror threats. In this instance, they would only be helpful if a suspect was seen brandishing a knife or committing an offence, which can also be captured on existing CCTV cameras. Despite the above, I can understand some people's concerns about invasion of privacy and the way in which this technology will be used.

“There is also the question of accuracy and the fact that the technology is not yet advanced enough to be completely reliable. It is important, therefore, that this information is not stored or used in any way that will be detrimental to innocent people. Some people have compared it to a virtual identity parade. In trials carried out between 2016 and 2018, for example, there was a 96% rate of false positives and only 8 arrests from a facial recognition match.”

Additionally, facial recognition CCTV cameras could identify and record who attends a protest. They could automatically flag suspicious behaviour, or people who look a certain way, which could be particularly problematic for groups already stopped and searched disproportionately. This is precisely how the technology is already being used in China and elsewhere.

Following the deployment in London, Commander Mark McEwen, the Met's lead on crime prevention, said Stratford had been chosen because it had been the scene of “public space violence”, and that there was support from the community for the police to use “whatever tactic we can to deal with violence”.


