Facial recognition: What’s the issue?




Though many people are comfortable using facial recognition on their phones, there is an uphill struggle for its use by government and law enforcement, reports Philip Ingram MBE.

People don’t think twice when they look at their smartphone and swipe up: the little lock symbol opens and the app icons appear on the screen.

In the fraction of a second after swiping up, the phone has captured an image of the user, mapped the features of that image, compared it with the stored representation created when the phone was set up, and either unlocked the phone or asked for a manual passcode.

Facial recognition, and in particular live facial recognition (LFR), is software that maps, analyses and then confirms the identity of a face in a photograph, video or live camera feed – it is one of the most powerful surveillance tools ever made. Yet it is only when it is used by elements of the state that people suddenly become concerned about where and how their facial recognition data is being used.

Professor Fraser Sampson, the Biometrics and Surveillance Camera Commissioner, said: “If we are to harness the significant benefits of emerging technology in this area in a lawful, ethical and accountable way, we need to build trusted surveillance partnerships. To do that, we must be able to trust our surveillance partners in respect of both the human rights and security considerations.”

With phones and other personal devices, you can choose to opt out of using facial recognition, but you have no control when it is the state using the tools. Therefore, society has some big questions to answer regarding how facial recognition should be regulated as well as what privacy sacrifices we are each willing to make.

However, there is a perception from films and TV series that the police and security services can sit in a big control room and track people remotely using facial recognition across the more than 600,000 CCTV cameras in London alone, one for every 14 people, seamlessly following suspects and keeping surveillance operators informed. That capability isn’t available outside the TV or film set; surveillance remains a labour-intensive job, even when enhanced by tools such as facial recognition.

How does facial recognition technology work?

There are three main elements to facial recognition: detection, analysis and recognition. Detection is simply the process of finding a face in an image. Analysis, or attribution, is the step that maps faces. Typically, this measures the distance between the eyes, the shape of the chin and the distance between the nose and mouth, then converts those measurements into a string of numbers or points, often called a “faceprint”. Finally, there is recognition: the attempt to confirm the identity of a person in a photo by matching the faceprint against a known database.
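The recognition step described above can be sketched in a few lines. This is a hypothetical illustration only: the faceprints, names and distance threshold below are invented, and real systems compare learned embeddings with hundreds of dimensions rather than three-number lists.

```python
import math

def euclidean(a, b):
    """Distance between two faceprint vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognise(probe, database, threshold=1.0):
    """Return the closest known identity, or None if nothing is near enough.

    `database` maps identity names to stored faceprints; the probe matches
    only when its nearest stored faceprint lies within the threshold.
    """
    best_name, best_dist = None, float("inf")
    for name, stored in database.items():
        d = euclidean(probe, stored)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Invented example database of two enrolled faceprints.
known = {"alice": [0.1, 0.9, 0.3], "bob": [0.8, 0.2, 0.5]}

print(recognise([0.15, 0.85, 0.35], known))  # near alice's stored faceprint
print(recognise([5.0, 5.0, 5.0], known))     # close to nobody: no match
```

The threshold is the crux of the design: set it too loosely and strangers match enrolled identities (false positives); set it too tightly and genuine matches are missed.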

One of the controversies surrounding use of facial recognition technologies by the state, and especially law enforcement, is the number of false positives it gives. Professor Peter Fussey and Dr Daragh Murray from the Human Rights Centre at the University of Essex were commissioned by the Met Police to examine various trials of what they refer to as LFR, conducted between 2016 and 2019.

The report documented: “Significant operational shortcomings in the trials which could affect the viability of any future use of LFR technology. The researchers also found it ‘highly possible’ that police deployment of LFR technology would be held unlawful if challenged in court. Alerts issued by the technology were verifiably accurate in less than 20% of cases, suggesting significant unnecessary stops, with social and legal consequences.”
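The “accurate in less than 20% of cases” figure is a statement about alert precision: of all the alerts the system raised, fewer than one in five correctly identified the person flagged. The counts below are hypothetical, used only to show the arithmetic; the report itself is the source for the real figures.

```python
def alert_precision(true_matches, total_alerts):
    """Fraction of alerts that correctly identified the person flagged."""
    return true_matches / total_alerts

# Invented counts: 8 verifiably correct identifications out of 42 alerts.
precision = alert_precision(8, 42)
print(f"{precision:.1%}")  # roughly 19%, i.e. under the 20% mark
```

Note that precision says nothing about the people the system never flagged; a deployment can have low alert precision while still missing genuine suspects.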

Considering their findings, Professor Fussey and Dr Murray “called for all live trials of LFR to be ceased until these issues were addressed; noting that it was essential that human rights compliance is ensured before deployment, and that there be an appropriate level of public scrutiny, debate and leadership on a national level before proceeding.”

However, we can’t ‘uninvent’ or ban any new technology, so the questions of what went wrong and how we fix it must be asked. There needs to be a legislative underpinning to ensure the technology is used only when necessary and proportionate, and that the data is properly protected so that people’s freedoms are safeguarded. This is an area Professor Sampson is working hard on.

“A high bar”

The next issue is the data flaws that led to the false positives. The detection phase of facial recognition starts with an algorithm that learns what a face is. Usually, the creator of the algorithm does this by “training” it with photos of faces, and this is where some of the problems begin. The diversity of photos fed into the system has a profound effect on its accuracy during the analysis and recognition steps. Ensuring that the base data underpinning the algorithm’s development reflects not just part of society but all of it, so that bias is removed, is one of the biggest challenges.
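One basic check for the kind of bias described above is to compare error rates across demographic groups on a labelled evaluation set. The sketch below is hypothetical: the group names, counts and rates are invented for illustration, and real evaluations use far larger sets and more refined metrics.

```python
def false_match_rate(false_matches, comparisons):
    """Share of impostor comparisons wrongly reported as a match."""
    return false_matches / comparisons

# Invented evaluation results: same number of comparisons per group,
# but one group suffers seven times as many false matches.
results = {
    "group_a": false_match_rate(3, 10_000),
    "group_b": false_match_rate(21, 10_000),
}

for group, rate in results.items():
    print(f"{group}: {rate:.4%}")
```

A system whose error rates diverge this sharply between groups would fail such a check, which is exactly the kind of disparity that unrepresentative training data produces.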

Another challenge is the fact that the data being processed is biometric data. According to the Information Commissioner: “LFR involves the processing of personal data, biometric data and, in the vast majority of cases seen by the ICO, special category personal data.

“While the use of LFR for law enforcement is covered by Part 3 of the Data Protection Act 2018 (DPA 2018), outside of this context the relevant legislation is the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018).” The legislative framework governing compliance is already complex, whereas a legislative framework governing use is non-existent.

The Information Commissioner has set out the principles behind any use framework by saying: “Central legal principles to consider before deploying LFR are lawfulness, fairness and transparency, including a robust evaluation of necessity and proportionality.” These reflect the discussions Professor Sampson has had across the industry, but the Information Commissioner adds: “These requirements mean that where LFR is used for the automatic, indiscriminate collection of biometric data in public places, there is a high bar for its use to be lawful.”

This article was originally published in the June edition of ISJ.
