Express & Star

Big Brother, or protecting the public?

The details are vague and the precise circumstances were never made public. But on May 31, 2017, three days before a high-profile football match, an unidentified man from Cardiff unwittingly became part of police history.

Facial recognition technology measures the distance between key features

It was three days before the city hosted the UEFA Champions League Final, and as part of the security operations for the game South Wales Police was trialling the use of facial recognition cameras. Why it was being used three days before the game has never been explained – it may simply have been officers testing it out ready for the event – but the man, who was wanted on bail, became the first to be arrested using the technology.

A vital tool in the fight against crime, or an infringement of civil liberties? There are plenty of people who will make both arguments, but the one thing we can be certain of is that the technology is growing at breakneck pace, and one way or another it is likely to play a growing role in our everyday lives.

Aside from the increasing use of facial recognition by many online service providers to 'verify' personal data, it has emerged this month that Birmingham's Millennium Point conference centre, the Meadowhall shopping centre near Sheffield, Liverpool's World Museum, and the King's Cross development in London have all been using the equipment.

Meadowhall's owner British Land says the company does not operate facial recognition technology at any of its sites at the moment. But a spokeswoman admits that the centre did, in conjunction with the police, run a 'short trial' last year, adding that all the data was deleted immediately afterwards.

The comments have done little to reassure Big Brother Watch chief executive Silkie Carlo, who says: "There is an epidemic of facial recognition in the UK. The collusion between police and private companies in building these surveillance nets around popular spaces is deeply disturbing."

Information Commissioner Elizabeth Denham shares some of these concerns, and has announced an investigation into the use of the technology at the sprawling King's Cross development, which encompasses 50 separate buildings.

She says: “We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King's Cross area of central London, which thousands of people pass through every day.

“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law."

Last week Manchester City FC was forced to deny reports it was planning to use automatic facial recognition (AFR) at its home ground, the Etihad stadium, in an effort to reduce queuing at the turnstiles on matchdays. People who have used the 'fast-track' passport lanes at airports may question whether it would do anything to reduce the length of queues, given the scope for error in recognising faces. But while there is still a long way to go, there is no doubt that the technology is improving.

Independent tests by the US National Institute of Standards and Technology (NIST) found that between 2014 and 2018 facial recognition systems became 20 times more accurate at finding a match in a database of 12 million portrait photographs. The failure rate fell from four per cent to 0.2 per cent over the same period, as a result of developments described as an 'industrial revolution' in facial recognition.
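The '20 times more accurate' figure follows directly from the two failure rates quoted; a quick illustrative check:

```python
# Failure rates quoted from the NIST tests of facial recognition systems.
failure_rate_2014 = 0.04   # 4 per cent of searches failed in 2014
failure_rate_2018 = 0.002  # 0.2 per cent failed in 2018

improvement = failure_rate_2014 / failure_rate_2018
print(f"Improvement factor: {improvement:.0f}x")  # prints "Improvement factor: 20x"
```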

But critics argue that these tests do not reflect 'real world' conditions of varying picture quality and out-of-date file photographs. Last month the Metropolitan Police ended a trial after it was found that the cameras identified the wrong people 81 per cent of the time.

In June this year, the Home Office announced that West Midlands Police would be one of three forces that would trial the use of AFR to help track down missing and vulnerable people. But despite the offer of Government funding, the force this week announced it would not be going ahead with the scheme. Indeed, a spokesman for the force denied that it had ever agreed to take part in the first place.

By contrast, the South Wales force has been very much at the forefront of the technology, and this month announced that 50 officers would be issued with a facial-recognition phone app, despite a legal challenge currently going through the courts, brought by a man who objected to being photographed by officers while out shopping.

Civil rights group Liberty said it was "shameful" the South Wales force was using the technology while court cases were ongoing.

Deputy Chief Constable Richard Lewis says: "This new app means that, with a single photo, officers can easily and quickly answer the question of 'are you really the person we are looking for?'.

"Officers will be able to access instant, actionable data, allowing them to identify whether the person stopped is, or is not, the person they need to speak to, without having to return to a police station."

But the force's use of the technology has not been without its problems. A study by Cardiff University found that the force's NEC NeoFace system froze, lagged and crashed when the screen was full of people, and performed worse on gloomy days and in the late afternoon because the cameras ramped up their light sensitivity, making footage more 'noisy'.

During 55 hours of deployment the system flagged up 2,900 potential matches, of which 2,755 proved to be false. While scanning the crowds at Welsh rugby matches, the system spotted one woman on the South Wales Police 'watch list' 10 times. On none of those occasions was it actually her.
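Those figures imply that the overwhelming majority of alerts were wrong, which can be checked directly from the numbers quoted in the report:

```python
# Figures quoted from the Cardiff University study of the trial.
total_alerts = 2900   # potential matches flagged over 55 hours
false_alerts = 2755   # alerts that proved to be false

false_alarm_rate = false_alerts / total_alerts
genuine_matches = total_alerts - false_alerts

print(f"False alarms: {false_alarm_rate:.1%}")   # prints "False alarms: 95.0%"
print(f"Genuine matches: {genuine_matches}")     # prints "Genuine matches: 145"
```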

South Wales Police used the system in 18 arrests, but the report does not say how many were charged.

Responding to a request under the Freedom of Information Act in December last year, West Mercia Police confirmed that it had not paid for the use of any facial recognition technology. However, the West Midlands force declined to answer the question, citing operational security, although the force's refusal to take part in the missing persons scheme would suggest it is not using the system at the moment.

The technology is not new. As long ago as 1964, American mathematician Woody Bledsoe began working on a system that used a computer to recognise human faces. However, because his work was funded by an unidentified intelligence agency, it did not come to widespread public attention. As a general rule, the cameras measure the distances between the eyes, nose, mouth and other features to produce a unique mathematical representation of each face. The technology is already used extensively in China.
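The basic idea of comparing such measurements can be sketched in a few lines. This is a deliberately simplified illustration with made-up numbers, not any vendor's actual algorithm; real systems such as NEC NeoFace use far richer features learned by neural networks:

```python
import math

# Hypothetical illustration: each face is reduced to a short vector of
# measurements (e.g. spacing between eyes, nose and mouth, normalised
# for face size), and faces are compared by the distance between vectors.

def face_distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(probe, candidate, threshold=0.1):
    """Declare a match when the vectors are closer than the threshold.

    The threshold trades false alarms against missed matches: set it too
    loosely and the system flags the wrong people, as in the trials
    described above.
    """
    return face_distance(probe, candidate) < threshold

# Made-up feature vectors for a watch-list entry and two passers-by.
watchlist_face = [0.42, 0.31, 0.27]
passerby_a = [0.43, 0.30, 0.28]   # very similar measurements -> flagged
passerby_b = [0.60, 0.45, 0.10]   # clearly different face -> ignored

print(is_match(watchlist_face, passerby_a))  # prints "True"
print(is_match(watchlist_face, passerby_b))  # prints "False"
```

The choice of threshold is where the real-world difficulty lies: poor lighting and 'noisy' footage push genuine matches further apart, so loosening the threshold to catch them also multiplies false alarms.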

What is new is the way that developments in artificial intelligence and the internet have facilitated the growth in the technology. And while the authorities and private industry might be subject to data protection laws regarding the images they keep on their own files, the growth of social media also provides a rich harvest of photographs that can be used to create unofficial 'watch lists'.

Whatever the merits or otherwise of facial recognition technology, it cannot be un-invented.

Striking the balance between protecting the public and protecting our liberties is a debate that is not going to go away.