
Police Face Recognition Software Flawed

Following an investigation by campaign group Big Brother Watch, the UK’s Information Commissioner, Elizabeth Denham, has said that the police could face legal action if concerns over the accuracy and privacy of facial recognition systems are not addressed.

What Facial Recognition Systems?

A freedom of information request sent to every police force in the UK by Big Brother Watch shows that the Metropolitan Police used facial recognition at the Notting Hill Carnival in 2016 and 2017 and at a Remembrance Sunday event, and that South Wales Police used the technology between May 2017 and March 2018. Leicestershire Police also tested facial recognition in 2015.

What’s The Problem?

The two main concerns identified by Big Brother Watch and the ICO are that the facial recognition systems are inaccurate at identifying real criminals or suspects, and that images of innocent people are being stored on ‘watch’ lists for up to a month, which could potentially lead to false accusations or arrests.

How Do Facial Recognition Systems Work?

Facial recognition software typically works by taking a scanned image of a person’s face (from the existing stock of police mug shots taken during previous arrests) and using algorithms to measure ‘landmarks’ on the face, e.g. the position of the features and the shape of the eyes, nose and cheekbones. This data is used to build a digital template of the person’s face, which is then converted into a unique code.
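The internal workings of the police systems are not public, but a minimal Python sketch of this template-building step might look like the following (the landmark normalisation and the hash-based code are illustrative assumptions, not the actual algorithm):

```python
import hashlib
import numpy as np

def face_template(landmarks: np.ndarray) -> np.ndarray:
    """Build a simple face template from (N, 2) landmark coordinates.

    Real systems typically learn an embedding with a neural network;
    this toy version just normalises the points for position and scale.
    """
    centred = landmarks - landmarks.mean(axis=0)  # remove position
    scaled = centred / np.linalg.norm(centred)    # remove scale
    return scaled.flatten()                       # the template vector

def template_code(template: np.ndarray) -> str:
    """Convert a template into a short unique code (here, a hash)."""
    return hashlib.sha256(template.tobytes()).hexdigest()[:16]
```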

High-powered cameras are then used to scan crowds. The cameras link to specialist software that can compare the camera image data to data stored in the police database (the digital template) to find a potential ‘match’. Possible matches are then flagged to officers, and these lists of possible matches are stored in the system for up to 30 days.
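Continuing the same hypothetical sketch, matching reduces to comparing the scanned face’s template against every stored template and flagging anything closer than a chosen threshold (the Euclidean distance and the 0.6 cut-off here are illustrative assumptions):

```python
import numpy as np

def find_possible_matches(scanned: np.ndarray,
                          watch_list: dict[str, np.ndarray],
                          threshold: float = 0.6) -> list[str]:
    """Flag watch-list entries whose template is close to the scanned face.

    A smaller distance means a more similar face. Anything under the
    threshold is only a *possible* match for an officer to review,
    not a confirmed identification.
    """
    return [person_id
            for person_id, template in watch_list.items()
            if np.linalg.norm(scanned - template) < threshold]
```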

A real-time automated facial recognition (AFR) system, like the one the police use at events, combines live recognition of faces in video footage with ‘slow time’ searches of static images against the database.

Inaccuracies

The systems used by the police so far have been criticised for simply not being accurate. For example, of the 2,685 “matches” made by the system used by South Wales Police between May 2017 and March 2018, 2,451 were false alarms.
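That works out to a false alarm rate of roughly 91% of all alerts, as a quick calculation shows:

```python
false_alarms = 2451   # false alarms reported by Big Brother Watch
total_matches = 2685  # total "matches" flagged by the system
print(f"{false_alarms / total_matches:.1%}")  # 91.3%
```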

Keeping Photos of Innocent People On Watch Lists

Big Brother Watch has been critical of the police keeping photos of innocent people that have ended up on lists of (false) possible matches, as selected by the software. Big Brother Watch has expressed concern that this could affect an individual’s right to a private life and freedom of expression, and could result in damaging false accusations and / or arrests.
The police have said that they do not consider the ‘possible’ face selections to be false positive matches, because additional checks and balances are applied after a system alert to confirm identification.

The police have also stated that all alerts against watch lists are deleted after 30 days, and faces in the video stream that do not generate an alert are deleted immediately.
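As a rough illustration of that stated retention policy (the alert structure and the ‘created’ field here are hypothetical), such a rule could be expressed as:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # stated retention period for alerts

def purge_expired_alerts(alerts: list[dict], now: datetime) -> list[dict]:
    """Keep only alerts raised within the 30-day retention window.

    Each alert is assumed to carry a 'created' timestamp; faces that
    never generate an alert are not stored at all.
    """
    return [a for a in alerts if now - a["created"] <= RETENTION]
```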

Criticisms

As well as accusations of inaccuracy and of infringing the rights of innocent people, the use of facial recognition systems by the police has attracted criticism for lacking a clear legal basis, oversight or governmental strategy, and for failing to deliver value for money in terms of the number of arrests made versus the cost of the systems.

What Does This Mean For Your Business?

It is worrying that there are clearly substantial inaccuracies in facial recognition systems, and that images of innocent people could sit on police watch lists for some time and potentially result in wrongful arrests. The argument that ‘if you’ve done nothing wrong, you have nothing to fear’ simply doesn’t stand up if police are given cold, hard computer output saying that a person is a suspect who should be questioned or arrested, whatever the circumstances. That argument is also an abdication of shared responsibility, and could amount to giving a green light to the erosion of rights without questions being asked. As people in many other countries would testify, rights relating to freedom and privacy should be valued; once they are gone, they are very difficult to get back.

The storing of facial images on computer systems is also a security matter, particularly since such images are regarded as ‘personal data’ under the new GDPR, which comes into force this month.

There is, of course, an upside to the police being able to use these systems if it leads to the faster arrest of genuine criminals, and makes the country safer for all.

Despite the findings of a YouGov / GMX study (August 2016) showing that people in the UK still have a number of trust concerns about the use of biometrics for security, biometrics represents a good opportunity for businesses to stay one step ahead of cyber-criminals. Biometric authentication / verification systems are thought to be far more secure than password-based systems, which is why banks and credit companies are now using them.

Facial recognition systems have value-adding, real-life business applications too. For example, last year, a ride-hailing service called Careem (similar to Uber but operating in more than fifty cities in the Middle East and North Africa) announced that it was adding facial recognition software to its driver app to help with customer safety.
