
Police use of facial recognition unlawfully breached privacy rights, says Court of Appeal ruling

Written on 11/08/2020

A British police force’s use of facial recognition technology has been ruled unlawful in a landmark judgment at the Court of Appeal.

The technology, which allows officers to compare the faces of pedestrians against a database of persons of interest, is now effectively outlawed until it can be brought before parliament.

In a unanimous ruling, three judges found South Wales Police had breached privacy rights, data protection laws and equality legislation by deploying it.

According to the court, there are “fundamental deficiencies” in the legal framework supporting the police’s use of facial recognition technology, which caused the breaches of these rights.

While the Court cautioned that the judgment was “not concerned with possible uses of AFR (Automated Facial Recognition) in the future on a national basis”, the case law it has established will necessarily apply to those uses of the technology as well.

Real-time use of facial recognition software has been controversial as it compares members of a crowd who are not suspected of any crimes against a police database, potentially putting them at risk of arrest if the system delivers a false positive.

Among the concerns expressed by campaigners was that the technology particularly struggled to accurately identify people who weren’t white and male.

“The fact remains, however, that [the police] have never sought to satisfy themselves, either directly or by way of independent verification, that the software programme in this case does not have an unacceptable bias on grounds of race or sex,” the ruling stated on Tuesday.

The low levels of accuracy of the technology, which is provided by NEC Corporation, were highlighted in a report for London’s Metropolitan Police which found that more than four out of every five people it identifies as possible suspects are actually innocent.

That report, commissioned by Scotland Yard, raised “significant concerns” about the use of the technology, and called for the facial recognition programme to be halted.

The case against South Wales Police was brought by Cardiff resident Ed Bridges, 37, alongside campaigning organisation Liberty, after Mr Bridges’ face was scanned while he was out shopping.

Mr Bridges said: “I’m incredibly, ecstatically pleased by today’s judgment on the case I brought with Liberty against the use of automatic facial recognition technology by South Wales Police.

“Automatic facial recognition technology is an intrusive and discriminatory mass surveillance tool.

“It has been used without the public’s consent and often without their knowledge. We should all be able to use public spaces without being subjected to oppressive surveillance.”

South Wales Police had previously said it welcomed the scrutiny of the court on the issue of automated facial recognition (AFR), which is when the public is scanned in bulk, and is not going to appeal Tuesday’s ruling.

Earlier this year, the Met’s senior technologist Johanna Morley told Sky News that huge investment would be needed to upgrade police IT systems in order to ensure the people on these watch lists were there legally.

Liberty lawyer Megan Goulding said: “This judgment is a major victory in the fight against discriminatory and oppressive facial recognition.

“The court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties.

“Facial recognition discriminates against people of colour, and it is absolutely right that the court found that South Wales Police had failed in their duty to investigate and avoid discrimination.”

She added: “It is time for the government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it has no place on our streets.”

South Wales Chief Constable Matt Jukes said: “The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development. I am confident this is a judgment that we can work with.

“Our priority remains protecting the public, and that goes hand-in-hand with a commitment to ensuring they can see we are using new technology in ways that are responsible and fair.”


Analysis: This doesn’t mean the police will never be able to use facial recognition

By Rowland Manthorpe, technology correspondent

This case was the first ever brought against police use of facial recognition, so its result is hugely significant. Its subject wasn’t the police, or even the technology, but the law itself.

Did existing legislation give the police the right to scan hundreds of thousands of faces in order to catch a single criminal? Did it offer sufficient safeguards to protect innocent citizens?

The Court of Appeal decided that it did not, overturning a large part of last year’s original verdict, which found emphatically in favour of the police.

This doesn’t mean the police will never be able to use facial recognition. Rather, it puts the emphasis back on the government.

If it wants to deploy this extraordinarily invasive technology for the purpose of public safety, then it will need to come up with laws that make that possible.

Home Office, it’s over to you.

(c) Sky News 2020: Police use of facial recognition unlawfully breached privacy rights, says Court of Appeal ruling