South Wales police lose landmark facial recognition case

Photograph: Matthew Horwood/Getty Images

Campaigners are calling for South Wales police and other forces to stop using facial recognition technology after the court of appeal ruled that its use breached privacy rights and broke equalities law.

The demand by Liberty followed a landmark ruling in a case brought by Ed Bridges, a civil liberties campaigner, who had argued that the capturing of thousands of faces by the Welsh force was indiscriminate and disproportionate.

On two key counts, the court found that Bridges’ right to privacy under article 8 of the European convention on human rights had been breached; on a third, it found that the force had failed to properly investigate whether the software exhibited any race or gender bias.

Liberty, which was a party to the case in support of Bridges, said the judgment amounted to a “damning indictment” and called on the force to “end immediately” its use of facial recognition software as a result.

Louise Whitfield, Liberty’s head of legal casework, said: “The implications of this ruling also stretch much further than Wales and send a clear message to forces like the Met that their use of this tech could also now be unlawful and must stop.”

But the South Wales police force said it was confident that “this is a judgment that we can work with” and said its use of the technology – a pilot for forces across England and Wales – was expected to continue with minor changes.

It said live deployments in the force’s area had resulted in 61 people being arrested for offences including robbery, violence and theft, as well as on outstanding court warrants. No unlawful arrests had been made, it added.

Facial recognition technology maps faces in crowds and compares them to images of people on a watchlist, which can include suspects, missing people and other persons of interest to the police.
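In essence, such systems reduce each detected face to a numeric signature (an “embedding”) and flag a match when its similarity to a watchlist entry crosses a tuned threshold. The sketch below illustrates that matching step in broad terms only; the embeddings, names and 0.6 threshold are invented for illustration and bear no relation to the internals of NEC’s NeoFace Watch, which are not public.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face, watchlist, threshold=0.6):
    """Return the (name, score) of the best watchlist match above
    `threshold`, or None if nothing crosses it. The 0.6 default is a
    hypothetical operating point, not any real system's setting."""
    best_name, best_score = None, threshold
    for name, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_name else None

# Toy demonstration with random 128-dimensional embeddings standing in
# for the output of a face-detection model.
rng = np.random.default_rng(0)
watchlist = {
    "suspect_a": rng.normal(size=128),
    "missing_b": rng.normal(size=128),
}
probe = watchlist["suspect_a"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(match_against_watchlist(probe, watchlist))
```

Where that threshold sits, and whether match accuracy holds up equally across demographic groups, is precisely the kind of question the court said the force had failed to investigate.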

Other forces have begun to follow South Wales’s lead. In February, the Met announced plans to deploy live systems, which automatically scan against 5,000 biometric profiles, in shopping centres and other crowded areas of London.


Bridges’ case had previously been rejected by the high court, but the court of appeal ruled in his favour on three counts.

It held that Bridges’ right to privacy, under article 8 of the European convention on human rights, was breached because there was “too broad a discretion” left to police officers in applying the technology. South Wales police also breached their public sector equality duty, the judges concluded, by failing to properly investigate whether the facial recognition algorithms were biased in terms of race or sex.

Civil liberties campaigner Ed Bridges, who brought the case. Photograph: PA

In a key passage, the judges concluded: “We would hope that, as AFR [automatic facial recognition] is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.”

Daragh Murray, a senior lecturer at the school of law at Essex University, said he believed the judgment meant police forces could not be confident that their use of the technology would survive legal challenge in future trials. “The only way this can be resolved for sure is if proper legislation is now introduced.”

Concerns have been raised that facial recognition technology can be biased in terms of race and gender, in particular that it is less accurate at identifying black faces – although few studies have been conducted by the authorities in the UK.

In 2018, a researcher at MIT’s Media Lab in the US concluded that software supplied by three companies made mistakes in 21% to 35% of cases for darker-skinned women. By contrast, the error rate for light-skinned men was less than 1%.

Following the ruling, Matt Jukes, the chief constable of South Wales police, said he recognised that “public confidence, fairness and transparency are vitally important”, and that “academic analysis” had been commissioned to look at the performance of the NeoFace Watch software, supplied by the Japanese firm NEC.

A hearing last month was told the South Wales force had already captured 500,000 faces, the overwhelming majority belonging to people not suspected of any wrongdoing. Bridges’ face was scanned while he was doing Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.

The 37-year-old former Liberal Democrat councillor said: “For three years now South Wales police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”

South Wales police said it did not intend to take the case to the supreme court, but would instead refine its policies. “The court of appeal’s judgment helpfully points to a limited number of policy areas that require this attention,” Jukes said.

Last year, the King’s Cross Central Limited Partnership became one of the first property companies to say it had used facial recognition software, for reasons of “public safety”, in two street cameras at its London site until 2018. After an outcry, it said it had abandoned plans to deploy the controversial technology more widely.