The Metropolitan Police should reconsider their use of facial recognition technology in light of a landmark judgement from the Court of Appeal, London Assembly members have said.

Last Tuesday (August 11), the court ruled that South Wales Police, which also uses the controversial software, breached privacy rights and equality laws.

Now Assembly members say the Met should adapt its own use of the technology accordingly.

Facial recognition compares live images of the public with a watchlist of people, which can include those wanted for crimes or believed to be at risk.

It uses biometric mapping, measuring tiny details of faces, to suggest matches with the watchlist.
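In broad terms, that matching step works by reducing each face to a vector of measurements (an “embedding”) and comparing it with the stored embeddings of everyone on the watchlist. The Python sketch below illustrates the idea only; the embeddings, names, similarity measure and threshold are all invented for illustration and do not represent NEC’s NeoFace Watch or the system used by either force.

```python
import numpy as np

# Illustrative sketch of watchlist matching: faces are represented as
# numeric "embedding" vectors, and a live image is matched to the
# watchlist entry with the highest similarity above a cut-off.
# All values here are hypothetical stand-ins.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(live_embedding, watchlist, threshold=0.9):
    """Return the best watchlist entry whose similarity clears the threshold."""
    best_name, best_score = None, threshold
    for name, stored_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy watchlist of pre-computed embeddings (random stand-ins).
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}

# A live camera frame would be reduced to the same kind of vector;
# here we simulate a near match by adding a little noise.
live = watchlist["person_a"] + rng.normal(scale=0.05, size=128)

print(match_against_watchlist(live, watchlist))  # ('person_a', ~0.99)
```

Because the system only suggests the closest match above a threshold, how that threshold is set, and how accurate the underlying embeddings are for different groups of people, is central to the bias concerns described below.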

But critics say it infringes the privacy rights of innocent people, and could be biased.

A 2018 study by the Massachusetts Institute of Technology tested three commercial versions of the software and recorded error rates of between 21 and 35 per cent for women of colour, compared with just one per cent for white men.

South Wales Police was challenged in court by Ed Bridges, a campaigner who claims the indiscriminate nature of the technology makes it disproportionate.

His claim was rejected by the High Court, but the Court of Appeal found in his favour on several counts.

It said Welsh police officers’ discretion when using the software was “too broad”, and that the force had failed to properly investigate how it could discriminate on the basis of gender and race.

The Welsh force was the first in Britain to use the technology, which is widespread in the USA.

But in February the Met followed suit – live facial recognition has since been used to scan shoppers on Oxford Street.

Both police forces use NeoFace Watch, supplied by Japanese company NEC.

Responding to the ruling, Liberal Democrat Caroline Pidgeon said the Met should carefully consider the court’s “significant” findings.

“While it is true that the ruling relates to specific uses of live facial recognition technology by South Wales Police, the Met needs to face up to the fact that they have relied on previous judgements in this case when designing and justifying their use of the technology to date,” she explained.

“Without any clear legal framework in place governing the technology, I still firmly believe that its use should be paused.”

Ms Pidgeon said that if the Met would not halt its use of the software, it should at least make “significant changes” to ensure it is fully transparent.

“At a time when face masks are now widespread, it is baffling that the Met are so obsessed with this unproven and expensive technology,” she added.

Green Assembly member Sian Berry accused the Met of “massive mission creep” in its use of facial recognition.

Though the force claims the software is used only to tackle high-profile offences, it could be used in less serious situations in future, she warned.

Ms Berry and Ms Pidgeon wrote to Police Commissioner Cressida Dick in May calling for clarity on how the technology is used.

In a response seen by the Local Democracy Service, Met intelligence director Lindsey Chiswick refused to rule out future use of the software in any situation where there is “a clear, lawful basis”.

Ms Berry fears this could include public events and protests.

“They can bend the rules, they can mission creep, and we have no way to stop them at the moment,” she said.

“This appeal makes it clear that proportionality is on a case by case basis.

“I think there are aspects of that judgement that mean the police need to be setting out who they are using this on.”

A spokesperson for the Met said the force was aware of the ruling, and would “carefully consider” its relevance to operations in the capital.

The Met’s use of facial recognition is “different” from the Welsh force’s because of “the complexity of keeping London safe” and the “different crime issues impacting the capital”, they said.

“The Metropolitan Police Service has been clear that our deployments are intelligence-led, and focused on helping tackle serious crime, including serious violence, gun and knife crime, child sexual exploitation and helping protect the vulnerable,” they said.

“We communicate about each deployment in advance of, during and afterwards to ensure awareness and transparency, we use the latest very accurate algorithm, and we have our own bespoke and carefully considered policy documents.”