Campaigner ‘delighted’ at Court of Appeal facial recognition technology ruling

A civil rights campaigner has expressed delight after the Court of Appeal ruled that a police force's use of facial recognition technology breached privacy rights and data protection laws.

Ed Bridges brought legal action against South Wales Police, arguing that its use of automated facial recognition (AFR) had caused him "distress".

The 37-year-old had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city's Motorpoint Arena in 2018.

In a ruling on Tuesday, three Court of Appeal judges held that the force's use of the technology was unlawful, allowing Mr Bridges' appeal on three of the five grounds he raised in his case.

The ruling does not prevent the force from using the technology, but it means changes must be made to the systems and policies governing its use of AFR.

In a statement, Mr Bridges said he was "pleased" that the court found that "facial recognition clearly threatens our rights".

He said, “This technology is an intrusive and discriminatory mass surveillance tool.

“For three years now South Wales Police has used it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”

South Wales Police said the judicial scrutiny of its "groundbreaking use of the technology" was a "welcome and important step in its development", and that the force would give the judgment "serious attention".

Chief Constable Matt Jukes said, “The Court of Appeal's ruling helpfully indicates a limited number of policy areas that require this attention.

"Our guidelines have already evolved since the court cases were examined by the courts in 2017 and 2018. We are currently discussing with the Home Office and the CCTV officer what further adjustments we should make and any other actions that should be taken . "

Mr Jukes added: “We are pleased that the court recognised that there was no evidence of bias or discrimination in our use of the technology.

"However, issues of public trust, fairness and transparency are vital and the appeals court understands that further work is needed to ensure that there is no risk of us violating our equality obligations. "

The force does not intend to appeal the judgment.

Ed Bridges said the technology caused him "distress" (PA)
Mr Bridges took his case – believed to be the first in the world concerning police use of such technology – to the Court of Appeal after his claim was dismissed by the High Court.
In the ruling, the judges said the High Court had erred in concluding that the force's interference with Mr Bridges' right to a private life was "in accordance with the law" under human rights legislation.
They found there was no clear guidance on where AFR Locate – the system being trialled by South Wales Police – could be used and who could be put on a watchlist, leaving too much discretion to individual police officers.
The judgment said: “The fundamental deficiencies, as we see them, in the current legal framework relate to two areas of concern.
“The first is what was called the ‘who question’ at the hearing before us. The second is the ‘where question’.
“There is currently too much discretion left to individual police officers on these two issues.
<p>"It is not clear who can be put on the watchlist, nor is it clear that there are criteria to determine where AFR can be used."</p>
The court was satisfied "that the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law".
In their decision, Sir Terence Etherton, Master of the Rolls, Dame Victoria Sharp, President of the Queen's Bench Division, and Lord Justice Singh found that the use of AFR was nevertheless proportionate under human rights law, as the potential benefits outweighed the impact on Mr Bridges.
The court also found that a data protection impact assessment of the system was deficient, and that the force had not done all it reasonably could to verify that the AFR software "does not have an unacceptable bias on grounds of race or sex".
The ruling noted there was no clear evidence that the software was biased on grounds of race or sex.
The judges said they hoped that, "as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias".
At a three-day appeal hearing in June, lawyers for Mr Bridges argued that the facial recognition technology interferes with privacy rights and data protection laws, and is potentially discriminatory.
AFR is used by South Wales Police to capture live facial biometrics of large numbers of people and compare them against those on a "watchlist", which may include suspects, missing people and persons of interest.
The facial biometric data of people whose images are captured on camera but who do not match anyone on the watchlist is not retained.

Facial recognition technology is also used by the Metropolitan Police.

Mr Bridges' case was dismissed at the High Court last September by two senior judges, who concluded that the use of the technology was not unlawful.

Mr Bridges, who the force confirmed was not a person of interest and has never been on a watchlist, crowdfunded his legal action and was supported by the civil rights organisation Liberty, which is campaigning for a ban on the technology.

Megan Goulding, a lawyer for Liberty, said: "This judgment is a major victory in the fight against discriminatory and oppressive facial recognition."
