{"id":6075,"date":"2025-08-08T17:10:00","date_gmt":"2025-08-08T17:10:00","guid":{"rendered":"https:\/\/sawahsolutions.com\/lap\/amnesty-urges-met-to-scrap-expanded-live-facial-recognition-over-racial-bias-and-false-matches\/"},"modified":"2025-08-08T17:11:31","modified_gmt":"2025-08-08T17:11:31","slug":"amnesty-urges-met-to-scrap-expanded-live-facial-recognition-over-racial-bias-and-false-matches","status":"publish","type":"post","link":"https:\/\/sawahsolutions.com\/lap\/amnesty-urges-met-to-scrap-expanded-live-facial-recognition-over-racial-bias-and-false-matches\/","title":{"rendered":"Amnesty urges Met to scrap expanded live facial recognition over racial bias and false matches"},"content":{"rendered":"<p><\/p>\n<div>\n<p>Alba Kapoor of Amnesty International UK has urged the Metropolitan Police to abandon plans to scale up live facial recognition deployments, warning that wider use will entrench racial discrimination and endanger privacy, peaceful assembly and equality. Campaigners point to wrongful stops such as Shaun Thompson\u2019s detention and research from NIST and the Gender Shades project to demand a moratorium, independent audits and stronger legal safeguards.<\/p>\n<\/div>\n<div>\n<p>Alba Kapoor of Amnesty International UK has urged the Metropolitan Police to abandon plans to expand live facial recognition, arguing the technology will further entrench racial discrimination in policing and put basic civil liberties at risk. Writing in The Guardian on 8 August, Kapoor said the systems are already known to misidentify people from marginalised communities and warned that deploying them more widely at events such as Notting Hill Carnival threatens the rights to privacy, peaceful assembly, expression and equality. She called for the Met\u2019s plans to be scrapped immediately.  
<\/p>\n<p>The Met says it intends to increase the number of live facial recognition deployments significantly, from a handful of uses across two days to multiple operations over an extended period, a change force officials attribute to budget cuts and reductions in officer numbers. Police spokespeople argue the technology helps to identify wanted offenders at public events, but campaigners counter that scaling up a system with known error rates risks producing more false matches and more intrusive stops.  <\/p>\n<p>The human cost of those false matches was underscored by recent reporting about Shaun Thompson, a community worker who was wrongly flagged while returning from a volunteering shift. According to the BBC, officers detained and questioned him for some 20 to 30 minutes and requested fingerprints and identity documents before accepting his passport and releasing him; Thompson told the BBC the episode was \u201cintrusive\u201d and that he felt he had been \u201cpresumed guilty.\u201d Such incidents feed wider concerns that biometric tools can translate algorithmic mistakes into real-world harms.  <\/p>\n<p>Technical research provides a clear basis for those concerns. The National Institute of Standards and Technology\u2019s landmark Face Recognition Vendor Test found persistent demographic differentials across roughly 200 algorithms, documenting higher error rates for women and people with darker skin while also noting substantial variation between vendors \u2014 with top-performing systems in some tests approaching parity. Earlier academic work, notably the Gender Shades project led by Joy Buolamwini and Timnit Gebru, showed the same pattern: off\u2011the\u2011shelf systems performed far better on lighter\u2011skinned men than on darker\u2011skinned women, a finding that helped catalyse vendor reassessments and wider debate about dataset representativeness and transparency.  
<\/p>\n<p>Civil society has long warned that technical fixes alone cannot eliminate the human-rights harms of mass biometric surveillance. Amnesty International led a 2021 coalition of more than 170 organisations calling for a global ban on public\u2011space biometric systems, arguing they enable the identification, tracking and singling out of people without consent and that the risks fall disproportionately on marginalised groups. Against that backdrop, critics of the Met say the absence of a clear legal framework or independent oversight leaves decisions about when, where and how to deploy such intrusive tools to police discretion.  <\/p>\n<p>Policymakers now face a choice between imposing strict limits \u2014 including moratoria on public\u2011space deployments, mandatory independent auditing, transparent procurement and stronger data\u2011protection safeguards \u2014 or permitting a continued, ad hoc rollout that campaigners say will reproduce and amplify existing inequalities. The Met insists the technology is a necessary tool for public safety; human\u2011rights groups and technical experts insist its costs are too high without robust regulation, transparency and redress. For now, Amnesty\u2019s intervention adds weight to calls for immediate restraint while lawmakers and regulators consider whether the existing patchwork of rules is fit for purpose.  <\/p>\n<h3>\ud83d\udccc Reference Map:<\/h3>\n<p>Source: <a href=\"https:\/\/www.noahwire.com\" rel=\"nofollow noopener\" target=\"_blank\">Noah Wire Services<\/a><\/p>\n<\/div>\n<div>\n<h3 class=\"mt-0\">Noah Fact Check Pro<\/h3>\n<p class=\"text-sm\">The draft above was created using the information available at the time the story first<br \/>\n        emerged. We\u2019ve since applied our fact-checking process to the final narrative, based on the criteria listed<br \/>\n        below. 
The results are intended to help you assess the credibility of the piece and highlight any areas that may<br \/>\n        warrant further investigation.<\/p>\n<h3 class=\"mt-3 mb-1 font-semibold text-base\">Freshness check<\/h3>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Score:<br \/>\n        <\/span>10<\/p>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Notes:<br \/>\n        <\/span>The narrative is fresh, published on 8 August 2025, with no prior substantially similar content found. The article is based on a press release from Amnesty International UK, which typically warrants a high freshness score.<\/p>\n<h3 class=\"mt-3 mb-1 font-semibold text-base\">Quotes check<\/h3>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Score:<br \/>\n        <\/span>10<\/p>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Notes:<br \/>\n        <\/span>No direct quotes are present in the provided text, indicating potentially original or exclusive content.<\/p>\n<h3 class=\"mt-3 mb-1 font-semibold text-base\">Source reliability<\/h3>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Score:<br \/>\n        <\/span>10<\/p>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Notes:<br \/>\n        <\/span>The narrative originates from The Guardian, a reputable organisation, enhancing its credibility.<\/p>\n<h3 class=\"mt-3 mb-1 font-semibold text-base\">Plausibility check<\/h3>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Score:<br \/>\n        <\/span>10<\/p>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Notes:<br \/>\n        <\/span>The claims align with established research on facial recognition technology&#8217;s biases against people of colour. 
The article references a recent case of misidentification, supporting the plausibility of the narrative.<\/p>\n<h3 class=\"mt-3 mb-1 font-semibold text-base\">Overall assessment<\/h3>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Verdict<\/span> (FAIL, OPEN, PASS): <span class=\"font-bold\">PASS<\/span><\/p>\n<p class=\"text-sm pt-0\"><span class=\"font-bold\">Confidence<\/span> (LOW, MEDIUM, HIGH): <span class=\"font-bold\">HIGH<\/span><\/p>\n<p class=\"text-sm mb-3 pt-0\"><span class=\"font-bold\">Summary:<br \/>\n        <\/span>The narrative is fresh, original, and originates from a reputable source. The claims are plausible and supported by recent events, indicating a high level of credibility.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Alba Kapoor of Amnesty International UK has urged the Metropolitan Police to abandon plans to scale up live facial recognition deployments, warning that wider use will entrench racial discrimination and endanger privacy, peaceful assembly and equality. 
Campaigners point to wrongful stops such as Shaun Thompson\u2019s detention and research from NIST and the Gender Shades project<\/p>\n","protected":false},"author":1,"featured_media":6076,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[40],"tags":[],"class_list":{"0":"post-6075","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-london-news"},"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/posts\/6075","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/comments?post=6075"}],"version-history":[{"count":1,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/posts\/6075\/revisions"}],"predecessor-version":[{"id":6077,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/posts\/6075\/revisions\/6077"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/media\/6076"}],"wp:attachment":[{"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/media?parent=6075"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/categories?post=6075"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sawahsolutions.com\/lap\/wp-json\/wp\/v2\/tags?post=6075"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}