Resisting the rise of facial recognition

Europe’s data-protection rules allow police to process biometric data where it is necessary and subject to appropriate safeguards. A key question here, says Fussey, is whether it would be proportionate to, for example, put tens of thousands of people under video surveillance to catch a criminal.

So far, British judges have suggested that it might be, but only if police use of the technology is subject to tighter controls. Last year, a man named Ed Bridges sued the police in South Wales, alleging that his right to privacy had been breached when live facial-recognition cameras scanned him on two occasions in Cardiff, UK, as police searched crowds for people on a watch list. In August, a UK court ruled that these deployments were unlawful: the police lacked clear guidance and rules on when they could use the system and on who could be placed in their database, and they had not sufficiently checked the software for racial or gender bias. But the judges did not agree that the cameras had breached Bridges’ privacy rights: the interference was ‘proportionate’, they said.

The EU is considering an AI framework that could set rules for biometrics. This year, a white paper that served as a prelude to proposed legislation suggested that special rules might be needed for ‘high-risk’ AI, a category that would include facial recognition. Most of the people and firms who responded to the public consultation that followed felt that further regulation was needed before facial-recognition technology (FRT) could be used in public spaces.

Ultimately, the people affected by FRT need to discuss what they find acceptable, says Aidan Peppin, a social scientist at the Ada Lovelace Institute. This year, he has been helping to run a citizens’ biometrics council: a series of in-depth workshops with around 60 people across the UK. Participants’ views on biometrics will inform a UK review of legislation in the area. “The public voice needs to be front and centre in this debate,” he says.
