A decision by the Orlando, Florida Police Department appears to be a victory for privacy advocates. The city announced yesterday that police have stopped using Amazon's facial recognition program after 15 months of testing. Due to bandwidth problems, the program never went live.
Orlando says it will stop testing Amazon's facial-recognition tool. The city wanted Rekognition to alert if anyone on a police watchlist passed one of their surveillance cameras, but they never made "any noticeable progress" https://t.co/SUJxJZk3W2
— Drew Harwell (@drewharwell) July 18, 2019
The program was designed to automatically identify and track suspects in real time by scanning public spaces with facial recognition software. The ACLU was among those who criticized the program, but Mayor Buddy Dyer says the decision not to pursue it was based on a lack of resources.
“At this time, the city was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing,” Orlando’s Chief Administrative Office said in a memo to City Council, adding that the city has “no immediate plans regarding future pilots to explore this type of facial recognition technology.”
The American Civil Liberties Union brought Orlando’s pilot with Amazon to light in May 2018, triggering nationwide pushback from civil rights groups concerned about privacy breaches, the technology’s potential for abuse and its tendency to misidentify dark-skinned women more frequently than white men.
There is also lingering uncertainty over whether the controversial face-scanning technology "actually works."