The American Civil Liberties Union tested Amazon's face recognition system – and the results were not good. To test the system's accuracy, the ACLU scanned the faces of all 535 members of Congress against 25,000 public mugshots, using Amazon's open Rekognition API. None of the members of Congress were in the mugshot set, but Amazon's system generated 28 false matches, a result the ACLU says raises serious concerns about Rekognition's use by police.
"An identification – whether accurate or not – can cost people their freedom or even their lives," the group said in an accompanying statement. "Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition."
Reached by The Verge, an Amazon spokesperson attributed the results to poor calibration. The ACLU's tests were performed using Rekognition's default confidence threshold of 80 percent.
"While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases," the spokesperson said, "it would not be appropriate for identifying individuals with a reasonable level of certainty." Rekognition does not enforce that recommendation during setup, however, and nothing prevents law enforcement agencies from using the default setting.
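In practice, the confidence threshold is just a numeric cutoff applied to per-candidate similarity scores, so the choice of threshold directly controls how many candidates count as "matches." A minimal sketch of the effect, using invented scores and 99 percent as an illustrative stricter cutoff (only the 80 percent default comes from this article):

```python
# Hypothetical similarity scores (0-100) returned for one probe face
# against a mugshot database. All figures here are invented.
candidate_scores = [81.5, 92.3, 99.4, 85.0, 78.2]

DEFAULT_THRESHOLD = 80.0      # Rekognition's default, per the article
STRICT_THRESHOLD = 99.0       # illustrative stricter cutoff, not from the article

# A candidate is reported as a match only if its score clears the threshold.
matches_default = [s for s in candidate_scores if s >= DEFAULT_THRESHOLD]
matches_strict = [s for s in candidate_scores if s >= STRICT_THRESHOLD]

print(f"matches at 80%: {len(matches_default)}")  # prints 4
print(f"matches at 99%: {len(matches_strict)}")   # prints 1
```

The same probe yields four "matches" at the default setting but only one under the stricter cutoff, which is why the default matters so much for law enforcement deployments.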
Amazon's Rekognition came to prominence in May, when an ACLU report showed that the system was being used by a number of law enforcement agencies, including a real-time recognition pilot by the Orlando Police Department. Sold as part of Amazon's Web Services cloud offering, the software is extremely cheap, often costing less than $12 a month for an entire department. The Orlando pilot has since expired, although the department continues to use the system.
The ACLU's latest experiment was designed with a particular eye toward Rekognition's partnership with the Washington County Sheriff's Office in Oregon, where images are compared against a database of as many as 300,000 mugshots.
"It's not hypothetical," said Jacob Snow, who organized the test for the ACLU of Northern California. "This is a situation where Rekognition is already in use."
The test also showed signs of racial bias, a long-standing problem for many face recognition systems. Eleven of the 28 false matches misidentified people of color (roughly 39 percent), including civil rights leader Rep. John Lewis (D-GA) and five other members of the Congressional Black Caucus. Only twenty percent of current members of Congress are people of color, which indicates that false matches affected members of color at a significantly higher rate. The finding echoes disparities found by NIST's Facial Recognition Vendor Test, which has consistently found higher error rates in face recognition tests on women and African Americans.
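The disparity is easy to check against the figures reported above:

```python
# Figures from the ACLU test as reported in this article.
total_false_matches = 28
false_matches_people_of_color = 11

# Share of false matches that hit members of color, versus their
# share of Congress overall (roughly twenty percent).
poc_share_of_false_matches = false_matches_people_of_color / total_false_matches
poc_share_of_congress = 0.20

print(f"{poc_share_of_false_matches:.1%} of false matches were members of color, "
      f"versus {poc_share_of_congress:.0%} of Congress overall")
# prints: 39.3% of false matches were members of color, versus 20% of Congress overall
```

Members of color were nearly twice as likely to be falsely matched as their share of Congress would predict.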
Running faces against a database with no true matches may seem like a recipe for errors, but it mirrors the conditions that deployed face recognition systems encounter every day. The system used by London's Metropolitan Police produces as many as 49 false matches for every hit, requiring officers to sort through the false positives manually. More significant is how elevated the false positives were in the Rekognition test, with more than five percent of the subject group triggering a false match of some kind.
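That five-percent figure follows directly from the test's numbers:

```python
# 28 false matches out of 535 members of Congress scanned,
# per the figures reported in this article.
members_scanned = 535
false_matches = 28

false_positive_rate = false_matches / members_scanned
print(f"false-positive rate: {false_positive_rate:.1%}")
# prints: false-positive rate: 5.2%
```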
In practice, most face recognition identifications will be confirmed by a human before they lead to anything as concrete as an arrest – but critics say that even checking a person's identity can cause harm. "Imagine a police officer gets a false match for someone with a concealed-weapons arrest," Snow says. "That's a real danger if this information is relayed to the officer during a stop. It's not hard to imagine the situation escalating."
The test also raises concerns about how easily Rekognition can be deployed without oversight. All of the ACLU's data was collected from publicly available sources, including the 25,000 mugshots. (The organization declined to name the specific source out of privacy concerns, but many states treat mugshots as public records.) Amazon's system is also significantly cheaper than non-cloud-based offerings, and the ACLU was charged only $12.33 for its tests.