
Facial Recognition Software Incorrectly Identifies 28 Lawmakers As Crime Suspects: NPR




The American Civil Liberties Union says Amazon Rekognition, facial recognition software sold online, inaccurately identified legislators and endangers civil rights, charges that Amazon denies.

Leon Neal / AFP / Getty Images

Facial recognition software sold by Amazon wrongly identified 28 members of Congress as people who have been arrested for crimes, the American Civil Liberties Union announced on Thursday.

Amazon Rekognition has been marketed as a tool that provides extremely accurate facial analysis of images and video.

The ACLU tested that claim by using the software to compare images of every current member of the House and Senate against a database the group built from thousands of publicly available arrest photos.
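The article does not include the ACLU's code, but the comparison it describes maps onto Rekognition's face-search API. The sketch below, using the boto3 SDK, shows roughly how such a test could be set up: arrest photos are indexed into a face collection, then the collection is searched with a legislator's photo. The collection name, S3 bucket, and image keys are hypothetical placeholders, not details from the ACLU's test.

```python
# Minimal sketch (not the ACLU's actual code) of a Rekognition face-search test.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "mugshot-collection"  # hypothetical collection of arrest photos


def build_collection(bucket, mugshot_keys):
    """Create a face collection and index each publicly available arrest photo."""
    rekognition.create_collection(CollectionId=COLLECTION_ID)
    for key in mugshot_keys:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"S3Object": {"Bucket": bucket, "Name": key}},
            ExternalImageId=key.replace("/", "_"),  # label used to identify a match later
        )


def search_member_photo(bucket, member_photo_key, threshold=80):
    """Search the mugshot collection for faces similar to a legislator's photo.

    threshold=80 mirrors the service's default confidence setting, which the
    ACLU says it used; the value is a configuration choice in any deployment.
    """
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": bucket, "Name": member_photo_key}},
        FaceMatchThreshold=threshold,
        MaxFaces=5,
    )
    # Each match carries a Similarity score and the ExternalImageId set at indexing time.
    return [(m["Face"]["ExternalImageId"], m["Similarity"]) for m in response["FaceMatches"]]
```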

"The Congressmen who were falsely adapted to the mugshot database we used in the test include Republicans and Democrats, Men and Women, and Legislators of all ages, from across the country." The ACLU declared.

The test misidentified people of color at a disproportionately high rate: they accounted for 39 percent of the false matches, despite making up only 20 percent of Congress. Among the members falsely matched as crime suspects was Rep. John Lewis, D-Ga., who first came to prominence as a civil rights leader.

As part of the test, the ACLU said it used Amazon's default settings for face matches. But a spokeswoman for Amazon Web Services said in an email that the ACLU should have changed those settings and used a higher "threshold," the confidence percentage that measures how certain Rekognition is that it has found a match.

"While 80% trust is an acceptable limit for images of sausages, chairs, animals or other social media use cases, it would not be appropriate to identify people with a reasonable degree of security," she said. For law enforcement, Amazon "guides customers" to set the threshold to 95 percent or higher.

Jacob Snow, an attorney with the ACLU of Northern California, responded to that comment in an emailed statement: "We know from our test that Amazon does nothing to ask users what they are using Rekognition for," he said.

Snow does not believe that changing the threshold changes the danger: "Surveillance technology in the government's hands is primed for abuse and raises serious civil rights concerns."

Outcry from privacy and civil rights groups has not stopped law enforcement from pursuing the technology. The Orlando, Fla., police force has tested Rekognition for real-time surveillance. The Washington County Sheriff's Office, near Portland, Ore., has used it to search for faces in photos of suspects taken by deputies.

"This is in part a result of vendors pushing face recognition technology because there will be another avenue of revenue," said Jeramie Scott, National Security Advisor at the Electronic Privacy Information Center in Washington, DC, to NPR . He compared facial recognition software to body cameras worn by the police, which could be used for police or increasingly public monitoring.

He stressed the need for debate so that the technology does not become a bad solution driven by bad policy: "Because of the disproportionate error rate and because of the real risk of depriving people of their civil liberties through facial recognition technology, we need to have a conversation about how and when and under what circumstances this technology will be used by law enforcement, if at all."

