
Apple is said to be scanning images stored on iPhones and iCloud for images of child abuse



Apple plans to scan photos stored on iPhones and iCloud for images of child abuse, according to the Financial Times. The new system could help law enforcement in criminal investigations, but it could also open the door to increased legal and governmental demands for user data.

The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected,” and those reviewers will then contact police if the material can be verified, the Financial Times reports. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will be rolled out first in the United States. Photos will be hashed and compared against a database of known images of child sexual abuse.
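As described, the matching step amounts to hashing each photo and checking that hash against a database of known material. The Python fragment below is a minimal, purely illustrative sketch of that idea; the toy average-hash and the function names are stand-ins, not Apple’s actual neuralMatch model or API.

from typing import List, Set

def average_hash(pixels: List[List[int]]) -> int:
    # Toy perceptual hash: one bit per pixel of a small grayscale thumbnail,
    # set to 1 when the pixel is brighter than the image mean. Visually
    # similar images tend to produce identical or near-identical bit patterns.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_known_database(pixels: List[List[int]], known_hashes: Set[int]) -> bool:
    # Flag the image when its hash appears in the database of known images.
    return average_hash(pixels) in known_hashes

A production system would use a learned, transformation-robust hash and a far larger database; the sketch only shows the flag-on-match control flow the article describes.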

“According to people briefed on the plans, every photo uploaded to iCloud in the US will receive a ‘safety voucher’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
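The quoted passage describes a threshold rule: individual matches are not acted on, but once an account accumulates enough of them, the suspect photos become reviewable. Below is a sketch of that behavior assuming a simple per-account counter; the class, the counter, and the threshold value are all hypothetical, since the real scheme reportedly uses cryptographic vouchers and the report gives no number.

from collections import defaultdict
from typing import Dict

REVIEW_THRESHOLD = 10  # hypothetical value; the report does not specify one

class VoucherLedger:
    def __init__(self) -> None:
        self.suspect_counts: Dict[str, int] = defaultdict(int)

    def record_upload(self, account_id: str, is_suspect: bool) -> bool:
        # Record one upload's verdict and report whether the account has now
        # crossed the threshold that would trigger decryption and review.
        if is_suspect:
            self.suspect_counts[account_id] += 1
        return self.suspect_counts[account_id] >= REVIEW_THRESHOLD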

Matthew Green, a cryptographer and professor at Johns Hopkins University, raised concerns about the system on Twitter on Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”

“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

Apple already checks iCloud files against known child abuse imagery, like every other major cloud provider. But the system described here would go further, giving the company centralized access to local storage. It would also be trivial to extend the system to crimes other than child abuse, a particular concern given Apple’s extensive business in China.

Apple briefed some US academics on the system this week and may share more “as soon as this week,” according to two security researchers who attended the company’s earlier meeting, the Financial Times reports.

Apple has previously touted the privacy protections built into its devices, and famously stood up to the FBI when the agency wanted Apple to build a backdoor into iOS to access an iPhone used by one of the shooters in the 2015 San Bernardino attack. The company did not respond to a request for comment on the Financial Times report.



