Apple on Thursday unveiled new tools to better identify sexually explicit images of children on its iPhones, iPads and iCloud servers in the United States, drawing concern from internet privacy advocates.
“We want to help protect children against predators who use communication tools to recruit and exploit them, and limit the dissemination of child pornography,” the company said on its website.
To do this, Apple plans to use cryptographic tools to compare photos uploaded to its iCloud server with those stored in a file maintained by the National Center for Missing and Exploited Children (NCMEC).
Apple says it does not have direct access to the images.
When a photo matches one in the database, Apple will manually review it, disable the user’s account if necessary, and send a report to the Center.
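The matching described above works on image fingerprints rather than the images themselves. Apple’s actual system relies on a proprietary perceptual hash (“NeuralHash”) combined with cryptographic private set intersection, none of which is public; the following is only a simplified sketch of the general idea, comparing fingerprints against a database of known hashes by Hamming distance. The hash values, threshold, and function names are hypothetical.

```python
# Simplified illustration of fingerprint matching against a known-image
# database. Apple's real system uses a proprietary perceptual hash and
# cryptographic protocols; everything here is a hypothetical stand-in.

KNOWN_HASHES = {0b1011_0110_1100_0011, 0b0110_1001_0011_1100}  # example 16-bit fingerprints
MATCH_THRESHOLD = 2  # max number of differing bits still counted as a match (assumed)

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where the two fingerprints differ."""
    return bin(a ^ b).count("1")

def matches_known_image(fingerprint: int) -> bool:
    """True if the fingerprint is within the threshold of any known hash."""
    return any(hamming_distance(fingerprint, h) <= MATCH_THRESHOLD
               for h in KNOWN_HASHES)

# A fingerprint one bit away from a known hash still matches,
# which is the point of perceptual (rather than exact) hashing:
print(matches_known_image(0b1011_0110_1100_0111))  # near-duplicate -> True
print(matches_known_image(0b0000_0000_0000_0000))  # unrelated image -> False
```

Unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce nearby fingerprints, which is why a small-distance comparison, not exact equality, is used.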
Apple also plans to scan images sent or received via the iMessage messaging service on children’s accounts linked to a family plan.
When explicit photos are detected, they will be blurred and the child will be shown warning messages before the photo can be opened or sent.
Parents can opt to receive a notification when their child receives or sends such photos.
The voice assistant Siri will also be trained to “intervene” when users search for child pornography, warning them that such content is problematic.
These tools will be phased in with upcoming operating system updates for the iPhone, iPad, Apple Watch and Mac in the United States.
These changes “mark a significant departure from long-established privacy and security protocols,” said the Center for Democracy and Technology (CDT).
“Apple is replacing its end-to-end encrypted messaging system with a surveillance and censorship infrastructure, which will be vulnerable to abuse and misuse not only in the United States, but around the world,” said Greg Nojeim of the CDT in a message sent to AFP.
Apple should, in his view, “abandon these changes and restore the confidence of its users in the security and integrity of their data stored on Apple devices and services.”
The computer giant has a reputation for defending its customers’ privacy against pressure from authorities seeking access to user data in the name of fighting crime or terrorism.
“The exploitation of children is a serious problem, and Apple is not the first technology company to change its position on the protection of privacy in an attempt to combat it,” said India McKinney and Erica Portnoy of the Electronic Frontier Foundation (EFF), an NGO that defends freedoms on the internet.
But even when developed with the best of intentions, a system designed to detect child pornography “opens the door to other abuses,” they warned in a blog post.
Apple would only need to tweak its settings slightly to search for other types of content, or to scan not just children’s accounts but everyone’s, they argue.