Apple has paused its plan to use artificial intelligence to scan iPhones for images related to child abuse, amid criticism.
Apple recently announced that it intended to install software on the iPhone models it produces to scan images for the detection of child abuse.
According to Apple's statements, the system would scan photos uploaded to iCloud, detect content related to abuse, and report it to law enforcement.
The software, which would review every photo uploaded to iCloud, drew a backlash from many people who saw it as an invasion of privacy.
PUT ON HOLD
In a statement on its website, Apple said it has postponed the features it developed to prevent the sexual abuse of children through communication tools and the spread of such material, citing feedback from customers, researchers, and advocacy organizations.
The statement emphasized that the system has not been taken off the table entirely; rather, its launch has been postponed so it can be improved in response to the criticism.
HOW THE SYSTEM WILL WORK
Law enforcement agencies maintain archives in which known images of child sexual abuse have been converted into digital codes. By comparing these codes, it is possible to determine whether any given photo matches an image in the archive.
Apple likewise planned to detect potential child abuse photos on iPhones using a similar code-matching system called NeuralHash.
As soon as a user uploaded a photo to Apple's storage service iCloud, the photo would be converted into a code and compared against those in the archive. A match could reveal abusive content without anyone viewing the photo itself, and the match would then be reported to law enforcement.
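Conceptually, the matching step is a set-membership test on image codes rather than an inspection of the images themselves. The sketch below is a minimal illustration, not Apple's implementation: NeuralHash is a proprietary perceptual hash, so an ordinary cryptographic hash stands in for it here, and the archive entry and function names are hypothetical.

```python
import hashlib

# Hypothetical archive of codes for known abuse images. In the real
# system, these would be perceptual hashes supplied by child-safety
# organizations, not values anyone could reconstruct from this example.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_code(image_bytes: bytes) -> str:
    # Placeholder: SHA-256 stands in for NeuralHash. A real perceptual
    # hash would map visually similar images (resized, re-encoded,
    # lightly edited) to the same code; a cryptographic hash does not.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_archive(image_bytes: bytes) -> bool:
    # Compare the photo's code against the archive. Only the code is
    # examined, so the photo's content is never viewed directly.
    return image_code(image_bytes) in KNOWN_HASHES
```

The privacy argument rests on this design choice: only codes are compared, so no human or system needs to look at the photo unless its code matches the archive.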
The feature was expected to launch in some countries with iOS 15, but following Apple's postponement decision it is not known when it will be activated.