
Apple suspends phone scanning plan for child abuse

US technology company Apple has suspended its plan to use artificial intelligence to scan iPhones for images of child sexual abuse, following widespread criticism.

In a statement on its website, Apple said it has delayed the rollout of features it developed to prevent the sexual abuse of children via its communication tools and to curb the spread of such material, citing feedback from customers, researchers, and advocacy organizations.

The statement emphasized that the system is not entirely off the table; rather, its launch has been postponed so it can be improved in response to the criticism. However, no timeline was given for when the features might be activated.

WHY THE PLAN SPARKED CONTROVERSY

Apple’s announcement last month that it would scan users’ iCloud content for images and videos of child abuse drew controversy because critics saw it as contradicting the company’s own privacy policies.

Apple had said that in the first phase of its three-stage measures against child abuse, artificial intelligence would scan only for sensitive content known as CSAM (Child Sexual Abuse Material), without viewing users’ private correspondence.

The system in question would flag suspected child abuse images for review by human moderators, who could then alert law enforcement when necessary.
