Apple is determined to remove any material related to child abuse from its ecosystem. These efforts have already led to the detection and prosecution of a user who kept more than 2,000 images and videos in his iCloud account.
Although the topic is only now receiving wider attention, because the system will be included in iOS 15 and iPadOS 15, Apple was already using it in iCloud Mail.
Many Apple fans were annoyed by the company's plan to start scanning photo uploads from iPhones and iPads running iOS 15 for child sexual abuse material (CSAM). However, Apple was already doing this in iCloud, specifically in its email service.
The company already had a system in place to detect child pornography in iCloud Mail, although it did not scan iCloud Photos or iCloud Drive.
In fact, it was precisely through this active scanning that the company discovered a doctor who kept more than 2,000 images and videos in his iCloud account.
When did Apple start using the CSAM detection system?
Even though the company has only now explained in more detail the system that will monitor image uploads from the iPhone and iPad, Apple has been using this kind of detection for some time.
According to what was revealed at CES in January 2020, the Child Sexual Abuse Material (CSAM) detection system began to be used in iCloud Mail in 2019. The Cupertino giant says that accounts containing CSAM content violate its terms and conditions and will be disabled.
After the recent announcement, a chorus of voices spoke out against the new system. Even so, those responsible for the screening fully intend for it to go ahead.
Does the system preserve user privacy?
Apple insists that all users will retain complete privacy and security. The new Child Sexual Abuse Material (CSAM) detection system uses machine learning to analyze image attachments and determine whether a photo is sexually explicit.
The functionality is designed so that Apple does not have access to the images themselves.
Despite Apple's explanation, privacy advocates dislike the system, and some of Apple's own employees have expressed concerns about it.
Rights organizations have urged Tim Cook to kill the project before it even starts to reach iPhones and iPads in the United States. However, CSAM monitoring is nothing new at Apple.
Like spam filters in email, our system uses electronic signatures to find suspected child exploitation. We validate each match with individual review.
Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.
So stated the Cupertino company.
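Apple has not published how this matching works internally, but the mechanism it describes (electronic signatures checked against a database of known material, with matches escalated to human review) can be sketched roughly as follows. This is an illustrative sketch only: the SHA-256 digest stands in for the perceptual-style hash Apple is believed to use, and the signature list and function names are invented for the example.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the signature database; the real list comes from
// child-safety organizations and is not public.
let knownSignatures: Set<String> = [
    "placeholder-signature-value" // not a real signature
]

/// Stand-in "signature": a SHA-256 digest of the raw bytes. The production
/// system is believed to use perceptual hashing instead, which also matches
/// re-encoded or resized copies of the same image.
func signature(for attachment: Data) -> String {
    SHA256.hash(data: attachment)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// An attachment is flagged only if its signature appears in the known
/// database; flagged items would then go to individual human review.
func isSuspect(_ attachment: Data) -> Bool {
    knownSignatures.contains(signature(for: attachment))
}
```

The key design point in Apple's description is that the check compares signatures rather than inspecting image content directly, which is how the company argues that ordinary users' photos are never reviewed.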
Apple expands CSAM tracking
The reasons behind Apple's decision to expand CSAM scanning are still not completely clear. But according to a 2020 exchange between Apple employees, uncovered as part of the ongoing legal battle between Apple and Epic Games, anti-fraud chief Eric Friedman described Apple as "the biggest platform for the distribution of child pornography".
Despite this claim, the total number of CSAM cases Apple discovers in iCloud Mail each year is believed to be "measured in the hundreds". That does not seem especially significant (although even one is unacceptable) considering that billions of Apple devices are in use worldwide.
The expansion may have something to do with the fact that some of Apple's rivals, including Microsoft, also scan for CSAM content. Apple may have felt it would look bad if it did not move forward as other platforms are doing.