
Users will soon be able to report nude photos to Apple

Apple is adding a reporting function for nude content to its end-to-end encrypted messenger iMessage. Starting with iOS 18.2, the first users will be able to report photos and videos that the operating system has locally recognized as nude content directly to Apple, as the Guardian reports. The reporting option is part of the Communication Safety feature built into all Apple operating systems, which is now enabled by default for children under 13. Parents can also set up the function on teenagers' devices and optionally activate it for themselves, or turn it off.


If the system detects a nude photo received via iMessage, the image is automatically blurred and the recipient is warned about the sensitive content. A new option at this point will allow users to forward the received material to Apple. In addition to the photos or videos themselves, a "limited" amount of the surrounding text messages and the reporting user's name or account are also sent to the company, according to an iPhone notification dialog published by the Guardian. Apple will review the content and may take action, it says. This can include blocking iMessage accounts and, if necessary, informing law enforcement authorities.
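The on-device nudity detection that Communication Safety relies on is also exposed to third-party developers through Apple's SensitiveContentAnalysis framework (iOS 17 and later, gated behind an entitlement). The following sketch only illustrates that local detection step; it is not Apple's internal iMessage reporting pipeline, and the function name and image URL are assumptions for the example.

```swift
import SensitiveContentAnalysis

// Minimal sketch, assuming an app has the SensitiveContentAnalysis
// entitlement: check a received image with Apple's on-device detector.
// This mirrors the "locally recognized as nude content" step described
// above; reporting to Apple is not part of this public API.
func checkReceivedImage(at url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs if the user (or a parent, via Communication
    // Safety) has enabled sensitive content detection on the device.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is turned off on this device.")
        return
    }

    do {
        // The model runs entirely on-device; no image data leaves the phone here.
        let analysis = try await analyzer.analyzeImage(at: url)
        if analysis.isSensitive {
            // An app would typically blur the image and warn the user,
            // similar to the iMessage behavior described in the article.
            print("Image flagged as sensitive: blur and warn before showing it.")
        } else {
            print("Image not flagged; display normally.")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```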

The new feature will first be rolled out in Australia, because new regulatory requirements for messaging and cloud services will soon take effect there, the Guardian notes. A global rollout of the reporting function is planned, however.

Apple originally opposed the bill in Australia (and similar legislation in other countries) on the grounds that it threatened the confidentiality of end-to-end encrypted communications. The law as passed now gives providers more leeway in how they report illegal content, without requiring a backdoor in the encryption.

To better combat child sexual abuse material (CSAM), Apple considered a few years ago scanning iCloud photos locally on the iPhone and automatically reporting any CSAM found to the company in the background. After massive criticism from customers, security researchers and civil rights activists, the company abandoned the project.

The child protection functions planned in parallel, such as the nude filter, were revised and eventually integrated into the operating systems. Users can still tap through the blurred images and view them anyway. From iOS 18 onwards, doing so on devices used by children under 13 also requires entering the Screen Time passcode, which ideally only the parents know.

Apple has recently been accused from various quarters of not doing enough to combat CSAM, particularly in iCloud. According to the accusations, abusive material is also being distributed there via shared albums. A US class action lawsuit accuses the company of ignoring such material.

