The scan for nude photos works on devices of underage users that are managed by a parent. The feature is turned off by default; parents who want it must enable the scan themselves. From then on, iMessage (the Messages app) will scan for nude photos on the child’s iPhone, covering both photos the child receives and photos the child wants to send.
The scan is performed exclusively on the iPhone itself, within iMessage’s end-to-end encryption; Apple has no insight into, or influence over, the process. Received photos that are flagged are blurred, but the child can still choose to view them. Detected nude photos can also still be sent.
However, children are first shown an explanation that covers, for example, the harmful consequences of forwarding nude images. Parents are not notified when the child views or sends such photos, but children do get a quick way to contact their parents if they want to talk.
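Apple has not published the code behind this part of Messages, but it has since documented a public framework, SensitiveContentAnalysis (iOS 17 and later), that exposes the same kind of on-device nudity check to third-party apps. The sketch below is purely illustrative of that pattern, not Apple’s actual implementation: an incoming photo is analyzed locally, blurred if flagged, and the original is kept so the user can still choose to reveal it. The class and function names are invented for this example.

```swift
import UIKit
import CoreImage
import SensitiveContentAnalysis

// Illustrative sketch only: Apple's Communication Safety code is not public.
// SensitiveContentAnalysis runs entirely on the device; no image data is sent off it.
final class IncomingImageScreener {
    private let analyzer = SCSensitivityAnalyzer()

    /// Returns the image to display (blurred if the on-device model flags it)
    /// plus a flag so the UI can offer a "view anyway" option.
    func prepareForDisplay(_ image: UIImage) async -> (display: UIImage, flagged: Bool) {
        // The check only runs if the user (or a parent) has enabled it in Settings.
        guard analyzer.analysisPolicy != .disabled, let cgImage = image.cgImage else {
            return (image, false)
        }
        do {
            let analysis = try await analyzer.analyzeImage(cgImage)
            if analysis.isSensitive {
                return (blurred(image), true)   // blur for display, keep the original around
            }
        } catch {
            // If analysis fails, fall back to showing the photo unmodified.
        }
        return (image, false)
    }

    /// Simple Core Image Gaussian blur used as the obscuring overlay.
    private func blurred(_ image: UIImage) -> UIImage {
        guard let input = CIImage(image: image),
              let filter = CIFilter(name: "CIGaussianBlur") else { return image }
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(30.0, forKey: kCIInputRadiusKey)
        let context = CIContext()
        guard let output = filter.outputImage,
              let cg = context.createCGImage(output, from: input.extent) else { return image }
        return UIImage(cgImage: cg)
    }
}
```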
Controversial photo library scan scrapped
The feature is being rolled out in the Netherlands as of today. The scan was already made available in the US at the end of 2021, followed by other English-speaking countries; as of today, the feature is available in fifteen countries. The rollout is gradual, so it may not yet be available to all Dutch users.
Earlier, Apple also announced a scanning feature that went further, in which iPhones would automatically scan each user’s photo library for child abuse images. That function drew so much criticism, both externally and internally, that it was postponed. Apple has since scrapped the controversial feature altogether.