Instagram is preparing changes to how it handles nude images and videos in order to combat sexual exploitation, including a feature that will automatically blur nudity in direct messages.
Specifically, the company said on Thursday that it is testing the features as part of its campaign against sexual extortion and other forms of "image abuse," as well as to make it harder for criminals to contact teenagers, the Guardian reports.
Sexual extortion involves coaxing someone into sharing intimate photos and then threatening to release the material if the victim does not pay money or perform sexual favors.
Scammers often use direct messages to solicit intimate images. To combat this, Instagram will soon begin testing a feature in DMs that will blur any images containing nudity "and encourage people to think twice before sending nudity."
"The feature is designed not only to protect people from seeing unwanted nudity in their private messages, but also to protect them from scammers who might send nude images to trick people into sending their own images in return," Instagram said.
The feature will be enabled by default globally for users under 18, while adult users will receive a notification encouraging them to turn it on. Nude images will be blurred behind a warning screen, giving recipients the option to view them; they will also be able to block the sender and report the conversation.
People sending nude images will see a message reminding them to be careful when sharing "sensitive photos." They will also be told that they can unsend the photos if they change their mind, but that there is a chance others may have already seen them.
Meta also owns Facebook and WhatsApp, but the nudity-blurring feature will not be added to messages sent on those platforms.