Meta, the parent company of Instagram, announced on Thursday that it will pilot new features designed to blur images containing nudity in direct messages, aiming to protect teenagers and deter scammers.
This initiative arises as Meta faces increasing scrutiny in the U.S. and Europe, where it has been criticized for its apps’ addictiveness and their potential role in exacerbating mental health issues among youth.
The new safeguard on Instagram will utilize on-device machine learning to assess if incoming images in direct messages contain nudity. This feature will automatically activate for users under 18, and Meta plans to prompt adults to enable it as well.
“Because the images are analyzed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta won’t have access to these images – unless someone chooses to report them to us,” Meta explained.
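To illustrate the idea in the announcement, here is a minimal, purely conceptual sketch (not Meta's implementation) of how an on-device check could blur a flagged image before it is ever displayed; the classifier, threshold, and function names are all hypothetical stand-ins.

```python
# Conceptual sketch only: run a local classifier on an incoming DM image and
# blur it on-device if it is flagged, so the plaintext image never has to
# leave the phone -- which is why such a check can work even in
# end-to-end encrypted chats.
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.8  # hypothetical decision threshold


def nudity_score(image: Image.Image) -> float:
    """Stand-in for an on-device ML model that returns the probability
    that the image contains nudity; a real model would ship with the app."""
    raise NotImplementedError("placeholder for an on-device classifier")


def prepare_incoming_image(path: str) -> Image.Image:
    image = Image.open(path)
    try:
        score = nudity_score(image)
    except NotImplementedError:
        return image  # no model available in this sketch
    if score >= NUDITY_THRESHOLD:
        # Heavy blur; the recipient could still choose to reveal the original.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```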
Unlike Meta’s other services such as Messenger and WhatsApp, Instagram’s direct messages are not yet end-to-end encrypted, though the company has said it plans to bring encryption to them in the future.
Additionally, Meta is working on technologies to detect accounts potentially involved in sextortion scams and is testing alerts for users who might have interacted with these suspect accounts.
In January, Meta committed to hiding more sensitive content from young users on Facebook and Instagram, making it harder for teens to come across posts about suicide, self-harm, and eating disorders.