Instagram Introduces Automatic Image Blurring Feature
Meta says new AI-driven tools will blur nude images sent to teenagers in a bid to combat sextortion scams.
Meta, the parent company of Instagram, has revealed plans to implement new measures aimed at shielding adolescents from potential harm on its platform.[1]
The move comes amid growing criticism, particularly from US lawmakers, over social media's impact on the mental well-being of young people. Instagram in particular has faced scrutiny over its alleged role in facilitating harmful behavior among teenagers.
In an official statement, Meta announced the development of AI-driven tools designed to combat sextortion scams targeting minors on Instagram's messaging service. Among these is an automatic nudity-protection feature that will identify and blur images containing nudity sent to users under the age of 18.
Blurred images will also be accompanied by messages offering advice and safety tips to both the sender and the recipient.
The company had previously pledged to strengthen protections for users under 18 following legal action from numerous US states accusing Meta of profiting from children's distress.