Instagram Protects Young People: Testing Feature to Blur Messages Containing Nudity
Instagram has begun testing a new feature that automatically blurs direct messages containing nudity, both to protect its young users and to deter sextortion attempts. The move aims to improve safety on the app and strengthen efforts to combat harmful content.
Instagram continues to take new measures to keep its users safe and to limit harmful effects, especially on young people. The popular social media platform, owned by Meta, is working on a feature that automatically blurs messages containing nudity.
This feature was developed to filter harmful content and protect young people from potential threats.
How Does Nudity Detection and Blurring Work?
This new feature will use machine learning to analyze images sent in direct messages. If an image is found to contain nudity, it will be blurred automatically, preventing young users from being unexpectedly exposed to such content.
Meta will enable this feature by default for users under 18 and encourage adult users to enable it as well.
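Meta has not published technical details, so the flow it describes can only be illustrated with a rough sketch. Everything below is an assumption for illustration: the classifier, the score threshold, and the settings names are invented, and the stub returns a fixed score rather than running a real model.

```python
from dataclasses import dataclass
from typing import Optional

from PIL import Image, ImageFilter

# Illustrative threshold; Meta has not disclosed how its model scores images.
NUDITY_THRESHOLD = 0.8

def nudity_score(image: Image.Image) -> float:
    """Stub for the on-device classifier. A real client would run a
    compact vision model locally and return a probability of nudity."""
    return 0.0  # placeholder: always "safe" in this sketch

@dataclass
class UserSettings:
    age: int
    nudity_filter_enabled: Optional[bool] = None  # None = user never changed it

    @property
    def filter_active(self) -> bool:
        # Per Meta's description: on by default for under-18s,
        # opt-in (but encouraged) for adults.
        if self.nudity_filter_enabled is None:
            return self.age < 18
        return self.nudity_filter_enabled

def prepare_incoming_image(image: Image.Image, user: UserSettings) -> Image.Image:
    """Blur the image before display if the filter is active and the
    local classifier flags likely nudity."""
    if user.filter_active and nudity_score(image) >= NUDITY_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=25))
    return image
```

The key point the sketch captures is that both the decision and the blur happen on the recipient's device, before the image is ever rendered.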
Full Protection with End-to-End Encryption
According to Meta, the analysis of images will be performed directly on users’ devices. This means that nudity protection will work even in end-to-end encrypted chats.
Because the images are processed locally, Meta will not gain access to users’ private messages; the content of the images remains completely private unless users choose to share it with Meta.
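That privacy claim rests on the order of operations: the check runs on the plaintext image on the device, and only ciphertext ever reaches the server. The sketch below illustrates that ordering only; the toy XOR cipher and the stub check are placeholders, not anything Meta actually uses (real end-to-end encrypted chats rely on vetted protocols such as the Signal protocol).

```python
import os
from hashlib import sha256

def local_nudity_check(image_bytes: bytes) -> bool:
    """Stub for the hypothetical on-device classifier."""
    return False

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy XOR keystream cipher, here only to mark where encryption sits
    in the flow; it is not secure and not what real messengers use."""
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(p ^ s for p, s in zip(plaintext, stream))

def send_image(image_bytes: bytes, shared_key: bytes) -> bytes:
    # 1) The nudity check runs on the plaintext, on the device.
    flagged = local_nudity_check(image_bytes)
    # 2) The flag travels inside the encrypted payload, so the server
    #    relays only ciphertext and never sees the image or the result.
    payload = bytes([int(flagged)]) + image_bytes
    return toy_encrypt(payload, shared_key)

# The server would receive only this opaque blob.
ciphertext = send_image(b"fake image bytes", os.urandom(32))
```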
New Measures Against Sextortion Scams
Meta also announced that it is developing new technology to protect users against sextortion scams, in which scammers use sexually explicit images or threats to extort money from victims. As part of this effort, it is testing pop-up messages that warn users when they interact with accounts that may be attempting this kind of scam.
Future Plans and Security Measures
Instagram’s direct messages are not yet end-to-end encrypted by default, but Meta plans to expand its encryption features to cover them. This would further increase the security of the platform and help protect users’ private information.
Legal Pressures in Europe and the US
Meta is under pressure in both the United States and Europe due to allegations that its apps have addictive effects and cause mental health problems among young people.
In particular, the attorneys general of states including California and New York have sued Meta, alleging that it misled the public about the dangers its platforms pose to young people. In Europe, the European Commission has asked Meta for detailed information about how it protects children.