Instagram will begin testing a feature that blurs images containing nudity sent in direct messages. The main goals are to protect teenagers and to prevent contact with potential scammers.
Meta announced that the protection feature for Instagram Direct will use on-device machine learning to detect whether a sent image contains nudity. The feature will be enabled by default for users under 18, and Meta will notify adult users to encourage them to turn it on.
"Since the images are analyzed on the device itself, protection against nude bodies will also work in end-to-end encrypted chats, where Meta will not have access to these images unless someone decides to report them to us," the company said.
Unlike Meta's Messenger and WhatsApp apps, direct messages on Instagram are not end-to-end encrypted, though the company says it plans to bring encryption to the service.
Meta also said it is developing technology to identify accounts that may be involved in sextortion scams, and it is testing new pop-up warnings for users who may have interacted with such accounts.
Source: Reuters