Meta announces new tools to tackle 'sextortion' and intimate image abuse

By Martyn Landi, PA Technology Correspondent
Meta has unveiled a range of new safety features designed to protect users, in particular young people, from “sextortion” and intimate image abuse.
The social media giant – which owns Facebook, Instagram and WhatsApp – has confirmed it will begin testing a nudity filter in Direct Messages (DMs) on Instagram.
Called Nudity Protection, this feature will be on by default for those aged under 18 and will automatically blur images sent to users which are detected as containing nudity, better protecting users from seeing unwanted nudity in their DMs.
Users receiving nude images will also see a message urging them not to feel pressure to respond, along with options to block the sender and report the chat.
With the filter turned on, people sending images containing nudity will also see a message reminding them to be cautious when sending sensitive photos, and be given the chance to unsend these photos.
The tool uses on-device machine learning to analyse whether an image contains nudity, meaning it will work inside end-to-end encrypted chats, and Meta said it will only see images if a user chooses to report them to the company.
“Financial sextortion is a horrific crime,” Meta said in a blog post on the updates.
“We’ve spent years working closely with experts, including those experienced in fighting these crimes, to understand the tactics scammers use to find and extort victims online, so we can develop effective ways to help stop them.
“Today, we’re sharing an overview of our latest work to tackle these crimes. This includes new tools we’re testing to help protect people from sextortion and other forms of intimate image abuse, and to make it as hard as possible for scammers to find potential targets – on Meta’s apps and across the internet.
“We’re also testing new measures to support young people in recognising and protecting themselves from sextortion scams.”

Elsewhere, the social media giant said it was testing new detection technology to help identify accounts potentially engaging in sextortion scams and limit their ability to interact with other users, especially younger ones.
Meta said message requests from these suspicious accounts would be routed straight to a user’s hidden requests folder.
For younger users, suspicious accounts will no longer see the “Message” button on a teenager’s profile, even if they are already connected, and the firm said it was testing hiding younger users from these accounts in people’s follower lists to make them harder to find.
Meta added that it was also testing new pop-up messages for people who may have interacted with such accounts – directing them to support and help if they need it.
In addition, the company said it was expanding its work with other platforms to share details about accounts and behaviours that violate child safety policies as part of the Lantern programme created last year.
“This industry cooperation is critical, because predators don’t limit themselves to just one platform – and the same is true of sextortion scammers,” Meta said.
“These criminals target victims across the different apps they use, often moving their conversations from one app to another.
“That’s why we’ve started to share more sextortion-specific signals to Lantern, to build on this important cooperation and try to stop sextortion scams not just on individual platforms, but across the whole internet.”
Lantern is a programme run between different tech companies which shares information about suspicious accounts.