Meta on Thursday unveiled a series of steps it’s taking to protect young people on its Instagram platform against “financial sextortion.”

What You Need To Know

  • Meta on Thursday unveiled a series of steps it’s taking to protect young people on its Instagram platform against “financial sextortion”

  • Sexual extortion is a form of blackmail in which someone threatens to share nude or sexual images of a victim unless the victim meets their demands, such as sending money

  • Meta announced it is launching a public service announcement campaign on Instagram to educate teenagers and young adults about how sextortion scams work and what to do if they have been targeted

  • The social media giant is also rolling out a suite of new features designed to prevent sextortion

Sexual extortion is a form of blackmail in which someone threatens to share nude or sexual images of a victim unless the victim meets their demands, such as sending money.

According to the National Center for Missing and Exploited Children (NCMEC), there were nearly 27,000 reports of sextortion in 2023, up more than 300% from 2021.

To combat the problem, Meta announced it is launching a public service announcement campaign on Instagram to educate teenagers and young adults about how sextortion scams work and what to do if they have been targeted.

The social media giant is also rolling out a suite of new features designed to prevent sextortion. They include blocking accounts that have shown signs of potentially “scammy behavior” from viewing a user’s follower and following lists and other interactions, making it harder for scammers to identify friends and family of the victim to whom they could threaten to send damaging images.

Instagram users whose activity raises red flags with Meta may also be blocked from making follow requests. And Instagram will no longer allow people to screenshot or screen record “view once” photos or videos sent in private messages.

Meanwhile, Meta is testing new safety notices that would let a user know when they are chatting with someone who may be based in a different country.

For the PSA campaign, Meta has partnered with NCMEC and Thorn, a nonprofit organization that builds technology to defend children from sexual abuse.

“Campaigns like this bring much-needed education to help families recognize these threats early,” John Shelton, a senior vice president for the NCMEC, said in a statement.

The video warns teens and young adults to be on the lookout for users who come on too strong, ask to trade photos or request that their conversation be moved to another app. It also says targeted users can take back control of the situation by not responding, reporting the account and not paying.

Acknowledging that victims may be too embarrassed or scared to ask for help, the campaign also tries to reassure them that sextortion is not their fault.

The video directs teens to instagram.com/preventsextortion, which includes tips for teens affected by sextortion scams and a link to NCMEC’s Take It Down tool, which can help people get nude or sexually explicit images taken of them before they turned 18 removed from the internet.

And the PSA points teens to the Crisis Text Line for live chat support.

“Our research at Thorn has shown that sextortion is on the rise and poses an increasing risk to youth,” Kelbi Schnabel, senior manager at Thorn, said in a statement. “It’s a devastating threat – and joint initiatives like this that aim to inform kids about the risks and empower them to take action are crucial.”

Meta said it’s also turning to Instagram content creators popular with teens and parents to help raise awareness about the scams.

The company said last week it removed 800 Facebook Groups and 820 accounts affiliated with the Yahoo Boys, a group that attempts to organize, recruit and train sextortion scammers. In July, Meta said it removed around 7,200 Facebook assets engaged in similar behavior.

In April, Meta announced it was testing a new feature, enabled by default for Instagram users under 18, that blurs images sent in direct messages when it detects they contain nudity. That feature is now being rolled out globally.

Last month, Instagram launched Teen Accounts, which default to stricter message settings, including that users under 18 cannot receive messages from anyone they don’t follow or aren’t connected to. Users younger than 16 need a parent’s permission to switch to less protective settings.