Instagram defends new teen safety features after criticism
Instagram has defended new features aimed at protecting teens from sextortion attempts on the platform, following criticism they do not go far enough.
Parent company Meta said on Thursday its new tools – which include preventing screenshots or screen-recordings of disappearing images and videos – were part of “ongoing efforts” to stop criminals tricking teens into sending intimate images to scammers.
The NSPCC said the moves were a “step in the right direction”.
But Arturo Béjar, a former Meta employee turned whistleblower, told BBC News there were easier ways Instagram could protect young people from unwanted contact.
“The most impactful thing they could do is make it easy for a teen to flag when they think the account asking to follow them is pretending to be a teen,” Mr Béjar said.
“The way the product is designed, by the time they need to report for sextortion the damage is already done.”
Meta said its tools, developed using user feedback, give teens clear and straightforward ways to report inappropriate behaviour or harassment.
It said it also offers dedicated mechanisms for flagging unwanted nude images and prioritises such reports. The company added it is inaccurate to suggest people cannot report accounts pretending to be teens, as it has options to report fraud or scams.
Richard Collard, the NSPCC’s associate head of child safety online policy, said: “Questions remain as to why Meta are not rolling out similar protections on all their products, including on WhatsApp where grooming and sextortion also take place at scale.”
The UK’s communications watchdog Ofcom warned that social media companies will face fines if they fail to keep children safe.
What is sextortion?
Sextortion, which sees scammers trick people into sending sexually explicit material before blackmailing them, has become a dominant form of intimate image abuse.
Law enforcement agencies around the world have reported a rise in the number of sextortion scams taking place across social media platforms, with these often targeting teenage boys.
The UK’s Internet Watch Foundation said in March that 91% of sextortion reports it received in 2023 related to boys.
The shame, stress and isolation felt by victims of sextortion crimes, who are harassed and told their images will be shared publicly if they do not pay blackmailers, has led some to take their own lives.
Parents of teenagers who have died after being targeted have called on social media firms to do more to stop it.
Ros Dowey, the mother of 16-year-old Murray Dowey, who died by suicide in 2023 after being targeted by a sextortion gang on Instagram, previously told the BBC that Meta was not doing “nearly enough to safeguard and protect our children when they use their platforms”.
‘Built-in protections’
Meta said its new safety features and campaign are designed to build on tools already available to teens and parents on the platform.
Antigone Davis, Meta’s head of global safety, said a new Instagram campaign aims to give children and parents information about how to spot sextortion attempts in case perpetrators evade its tools for detecting them.
“We have put in built-in protections so that parents do not have to do a thing to try and protect their teens,” she told BBC News.
“That said, this is the kind of adversarial crime where whatever protections we put in place, these extortion scammers are going to try and get around them.”
Instagram will hide people’s follower and following lists from potential sextortion accounts, and let teens know if they are speaking to someone who appears to be in a different country.
Sextortion expert Paul Raffile told the BBC in May that sextorters try to find teen accounts in following and follower lists after searching for high schools and youth sports teams on platforms.
Instagram will also prevent screenshots of images and videos sent in private messages using its “view once” or “allow replay” options, which users can select when sending an image or video to others.
Users will not be able to open these forms of media at all on Instagram web.
But Mr Béjar said it could give people “a false sense of security” as attackers could photograph an image on a screen using a separate device.
According to Meta, the feature goes beyond protections offered by other social media platforms that tell users when their images or videos have been screenshotted, but do not prevent it.
Mr Béjar – who has called on the platform to create a button that lets teens straightforwardly report inappropriate behaviour or contact – also said nude images sent to younger teens should be blocked, not just blurred.
He added that younger users should have clearer, stronger warnings about sending such images than those currently offered.
Meta says its nudity protections were designed in liaison with child protection experts to educate people about the risks of seeing and sharing such images in a way that does not shame or scare teens by disrupting conversations.
The company is currently moving under-18s into Teen Account experiences on Instagram with stricter settings turned on by default – with parental supervision required for younger teens to turn them off.
But some parents and experts have said safety controls for teen accounts shift the responsibility of spotting and reporting potential threats onto parents.
Dame Melanie Dawes, the chief executive of the regulator Ofcom, told the BBC it was the responsibility of the firms – not parents or children – to make sure people were safe online, ahead of the implementation of the Online Safety Act next year.