By Julie Jargon
When Instagram began fencing off teen accounts last year for safety reasons, content from people under 18 all but vanished for adults.
Teen accounts were automatically private, and posts and reels from those accounts no longer circulated in the Explore tab and main feeds, except to adults who were already following them. Predators suddenly had a much harder time finding targets.
But two moms affiliated with the family-advocacy organization ParentsTogether Action discovered a workaround, which they shared with me: When a teen account comments on a public post or video reel, and an adult account that hasn't already been flagged for suspicious behavior sees it, the adult can chat up the teen in the comments and even send that teen a follow request. If the teen accepts, the two can engage in private direct messaging.
The two moms conducted six tests over a six-month period. In one instance, a test teen account commented on a public reel of a Taylor Swift singalong. The adult test account replied to the fake teen's comment and sent a follow request. Once it was accepted, the fake accounts could exchange direct messages and even swap nude images, as in the sextortion cases that are plaguing social media.
"This test doesn't capture the full range of safety features that activate when there are actual signs of potential harm," said a spokeswoman for Meta Platforms, which owns Instagram. The system looks at factors including how new the adult account is and whether it originates outside the U.S.
"Accounts showing suspicious behavior -- including possible sextortion -- are blocked from requesting to follow teens. We warn teens if they interact with these accounts, and provide extra information about who they're messaging," the spokeswoman added.
In other words, Instagram is likely to flag a shady account that tries to connect with a teen, but other less suspicious accounts could make contact.
Warnings and restrictions
In recent years, sextortion has evolved into financially motivated scams in which adults -- often based outside the U.S. -- pose as teen girls to target boys. The perpetrator sends a boy a nude photo of a girl, pressuring him to send photos of himself in return. When the boy does, the perpetrator says the photos will be shared with the boy's social media followers if he doesn't pay a ransom.
These cases, which often start on Instagram and migrate to Snapchat or Apple Messages, can have devastating outcomes, with some boys taking their own lives.
Mary Rodee, whose son died by suicide after being targeted in a sextortion scam, says the Instagram workaround is a concern. She has run similar versions of the test, each time using the adult accounts to reply to the teen accounts' comments on different celebrity fan pages.
"Adults have the ability to send friend requests to teens by being in a shared online space," she said.
Making new and existing teen accounts private is a big step Instagram has taken toward protecting teens. Only people approved by teens can communicate with them and see what they post. People 15 and under can't change that setting without parental permission.
If teens try to create new accounts with an older birthdate, they will be prompted to either upload an ID or a video selfie for Instagram's face-based age-prediction tool. The company is also using AI to police existing accounts that have dubious birthdates.
Instagram alerts teen accounts that might already be in touch with suspicious actors. When teens are notified that they're chatting with someone in a different country, they also receive a warning that says requests for photos and money could be a scam. In June alone, teens blocked suspicious accounts a million times and reported another million, the company said.
Even if teens are interacting with accounts that haven't been deemed suspicious, the spokeswoman said, a "new" label appears on newly created account profiles. When teens start a new chat with someone, they can see the date someone joined Instagram and whether they have mutual friends.
Still, the loophole uncovered by ParentsTogether Action appears to allow unflagged adult accounts to find teen accounts and interact with them on public posts and reels to gain teens' trust.
Sharing nudes is easy
Rodee's son, Riley Basford, was extorted on Facebook in 2021 when he was 15 years old. He had received a follow request from what appeared to be a teenage girl. Instead, Rodee later learned, it was a man in the Ivory Coast who threatened to share the nude photo Riley had sent. Faced with the sextortion threats, the boy took his own life.
Rodee wanted to know if Instagram's changes -- which are also in effect on Facebook -- could have protected her son.
When the test accounts shared nude images with one another, they initially appeared blurred in the teen account, but the teen account user could opt to view the photos. Instagram says its nudity protection feature -- on by default for teen accounts and including warnings about the dangers of sharing such images -- has encouraged teens to think twice. In June, more than 40% of blurred images received in direct messages remained blurred, the company says.
Meta's moves to protect teens are clearly making a difference. But parents shouldn't be lulled into a false sense that their kids are perfectly safe. It's still important to pay attention to your teen's activities on social media and have frank, open discussions about the evils that might lurk behind the screens.
Write to Julie Jargon at Julie.Jargon@wsj.com
(END) Dow Jones Newswires
December 06, 2025 05:30 ET (10:30 GMT)
Copyright (c) 2025 Dow Jones & Company, Inc.