Instagram Enhances Safety Features to Protect Teen Users

By Lisa Wong

Instagram recently rolled out a wave of new protective features to shield its youngest and most vulnerable users, teens, from online harassment. The social media platform, owned by Meta, is implementing these changes to improve the user experience and address growing concerns about mental health and online safety among young people.

The new features will automatically shift accounts that focus mainly on kids into the app's most stringent messaging defaults. This simple, proactive step goes a long way toward keeping unwelcome sexual solicitations at bay and minimizing the threat of harassment. The platform is also rolling out its "Hidden Words" feature, a tool that filters out abusive and misogynistic comments to promote a safer environment for teens.

Meta also recently announced new features targeting Direct Messages (DMs) in Teen Accounts, an app experience designed for a younger audience with built-in protections applied automatically. The company has been focused on protecting teens for years: in June, Instagram and Meta combined blocked over a million accounts per day for perceived safety issues, and they report that users blocked and reported more than a million additional accounts after being served safety warnings.

Just last month, more than 40% of blurred images sent via DMs on Instagram remained blurred, a clear indication that thousands of users are making effective use of the platform's protective measures. Another new feature shows the month and year an account joined Instagram in the header of new chats, giving users deeper context for their conversations.

Meta has ramped up efforts to address mental health issues related to social media use, and the company is understandably eager to show it is taking steps to address these concerns. These changes include new safety features rolling out on Instagram and other platforms.

“These new features complement the safety notices we show to remind people to be cautious in private messages and to block and report anything that makes them uncomfortable – and we’re encouraged to see teens responding to them,” said a Meta representative.

Meta also recently released Restricted Teen Accounts on Facebook and Messenger. This step increases restrictions for teens and gives parents greater control over their children's online communications.

“While these accounts are overwhelmingly used in benign ways, unfortunately, there are people who may try to abuse them, leaving sexualized comments under their posts or asking for sexual images in DMs, in clear violation of our rules,” added the spokesperson from Meta.

These changes are a positive sign that Instagram is serious about protecting its young users and providing a safer online environment. By implementing stricter controls and adding filtering features, Instagram aims to shield teens from potential harm while allowing them to engage with the platform confidently.