In a recent blog post, Instagram announced that it would implement new security features designed to protect its younger community of users.
With the intent of shielding teenagers from inappropriate adult interactions, Instagram is rolling out a new feature that prohibits adults from sending direct messages to users under the age of 18.
This feature relies on machine learning algorithms that predict users’ ages, combined with the ages users provide when they sign up.
In addition to preventing conversations between adults and teens, Instagram will start showing reminders, or safety notices, that encourage teenagers to be cautious when engaging in conversations with strangers.
These safety notices will alert teenage users in their DMs when an adult has been displaying potentially suspicious behaviour.
For instance, if an adult sends a large number of friend or message requests to users under the age of 18, Instagram will notify the recipients and give them the option “to stop the conversation, block, report or restrict the adult.”
Adults who have been exhibiting potentially suspicious behaviour will also be prevented from finding teen accounts in ‘Reels,’ ‘Explore,’ or ‘Suggested Users.’
In another step to enhance safety, Instagram is encouraging teens to make their accounts private.
This setting offers more protection as it enables them to better control who sees and interacts with their content.
The sign-up process now includes an extra step that gives teenagers the option of making their account public or private.
Even if a teenager opts for a public account, Instagram will send them a notification highlighting the benefits of a private account.
Finally, Instagram has partnered with The Child Mind Institute and ConnectSafely and published a new Parents Guide to help parents navigate discussions with teens about their online presence.