Snapchat says it’s working to make its app even safer for teen users.
Parent company Snap said Thursday that it is rolling out a suite of new features and policies aimed at better protecting 13- to 17-year-old users, including restrictions on friend suggestions and a new system for removing age-inappropriate content. The company also launched a series of YouTube videos for parents about the features and an updated website laying out its teen safety and parental control policies.
The new features come amid increasing pressure on social media platforms by lawmakers, educators and parents to protect young users from inappropriate content, unwanted adult attention, illicit drug sales and other issues. A Snap executive testified alongside leaders from TikTok and YouTube in a fall 2021 Senate committee hearing about youth safety on social media, promising new tools to help parents keep their teens safe. And since then, Snapchat — like other platforms — has rolled out a variety of new teen safety and parental supervision tools.
Thursday’s announcement follows the launch last year of Snapchat’s Family Center, which offers parents more insight into who their teenagers are communicating with on the messaging app. The app’s other existing teen safety measures include prohibiting young users from having public profiles and having teens’ Snap Map location-sharing tool turned off by default.
As part of Thursday’s feature rollout, Snapchat will now require that 13- to 17-year-old users have a greater number of mutual friends with another account before that account shows up in Search results or as a friend suggestion, in an effort to keep teens from adding users on the app whom they don’t know in real life. The app will also send a pop-up warning to teens if they are about to add an account that doesn’t share any mutual Snapchat friends or phone book contacts.
“When a teen becomes friends with someone on Snapchat, we want to be confident it is someone they know in real life — such as a friend, family member, or other trusted person,” the company said in a blog post.
Snapchat will also impose a new strike system for accounts promoting content inappropriate for teens in its Stories and Spotlight sections, where users can share content publicly on the app. If inappropriate content is reported or detected by the company, it will immediately remove the content and issue a strike against the poster’s account. If a user accrues “too many strikes over a defined period of time, their account will be disabled,” the platform says, although it does not specify how many strikes would lead to an account being disabled.
Teen users will also start to see in-app content aimed at educating them on online risks such as catfishing and financial sextortion — when someone persuades a victim to share nude photos and then blackmails them for money — and letting them know what to do if they see it, including providing hotlines to contact for help. The PSA-style content will be featured on Snapchat’s Stories platform and in response to certain search terms or keywords.