After settling lawsuit, Snapchat adds new parental controls for teens

Written on 01/22/2026

Snap adds new monitoring tools for parents to its Family Center, including ways to monitor a teen's app use and contacts.

A phone screen displays the Family Center home screen on the Snap app.

New ways to monitor your teen's phone use are coming to Snapchat, as the app adds new screen time and contact monitoring tools for parents.

Starting today, parents and guardians linked to teen accounts will be able to see a weekly breakdown of the average amount of time their teen spent on the app, as well as the types of activity the teen engages in, including chatting, taking pictures, or scrolling through the Snap Map. Parents will also be able to view additional details about their teen's new contacts, such as mutual friends lists and the Snap communities they've joined.

Snap launched its Family Center parental monitoring hub in 2022 and has since debuted additional safeguards for users, including content and AI restrictions, friends list visibility, and location alerts, as it cracks down on inappropriate content and predatory behavior by adult users.

"Family Center is designed to reflect the dynamics of real-world relationships by providing visibility into what teens are doing and allowing parents to adjust key settings, without showing the content of their private conversations," wrote Snap in a press release regarding the new parental controls. "We work diligently to protect teens on our platform while giving parents and caregivers the tools to play an active role in their teen’s experience on Snapchat."

Just yesterday (Jan. 21), the social media giant avoided a jury trial by settling a lawsuit brought by a 19-year-old user who alleged the platform's algorithm — and those of its competitors, including Meta, YouTube, and TikTok — is dangerously designed to foster addictive behavior and mental health issues. Snapchat employees had previously warned of mental health risks to young users, court documents revealed. The case fits a broader pattern: platforms such as Instagram and Snapchat have faced an onslaught of lawsuits accusing the companies of not doing enough to protect young users, even when risks were flagged by internal leadership.

Last year, the platform joined other companies, including the embattled Roblox, in backing the 2025 Take It Down Act, which aims to provide legal recourse for victims of non-consensual intimate imagery (NCII) and deepfakes. Snap has also previously partnered with the National Center on Sexual Exploitation (NCOSE) and the National Center for Missing and Exploited Children (NCMEC).