Meta is expanding its Teen Accounts feature to Facebook and Messenger, after first launching it on Instagram in September 2024. The feature will initially be available in the U.S., U.K., Canada, and Australia, with plans to roll out to additional countries in the future.
Teen Accounts automatically place teen users into built-in protections, including limits on who can message or interact with them and restrictions on sensitive content. The safeguards were first rolled out on Instagram in response to mounting criticism from U.S. lawmakers and health experts concerned about teen safety and mental health on social media platforms.
As part of the rollout, Meta confirmed to TechCrunch that teens will receive messages only from people they already follow or have previously messaged, and that stories, tags, mentions, and comments will be restricted to friends and followers. The company is also adding reminders that nudge teens to log off after an hour of daily use, along with a Quiet Mode that switches on automatically overnight.
Meta is strengthening Instagram’s parental controls as well. Teens under 16 will now need a parent’s permission to go live or to turn off the feature that blurs images suspected of containing nudity in their DMs.
The updates come amid mounting scrutiny from U.S. health authorities, including the Surgeon General, and from state governments that have begun enacting laws requiring parental consent for minors to use social media.
Meta says more than 54 million Instagram users have been moved into Teen Accounts since the feature’s launch, and that 97% of teens aged 13 to 15 have kept the protections enabled. The company also shared results from an Ipsos survey it commissioned: 94% of parents said Teen Accounts are helpful, and 85% said the feature supports positive experiences for teens on Instagram.