Meta has announced the global rollout of its Teen Accounts feature on Facebook and Messenger, extending protections that were previously limited to Instagram and, on those two apps, available only in the U.S., U.K., Australia, and Canada. The expansion comes as social media platforms face growing pressure to safeguard young users online.
Teen Accounts, first launched on Instagram last fall, are designed to give teens a safer experience through built-in protections and parental controls. With the expansion, teens on Facebook and Messenger will automatically be placed into restricted settings that limit exposure to inappropriate content and unwanted contact.
For users under 16, parental approval is required to adjust account settings. Messaging is restricted to people teens already follow or have previously messaged, and only friends can view or reply to stories. Tags, mentions, and comments are also limited to friends or approved followers. In addition, teens will receive daily reminders to take breaks after an hour of use and will automatically be placed into “Quiet mode” overnight.

The announcement comes amid growing concern over youth mental health and online safety. A recent whistleblower-led study found that teens enrolled in Teen Accounts can still encounter harmful content on Instagram, including posts about self-harm and sexual exploitation. Meta has disputed the findings, emphasizing its ongoing efforts to improve safety.
Alongside the expansion, Meta also launched its School Partnership Program in the U.S., allowing educators to directly report safety concerns such as bullying for faster review and removal on Instagram. Schools that participate will receive a banner on their Instagram profiles, signaling to parents and students that they are official partners.

The program, piloted earlier this year, received positive feedback from schools and will now be open to all middle and high schools across the U.S. The initiative also provides access to educational resources designed to help schools and families address online safety.
Meta’s latest measures align with mounting calls from regulators and public health officials, including the U.S. Surgeon General, for social platforms to do more to protect young users. Some U.S. states have already begun imposing restrictions that require parental consent for teens to access social media.