Meta is rolling out tools designed to protect young users online.
While Meta has previously introduced various initiatives aimed at safeguarding teenagers, the company has often been criticized for not doing enough. Recently, even Meta’s VR headsets have come under scrutiny, following allegations that the company deliberately suppressed data from a study.
In this context, last April, Meta announced the creation of accounts specifically designed for teenagers on Facebook and Messenger, following their earlier integration on Instagram in several English-speaking countries.
The move is intended to reassure parents and regulators by regulating how 13- to 17-year-olds use social media, building on earlier efforts that used AI to identify minors pretending to be adults.
Rolling out teen accounts globally after a test phase
Initially launched in the United States, Canada, Australia, and the United Kingdom, these teen-specific accounts are set to be made available to young users worldwide. Justifying this expansion, Meta claims that “hundreds of millions of young people” have already tested these accounts without any reported incidents.
According to Meta, these accounts will be automatically activated for users aged 13 to 17 and will include default restrictions; users under 16 will not be able to change these settings without parental consent.
A way to limit exposure and regulate usage?
Designed to minimize the risks associated with social media use, particularly in response to claims that these platforms harm teens’ mental health, the accounts restrict access to content deemed inappropriate and limit interactions to people the teen already follows or is connected with.
Key features of the teen accounts include reminders to exit the app after 60 minutes and limits on nighttime notifications, along with enhanced parental controls, although Meta has not provided further details on this aspect.
A message to families and regulators…
By implementing these measures across all its social networks, Meta seeks to align with the child protection standards required by various countries. Last year, Australia had already taken a firm stand, and Florida went as far as announcing a ban on social media use for those under 14.
Meta thus hopes to improve its image among parents and quiet criticism from health and educational authorities, but it remains to be seen whether these new accounts will actually win them over…

Samantha Klein is a seasoned tech journalist with a sharp focus on Apple and mobile ecosystems. With over a decade of experience, she brings insightful commentary and deep technical understanding to the fast-evolving world of consumer technology.