In response to ongoing concerns about the safety of young users online, and following recent measures to identify minors posing as adults, Instagram is taking a new direction.

As recently announced on Meta’s website, the social network is set to undergo a significant overhaul of its moderation system for minors. Drawing inspiration from the American film rating system, this update will implement a so-called “PG-13” approach, with stricter content restrictions for teenagers.

A Film-Inspired Model to Screen Posts

In the late 1960s, the American film industry developed its own classification system, with ratings running from "G" for general audiences up to the most restrictive categories, in order to avoid government-imposed regulation. The "PG-13" rating, added in 1984, signals that some material may be inappropriate for children under 13.

This is the model that Meta now intends to apply to Instagram. Specifically, accounts belonging to teenagers aged 13 to 17 will automatically be placed in a setting designed to mirror the PG-13 moviegoing experience.

Meta has stated that posts featuring profanity, suggestive poses, risky challenges, or content related to alcohol and drugs will be hidden or left out of recommendations. The restrictions also extend to search: certain keywords linked to adult content will now be blocked.

Accounts that frequently share content deemed inappropriate will no longer be able to interact with teenagers or appear in their recommendations.

New Control Tools for Parents

Beyond automatic filters, Meta is introducing a setting for parents who want stricter controls. It further restricts which posts are visible and lets parents disable comments or certain interactions with the platform's built-in AI features.

According to Meta, the update is backed by feedback from thousands of parents, 95% of whom found the new settings helpful and easy to understand.

Parents will also be able to report posts they consider inappropriate, helping Instagram's teams refine the classification system.

Protection Still Imperfect

This update comes amid increasing pressure on Meta and its competitors, which stand accused of contributing to mental health problems among young users.

Unfortunately, this protection may still be inadequate. A recent survey by Ofcom found that 22% of 17-year-olds claimed to be 18 or older on social media platforms.

Meta also points to identity verification and video-selfie checks, alongside other measures already in place and more expected to roll out by 2026.