Instagram is fundamentally changing the experience for its teenage users by introducing a content moderation system modeled on the PG-13 film rating. The move from parent company Meta aims to create a more curated and protected environment for users under 18.
The most significant change is that this heightened level of protection will be the new default. Every teen account will automatically be placed in a “13+” setting. While opting out is possible, it will require a parent’s permission, effectively making it a joint decision between the teen and their guardian.
The new “PG-13” filter will be more aggressive in what it restricts. It will go beyond existing rules to hide or deprioritize content featuring profanity, dangerous stunts that could inspire copycat behavior, and imagery that normalizes harmful activities. Furthermore, certain search terms will be blocked to prevent discovery of inappropriate content.
The context for this change is crucial. It comes after a report from critics, including a former Meta whistleblower, found that the platform’s safety tools were largely failing. While Meta disputes those findings, the PG-13 system is a tangible response to the growing chorus of voices demanding better protection for young users.
The updates will be implemented first in the US, UK, Australia, and Canada, with a global rollout to follow. Safety advocates, however, are adopting a “wait-and-see” approach, arguing that Meta’s announcements need to be backed by verifiable proof that the new measures are making a real difference.