MENLO PARK, USA — In its most significant move to protect young users, Instagram announced new “teen account” settings on Tuesday, September 17, 2024.
These settings automatically shift millions of users under 18 to private accounts and impose stricter content restrictions.
The update is part of parent company Meta’s broader effort to address growing concerns about the platform’s impact on young people’s well-being.
The changes, which will roll out next week in select countries, including the United States, are designed to limit teens’ interactions on the platform and reduce their exposure to potentially harmful content.
The update applies to both new and existing accounts of users under 18 and is intended to enhance parental oversight and promote healthier social media habits.
A Response to Mounting Pressure
The new restrictions arrive nearly three years after the infamous “Facebook Papers” exposed internal concerns about the dangers Instagram posed to young users.
Despite previous safety measures, Meta has continued to face pressure from lawmakers, parents, and advocacy groups to do more to protect teens from the platform’s darker corners.
This pressure reached new heights last November when whistleblower Arturo Bejar testified before a Senate subcommittee, accusing Meta executives, including CEO Mark Zuckerberg, of ignoring warnings about the platform’s harmful effects on teens.
Court documents from ongoing lawsuits against Meta have also revealed allegations that Zuckerberg obstructed initiatives designed to improve teen safety, and that the company failed to act on accounts belonging to children under the age of 13.
Meta has also faced accusations of allowing child predators to exploit its platform.
At a Senate hearing in January, Zuckerberg publicly apologized to families whose children had been harmed by social media, acknowledging the need for improved safety measures.
What the New ‘Teen Accounts’ Settings Entail
The latest update automatically sets all teen accounts to private by default.
Teen users aged 16 or 17 can manually change their settings, but younger users, aged 13 to 15, will require parental approval to make any changes.
The restrictions will also limit who can tag teens in posts or mention them in comments, confining this ability to users the teen follows.
In an effort to combat overuse, Instagram will now prompt teen users to take a break after 60 minutes of use each day.
Additionally, the app will enter “sleep mode” between 10 p.m. and 7 a.m., silencing notifications and automatically responding to direct messages.
Parental oversight tools will also be expanded, allowing parents to monitor who their teen is messaging, set time limits for app use, and even block access to Instagram during certain hours, including overnight.
Teens will also experience stricter content filters.
The platform will restrict the visibility of sensitive material, including posts promoting cosmetic procedures or other age-inappropriate content, in the Explore page and Reels.
These changes build upon existing measures, such as Instagram’s “take a break” nudges and earlier attempts to filter harmful content, including posts about eating disorders.
Limitations and Future Challenges
While Meta’s new features represent a significant step toward addressing concerns about teen safety, critics argue that challenges remain.
One notable issue is the difficulty in verifying parental oversight. Meta currently does not formally verify whether an adult is the actual parent of a teen account holder, relying instead on indirect signals like the adult’s birthdate and the number of accounts they supervise.
This could allow non-parents, such as older friends or relatives, to take on a supervisory role, potentially undermining the intended protections.
Additionally, Meta has long struggled to prevent teens from lying about their age when creating new accounts to bypass restrictions.
The company is now using artificial intelligence to flag accounts that may have falsely listed an adult birthdate, in an effort to catch teens who misrepresent their age to evade safety measures.
A Global Rollout
Meta plans to roll out these changes in the U.S., U.K., Canada, and Australia over the next 60 days, before expanding the update to other countries in the coming months.
The new safety measures were developed in consultation with Meta’s Safety Advisory Council, which includes independent safety experts and youth advisors, as well as with feedback from parents and teens.
With these changes, Instagram aims to address some of the most pressing concerns from parents and policymakers, hoping to create a safer environment for its youngest users.
However, as teens and their online habits evolve, Meta will continue to face scrutiny over whether its efforts to protect vulnerable users are sufficient.