Instagram on Wednesday rolled out a series of updates aimed at making the platform safer for teen users by giving parents more support and resources on the app.


What You Need To Know

  • Instagram on Wednesday rolled out a series of updates aimed at making the platform safer for teen users by giving parents more support and resources on the app

  • The updated supervision tools, which are currently active in the U.S. and will be rolled out globally over the coming months, can be found in the new Family Center portion of the app

  • The supervision controls are optional, and the teen user must consent to participate; the rules can be terminated by either child or parent at any time

  • Parents and guardians can now view how long their teen spends on Instagram and can set time limits for the app

The updated supervision tools, which are currently active in the U.S. and will be rolled out globally over the coming months, can be found in the new Family Center portion of the Instagram app. Parents of teens aged 13 to 17 can now view how much time their child spends on Instagram and set time limits, can be notified when their child reports a user on the app, and can see updates on what kind of content their teen is viewing.

The supervision controls are optional, and the teen user must consent to participate; the rules can be terminated by either child or parent at any time. The controls automatically expire when a user turns 18, based on the birth date provided when the account was created.

“Teens will need to initiate supervision for now in the app on mobile devices, and we will add the option for parents to initiate supervision in the app and on desktop in June,” Adam Mosseri, head of Instagram, wrote in a blog post. “Teens will need to approve parental supervision if their parent or guardian requests it.”

The Family Center will also serve as an educational hub for parents and guardians to learn about the new tools, and will have videos and other resources to teach adults “how to talk to teens about social media.” 

Under Instagram’s user agreement, individuals under the age of 13 are not allowed to use the app. Meta Platforms Inc., Instagram’s parent company, has acknowledged that children often gain access to its platforms by lying about their age, but says it removes accounts if it determines the users are underage. 

Wednesday’s announcement is “just one step on a longer path,” Mosseri added, with similar parental controls set to roll out across Meta’s other platforms.

Over the coming months, the company plans to roll out parental supervision tools on its virtual reality headset Quest. Beginning in April, parents will be able to prevent their children from buying games they deem inappropriate. By May, teens will be automatically blocked from downloading age-inappropriate content. 

Meta, and social media platforms at large, have faced mounting pressure from lawmakers and parents alike to promote safety on their sites. 

The U.S. Surgeon General in late 2021 released a rare public advisory pointing in part to media companies and their potential role in negatively impacting youth mental health, saying that while some programs “can have a powerful impact on young people,” the onslaught of “false, misleading, or exaggerated media narratives can perpetuate misconceptions and stigma against people with mental health or substance use problems.” 

The report urged social media companies to take a number of steps to curb their negative impact on adolescent mental health, the first of which, officials said, should be to provide the government with more accurate data on the behavioral effects of time spent online. 

Meta has updated its user guidance a number of times since the report was published. In December, Instagram launched its “Take A Break” feature, which notifies teen users when they have been fixated on a particular subject for too long or have been using the app for an extended period of time.