Instagram Teen Accounts: A New Solution to Protect Young Users?
Meta is rolling out Teen Accounts on Instagram, which will limit how younger users engage on the platform. These accounts will automatically be assigned to users under 16.
Although social media platforms are officially restricted to users aged 13 and older, many younger children manage to create accounts anyway. Children under 13 are active on platforms like Instagram and TikTok, where they risk being exposed to harmful content and online predators. Criminals use social media to target minors, and Instagram has been criticized for its negative impact on young users' mental and physical health.
To address these issues, Meta has implemented various safeguards. For instance, minors cannot receive messages from users they don't follow. Now, Meta has introduced "Teen Accounts" to better protect younger users by limiting their interactions and usage on Instagram.
In the coming weeks, users aged 13 to 15 will receive notifications that their profiles are switching to "Teen" mode. This new account type aims to address common parental concerns: what content teens are exposed to, who can contact them online, and how they spend their time on the platform.
By default, Teen Accounts will be private: users must approve new followers, and people who don't follow them won't be able to view or interact with their content. Teens can be tagged or mentioned only by accounts they follow, and the most restrictive anti-bullying settings will be enabled, automatically filtering offensive language out of comments and direct messages.
Sensitive content will be restricted, limiting its visibility in Explore and Reels. Additionally, a sleep mode will be activated from 10 p.m. to 7 a.m., silencing notifications. Teens will also receive daily reminders to log off after 60 minutes of use.
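To make those defaults concrete, here is a minimal sketch in Python of how such a bundle of protections could be represented and applied to an under-16 account. The field names and the `apply_teen_defaults` function are illustrative assumptions based on the behavior described above, not Meta's actual code.

```python
from dataclasses import dataclass
from datetime import time

# Illustrative sketch only: field names and defaults are assumptions
# based on the protections described in the article, not Meta's implementation.
@dataclass
class AccountSettings:
    private: bool = False
    followers_must_be_approved: bool = False
    tags_and_mentions: str = "everyone"        # or "following_only"
    hidden_words_filter: str = "off"           # anti-bullying comment/DM filter
    sensitive_content: str = "standard"        # visibility in Explore and Reels
    sleep_mode_start: time | None = None       # notifications silenced
    sleep_mode_end: time | None = None
    daily_reminder_minutes: int | None = None  # nudge to log off

def apply_teen_defaults(settings: AccountSettings, age: int) -> AccountSettings:
    """Apply the default restrictions described above to users under 16."""
    if age < 16:
        settings.private = True
        settings.followers_must_be_approved = True
        settings.tags_and_mentions = "following_only"
        settings.hidden_words_filter = "most_restrictive"
        settings.sensitive_content = "less"
        settings.sleep_mode_start = time(22, 0)   # 10 p.m.
        settings.sleep_mode_end = time(7, 0)      # 7 a.m.
        settings.daily_reminder_minutes = 60
    return settings

if __name__ == "__main__":
    teen = apply_teen_defaults(AccountSettings(), age=14)
    print(teen)
```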
Parents can now monitor and manage their teen's Instagram activity more closely, including blocking the app entirely if needed. Teens under 16 need parental approval to loosen any of the protection settings. Parents can't read private messages, but they can see who their teen has messaged in the past week and which content topics they've chosen to see. They can also set daily time limits and block Instagram during specific hours.
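The parental-approval rule amounts to a simple gate, sketched below; the function name and parameters are hypothetical.

```python
# Hypothetical sketch of the approval rule described above:
# teens under 16 cannot loosen a protection without a parent's sign-off.
def can_change_protection_setting(age: int, parent_approved: bool) -> bool:
    """Return True if the user may relax a default protection."""
    if age < 16:
        return parent_approved
    return True

assert can_change_protection_setting(14, parent_approved=False) is False
assert can_change_protection_setting(14, parent_approved=True) is True
assert can_change_protection_setting(17, parent_approved=False) is True
```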
The rollout began on September 17, with full implementation expected within 60 days in the U.S., UK, Canada, and Australia, and globally by January 2025.
The main challenge remains age verification, since minors can simply lie about their birthdate. Instagram doesn't consistently verify ages at sign-up. Meta uses AI to estimate a user's age from activity signals, such as "happy birthday" messages from friends or a linked Facebook account, but these checks can still be bypassed.
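As a rough illustration of this kind of signal-based estimation, the sketch below scans friends' birthday wishes for an age and cross-checks it against the age the user claims. The patterns, signals, and under-16 threshold are purely hypothetical; Meta's actual models are not public.

```python
import re

# Purely hypothetical heuristic sketch of signal-based age estimation.
# The regex, signals, and threshold are illustrative assumptions only.
BIRTHDAY_PATTERN = re.compile(r"happy\s+(\d{1,2})(st|nd|rd|th)?\s+birthday", re.IGNORECASE)

def ages_from_birthday_comments(comments: list[str]) -> list[int]:
    """Extract ages mentioned in birthday wishes, e.g. 'Happy 14th birthday!'."""
    ages = []
    for text in comments:
        match = BIRTHDAY_PATTERN.search(text)
        if match:
            ages.append(int(match.group(1)))
    return ages

def estimate_is_minor(stated_age: int,
                      birthday_comments: list[str],
                      linked_facebook_age: int | None = None) -> bool:
    """Flag the account if any signal suggests the user is under 16."""
    signals = ages_from_birthday_comments(birthday_comments)
    if linked_facebook_age is not None:
        signals.append(linked_facebook_age)
    # Distrust the stated age when independent signals disagree with it.
    return stated_age < 16 or any(age < 16 for age in signals)

# Example: the user claims to be 18, but a friend's comment says otherwise.
print(estimate_is_minor(18, ["Happy 14th birthday!!"]))  # True
```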