Starting Tuesday, February 11, Instagram, one of the most popular social platforms among teenagers, will apply new restrictions to accounts of users aged 13 to 17 in Brazil. Meta, the company behind the social network, introduced these changes to give young users greater protection and control, especially in light of growing concerns about the impact of social media on the safety, mental health, and well-being of children and teenagers. The changes are being rolled out not only in Brazil but also in other countries, such as India, as part of Meta’s broader effort to improve the experience of younger users on the platform.
Automatic Protections for Minors
The centerpiece of the update is the new “teen mode,” which raises the default protection barriers for teen accounts. Instagram will now automatically activate security and privacy settings that make the digital environment safer for this age group, limiting interactions with strangers and restricting harmful content. Aware of the need for more supervision and security, Meta adopted this proactive approach after pressure from online safety experts, educators, and organizations that advocate for children’s rights online.

One of the most significant measures is that, starting with this update, teen profiles will automatically be set to private, meaning their posts will no longer be visible to everyone on the platform. Only followers the teen has approved will be able to see their posts and interact with them, and even among those followers, the teen will control who can comment or react. If an unknown profile wants to interact with their posts, the teen will need to approve that contact manually. The measure aims to reduce exposure to unwanted interactions and protect teens from potential online predators.
This change is part of a broader set of new rules, including new control features that can be adjusted, but only with the consent of parents or guardians. Meta explained that for users under 16, parents will have full control over whether these settings can be altered. However, for teenagers aged 16 and older, the control will be slightly more flexible, though still focused on maintaining safety and a comfortable browsing experience.
Limitations on Messaging and Monitoring Sensitive Content
Another important change is the restriction on direct messages (DMs) that teens can receive. With the new mode active, young users will only be able to receive messages from people they already follow, reducing the chance of contact with strangers or with people who may have inappropriate intentions. Additionally, Instagram will intensify the use of the “Hidden Words” feature, which filters and blocks offensive or harmful terms, such as language that incites violence or hate speech, providing an extra layer of protection.
In addition to message restrictions, Instagram has also made significant changes to content recommendations. Posts considered sensitive, such as those related to violence, psychological abuse, eating disorders, or even the promotion of invasive aesthetic procedures, will have their visibility reduced in the platform’s suggestions. The goal is to reduce the exposure of teenagers to content that could negatively affect their mental health or encourage harmful behavioral patterns.
In the feed and “Explore” sections, Instagram will now let teens choose more precisely which topics interest them, such as fashion, music, and sports, so that those topics appear more often in their recommendations. The aim is a more personalized experience, without opening space for content that could harm young users’ development.
Time Management
One of the most relevant additions is the time management feature. Instagram will now encourage teenagers to limit the time they spend on the app, with notifications appearing after an hour of continuous browsing. This measure aims to help young users set usage limits and balance the time spent on social media with other important activities, such as studying, resting, and offline social interaction. With this feature, Meta aligns itself with other digital initiatives that aim to combat the harmful effects of excessive social media use, such as addiction and the impact on mental health.
Expanded Parental Control
In addition to the automatic restrictions, Meta has also introduced new tools for parents and guardians, giving them greater control over their teen’s digital experience. Parents will now have access to more detailed settings, including the ability to set a daily time limit for Instagram usage. This means parents can set a maximum amount of time their children can spend on the platform, helping to control excessive use of the social network.
Moreover, the new parental control tool allows guardians to set specific times when access to the app will be restricted, such as at night or during school hours. This functionality aims to ensure that teenagers are not distracted by Instagram at critical times, promoting healthier and more balanced habits. Another significant addition is that parents will now be able to see whom their children have interacted and exchanged direct messages with over the past seven days, offering greater visibility into their online interactions and behavior on the platform.
One critical aspect of this update is the implementation of age verification measures, designed to prevent minors from bypassing the restrictions. Meta has developed new technologies that will allow for more effective age verification, making it harder for teens to create fake profiles to access age-inappropriate content. This may include additional verification of birthdate information during the registration process, preventing minors from using adult birthdates to bypass the system.
Meta emphasized that, while the platform relies on information provided by users themselves, it will now be more difficult for teens to lie about their age to create an adult account. The change aims to ensure that teens follow the safety guidelines and to help the platform identify those who should not have access to content or interactions intended for adults only.
These changes represent an important step for Meta in enhancing the protection of young users on its platforms. The impact of the new restrictions will be felt over time as parents become accustomed to using the new control tools, and teens adapt to these new rules. With growing concerns about the effects of social media on teenagers’ mental health, these actions may help foster a healthier and safer environment on the platform.
The future of social media and how tech companies handle the safety of young users remains a central topic of debate. However, the implementation of these changes strengthens Meta’s commitment to creating a safer and more responsible digital space for teens, helping parents and guardians supervise their children’s digital experiences more effectively and practically. The hope is that, over time, Instagram could serve as a model for how social media can be used responsibly, in a balanced and protected way.