
Instagram Tightens Teen Safety Rules: Livestreams and Nude Filters Now Require Parental OK

by Andrew Rogers

Instagram is changing how teenagers use the app. Starting soon, teens under 16 won’t be able to go live or turn off safety filters without their parents’ approval. Meta, which owns Instagram, made the announcement on Tuesday. The new rules are part of Meta’s wider push to protect young users online. These changes will first roll out in the U.S., U.K., Canada, and Australia.

New Limits for Young Instagram Users

Meta said teens will need parental permission to use some features. The first is Instagram Live. Users under 16 will now need a parent’s approval before they can stream live videos.

The second feature involves nudity protection in messages. Instagram uses a filter that blurs images that may show nudity. Under the new rules, teens must get permission before turning that filter off.

Both changes are part of Meta’s Teen Accounts system. This system gives parents more tools to manage how their kids use Instagram.

Teen Accounts Expand to Facebook and Messenger

The Teen Accounts system launched on Instagram in September 2024. Now, Meta says it will expand the system to Facebook and Messenger too.

So far, Meta reports that 54 million teen users worldwide have already been moved into Teen Accounts.

These updates are rolling out in four countries first — the U.S., U.K., Canada, and Australia — with other regions expected to follow.

Why Meta Is Making These Changes

These updates come as more people worry about how social media affects kids. Experts, parents, and lawmakers say apps like Instagram can harm mental health and expose kids to risky content.

Meta’s new features aim to reduce those risks. They also help parents stay informed and involved in their children’s online lives.

Other Safety Features for Teens

Instagram already has several tools to help protect young users. These include:

  • Private by default: Teen accounts are private when they’re created.

  • No stranger messages: Strangers can’t send private messages to teens.

  • Content limits: Teens don’t see as much sensitive or adult content.

  • Break reminders: Teens get alerts suggesting screen breaks.

  • Quiet hours: Notifications are paused during sleep times.

These tools are all part of Meta’s effort to make Instagram safer for its youngest users.

Laws Push Platforms to Protect Kids

Meta isn’t making these changes only because it wants to. Governments are also stepping in.

In the U.K., a new law called the Online Safety Act is taking effect. It requires big tech platforms to protect users — especially children — from illegal or harmful content.

In the U.S., lawmakers and child safety groups have called for similar rules. As a result, many tech companies are adding more controls for parents and limits for younger users.

Encryption Sparks Debate

At the same time, Meta is facing criticism over how it handles privacy. The company has been rolling out end-to-end encryption across Messenger, and WhatsApp messages are already encrypted by default.

This feature keeps messages private — even from Meta itself. But critics worry that it also makes it harder to find and stop child abuse.

The NSPCC, a child safety group in the U.K., says Meta is ignoring the risks. The group accused the company of “turning a blind eye to crimes against children.”

Meta defends encryption. It says privacy is key to user safety and that it’s adding other protections, like scanning public posts and using artificial intelligence.

Parents and Experts Welcome More Controls

Many parents and experts have welcomed Meta's new rules. Giving parents more control over livestreams and message filters reduces the chance that kids end up in risky situations or see harmful content.

Dr. Amanda Gummer, a child psychologist, told the BBC that “simple tools like this help parents feel empowered and give children a safer experience online.”

Meta says it will keep working with child safety groups, lawmakers, and researchers to make its platforms safer for everyone.

What’s Next?

Meta hasn’t said exactly when the new rules will reach every user, but teens in the U.S., U.K., Canada, and Australia will see them first. Other countries will follow later.

As pressure grows, more tech platforms are likely to add limits for teens and tools for parents.
