Meta Platforms said yesterday it would hide more content from teens on Instagram and Facebook, after regulators around the globe pressed the social media giant to protect children from harmful content on its apps.
All teens will now be placed into the most restrictive content control settings on the apps, and additional search terms will be limited on Instagram, Meta said in a blog post, according to a Reuters report.
The move will make it more difficult for teens to come across sensitive content such as suicide, self-harm and eating disorders when they use features like Search and Explore on Instagram, according to Meta.
The company said the measures, expected to roll out over the coming weeks, would help deliver a more “age-appropriate” experience.
Meta is under pressure both in the United States and Europe over allegations that its apps are addictive and have helped fuel a youth mental health crisis.
In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.
Children have long been an appealing demographic for businesses, which hope to attract them as consumers at ages when they may be more impressionable and solidify brand loyalty.
For Meta, which has been in fierce competition with TikTok for young users in recent years, teens may help secure more advertisers, who hope children will keep buying their products as they grow up.