Instagram said it will begin notifying parents when a teenager repeatedly searches for terms associated with suicide or self-harm within a short period.
The move is part of broader efforts by the platform to strengthen youth safety features and provide earlier intervention signals to families.
The announcement comes amid growing pressure on governments worldwide to adopt stricter protections for minors online, including proposals to mirror Australia’s policy banning social media use for children under 16.
The United Kingdom said in January it was weighing tighter online protections for minors following Australia’s December move, while Spain, Greece, and Slovenia have recently indicated they are also exploring limits on access.
Instagram, owned by Meta Platforms Inc., said on Thursday it will begin notifying parents enrolled in its optional supervision feature if their children attempt to view content related to suicide or self-harm.
“These alerts build on our existing work to help protect teens from potentially harmful content on Instagram,” Instagram said in a statement. “We have strict policies against content that promotes or glorifies suicide or self-harm.”