Meta is tightening protections for teenage users on Instagram and Facebook to shield them from unwanted direct messages, the company announced on Thursday.
The move comes shortly after Meta, which also owns WhatsApp, pledged to hide more content from teens, following pressure from regulators pushing the world's largest social media company to protect children from harmful content on its apps.
Scrutiny of teen safety intensified after a former Meta employee testified before the U.S. Senate that the company was aware of the harassment and other harms teens faced on its platforms but failed to adequately address them.
Under the new policies, teens on Instagram will by default not receive direct messages from anyone they do not follow or are not connected to, and they will need parental approval to change certain settings in the app.
On Messenger, accounts belonging to users under 16, or under 18 in certain countries, will only be able to receive messages from Facebook friends or people in their phone contacts.
Meta also said that adults over the age of 19 will be unable to message teens who do not follow them.