Meta has announced several measures and tools to protect teens on Facebook and Instagram from online harm.
The company, in a statement, said it aims to "create safe, age-appropriate experiences for teens on Facebook and Instagram."
Meta noted that last year it had also introduced measures such as restricting adults from messaging teens they are not connected to and preventing adults from seeing teens in their 'People You May Know' recommendations.
"In addition to our existing measures, we are now testing ways to protect teens from messaging suspicious adults they are not connected to," the company said.
Meta cited an example of a suspicious account as one "that belongs to an adult that may have recently been blocked or reported by a young person."
"As an extra layer of protection, we are also testing removing the message button on teens' Instagram accounts altogether when they are viewed by suspicious adults," Meta said.
Meta said it developed a number of tools to help teens on these social media platforms let the company know if something makes them uncomfortable while using its apps. Notifications will be introduced to encourage teens to use these tools.
Teens are prompted to report accounts after blocking someone, and Meta will send them information on how to navigate inappropriate messages from adults.
From now on, everyone under the age of 16 (or under 18 in certain countries) will be defaulted into more private settings when they join Facebook. The company said it will encourage teens to choose more private settings for 1) who can see their friends list, 2) who can see the people, pages and lists they follow, 3) who can see posts they are tagged in, 4) reviewing posts they are tagged in before the posts appear on their profile, and 5) who is allowed to comment on their public posts.
"This move comes on the heels of us rolling out similar privacy defaults for teens on Instagram and aligns with our safety-by-design and ‘Best Interests of the Child’ framework," the company said.
The company is also working on steps to stop the spread of teens' intimate images online. "The non-consensual sharing of intimate images can be extremely traumatic and we want to do all we can to discourage teens from sharing these images on our apps in the first place," it said.
Meta said it is working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried that intimate images they created might be shared on public online platforms without their consent.