Meta teams up with Snap and TikTok to address self-harm content


Meta is teaming up with Snapchat and TikTok on a new initiative to prevent content featuring suicide or self-harm from spreading across social media platforms, the company said Thursday in a blog post.

The program, named Thrive, was created in conjunction with The Mental Health Coalition, a group of mental health organizations working to destigmatize these issues.

Through the Mental Health Coalition’s Thrive program, Meta, Snap and TikTok will be able to share signals about violating suicide or self-harm content with each other, so that they can investigate and take action if the same or similar content has been posted on their respective apps. A spokesperson for Meta described Thrive as a database that all participating companies will have access to.

To ensure that data shared through Thrive is transmitted securely, Meta said it is using technology it created and already uses in conjunction with the Tech Coalition's Lantern program, an initiative that aims to make technology safe for children and counts Amazon, Apple, Google, Discord, OpenAI and others among its members.

The spokesperson for Meta said that when content featuring suicide or self-harm is discovered, it will be removed from the platform and then flagged in the Thrive database so other social media companies can act.

Meta’s blog post made clear that the program is intended to target content — not users.

“We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals,” Antigone Davis, Meta’s global head of safety, wrote in the post.

When content featuring suicide or self-harm is identified on a Meta platform, a numerical fingerprint known as a “hash” will be generated from it, according to the spokesperson. The other social media companies can then check that hash against content on their own platforms and remove any matches.
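This flow resembles the hash-sharing systems the industry already uses for other categories of violating content. Below is a minimal Python sketch of how hash-based matching could work, assuming a plain SHA-256 digest over the media bytes; Thrive's actual hashing scheme and database interface have not been made public, so `thrive_db` and the function names here are illustrative only.

```python
import hashlib

# Illustrative stand-in for the shared Thrive database; in practice this
# would be a secure, access-controlled service shared by the companies.
thrive_db: set[str] = set()

def hash_content(media_bytes: bytes) -> str:
    """Derive a fingerprint ("hash") from raw media bytes.

    A SHA-256 digest is assumed here for simplicity; it matches only
    byte-identical files. Systems like this often use perceptual hashes
    instead, so that re-encoded or lightly edited copies still match.
    """
    return hashlib.sha256(media_bytes).hexdigest()

def flag_violating_content(media_bytes: bytes) -> None:
    """Run by the platform that found and removed the content."""
    thrive_db.add(hash_content(media_bytes))

def matches_known_violation(media_bytes: bytes) -> bool:
    """Run by another platform to check media against shared signals."""
    return hash_content(media_bytes) in thrive_db

# Example: one platform flags removed content; another checks a copy.
flag_violating_content(b"<removed media bytes>")
assert matches_known_violation(b"<removed media bytes>")
```

Note that only the hash enters the shared database, which is consistent with Meta's statement that the signals represent content only and carry no identifying information about accounts or individuals.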

Social media platforms, including Meta, TikTok and Snapchat, have long been criticized for not doing more to moderate the content teens consume, including videos and images of self-harm. All three platforms have been sued by parents and communities who said content on the platforms led to suicides. Additionally, in 2021, leaked internal research, known as the “Facebook Papers,” revealed Meta was aware that Instagram, which it owns, could have harmful effects on teen girls.

Daniel Weiss, chief advocacy officer of the nonprofit media and technology watchdog Common Sense Media, said the Thrive program, while well-intentioned, appears to be similar to safety measures companies have taken when under pressure “from lawmakers and advocates to make their products safer for kids.”

“We are glad to see these companies working together to remove the types of content associated with self harm and suicide,” Weiss said. “Without proper regulatory guardrails, the jury is out on whether this will have a significant impact on the harms that kids and teens are facing online.”

A study available through the National Library of Medicine shows that a major uptick in social media use among minors has led to an increase in depression and suicidal ideation in that group. The study also suggests that young people who engage in self-harm are more active on social media.

Earlier this year, the company announced that it would begin removing and limiting sensitive “age-inappropriate” content from teenagers’ feeds on its apps and that it planned to hide search results and terms related to suicide, self-harm and eating disorders for all users.

In its blog post Thursday, Meta said that it removed 12 million pieces of content featuring suicide and self-harm from Facebook and Instagram from April to June. While Meta said it still wants to facilitate important conversations around self-harm and suicide, it hopes Thrive will help to keep graphic content off participating platforms.

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.
