
Meta, Snap, TikTok join forces to combat suicide and self-harm content

Meta, Snapchat, and TikTok have teamed up under the 'Thrive' initiative to detect and remove suicide and self-harm-related content across platforms, overseen by the Mental Health Coalition.

Social Samosa

Meta, Snapchat, and TikTok have launched a new initiative called ‘Thrive’, aimed at detecting and removing suicide and self-harm-related content. The initiative is overseen by the ‘Mental Health Coalition’ and focuses on reducing exposure to harmful content for ‘at-risk users’.

The three platforms will share data about flagged content to enable faster, more effective cross-platform action. The data will be shared in the form of hashes (numerical representations of the content), allowing faster detection and removal across all platforms. No personally identifiable information or account details will be shared; only the content itself will be flagged.
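The hash-sharing approach described above can be illustrated with a minimal sketch. Note this is a simplified illustration, not Thrive's actual system: the function and database names are hypothetical, and real content-matching systems typically use perceptual hashes (which tolerate re-encoding and cropping) rather than the exact cryptographic hash used here for simplicity.

```python
import hashlib

# Hypothetical shared database of hashes flagged by partner platforms.
# In practice this would be a jointly maintained service, not a local set.
shared_flagged_hashes: set[str] = set()

def content_hash(data: bytes) -> str:
    """Return a hex digest identifying the content without revealing it."""
    return hashlib.sha256(data).hexdigest()

def flag_content(data: bytes) -> str:
    """Platform A detects violating content and shares only its hash."""
    digest = content_hash(data)
    shared_flagged_hashes.add(digest)
    return digest

def is_flagged(data: bytes) -> bool:
    """Platform B checks an upload against the shared hash database."""
    return content_hash(data) in shared_flagged_hashes

# One platform flags an item; another can then detect the identical content.
flag_content(b"example harmful content")
print(is_flagged(b"example harmful content"))  # True
print(is_flagged(b"unrelated content"))        # False
```

Because only digests cross platform boundaries, no user data or raw content is exchanged, which matches the article's point that the shared signal identifies the content rather than the account that posted it.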

The project focuses on ‘graphic imagery’ and material that may ‘encourage’ suicide or self-harm, ensuring stricter and quicker enforcement across platforms. While all three platforms allow users to discuss mental health concerns, the Thrive initiative focuses on removing harmful content that violates platform policies.

By sharing hashes, the platforms can improve detection processes for harmful content, contributing to stronger enforcement databases and policies within each app. This data-sharing strategy aims to remove harmful content more quickly and efficiently.

Thrive builds on the cross-platform collaboration that major social networks have previously used to counteract influence operations, which aim to deceive users. Such collaboration greatly enhances response efforts, ensuring coordinated actions against harmful content.

Increased social media usage has been linked to higher rates of youth depression and self-harm.

Meta's Antigone Davis wrote in an official blog, "Between April and June this year, we took action on over 12 million pieces of suicide and self-harm content on Facebook and Instagram. While we allow people to discuss their experiences with suicide and self-harm – as long as it’s not graphic or promotional – this year we’ve taken important steps to make this content harder to find in Search and to hide it completely from teens, even if it’s shared by someone they follow."

The initiative could potentially serve as a model for broader collaboration among tech companies to improve online safety. By focusing on faster detection and removal of harmful content, Thrive aims to create a safer online environment for vulnerable users.
