Social media companies team up to address self-harm content

Meta, Snap, and TikTok have come together to start a new program to stop content featuring suicide or self-harm from spreading across social media platforms. The program, called Thrive, was created along with the Mental Health Coalition, a group of organizations that work to destigmatize those issues.

Through Thrive, Meta, Snap, and TikTok will be able to share what the companies call “signals” with one another about content concerning suicide or self-harm. If the same content appears on multiple apps, each company can then investigate and take similar enforcement actions.

Meta, which owns Facebook and Instagram, said Thrive will serve as a database all social media companies can access.

The company says that when content featuring suicide or self-harm is discovered, it will be removed and flagged in the Thrive database so other social media companies can act. Meta also made it clear the program will target content, not users.

Social media has increasingly been linked to a spike in depression and suicidal behavior in kids and teens, and companies like Meta, Snap, and TikTok have been widely criticized for not doing more to moderate content on their platforms. All three companies have been sued by parents and communities who say the social media platforms led to suicide deaths.
