Social media companies agree to testing of effects on teen users’ mental health



Summary

New teen safety testing

Three social media companies will soon be tested for how well they protect teens and their mental health.

Safe Online Standards

The Mental Health Coalition announced a new Safe Online Standards initiative on Monday, saying it will grade social media platforms on whether they mandate breaks, offer options to disable endless scrolling and more.

Ongoing lawsuits

The testing comes as social media platforms face growing scrutiny and lawsuits over transparency and harm to children and teens.


Full story

Three social media companies have agreed to be tested for how well they protect the mental health of teenage users. The companies — TikTok, Snap and Meta, which owns Facebook and Instagram — have all agreed to participate in new testing that the Mental Health Coalition announced Monday. 

The testing, part of a new Safe Online Standards initiative, will grade social media platforms on criteria including whether they mandate breaks and offer options to disable endless scrolling.


Platforms that rate highly will receive a blue shield badge, while those that rate poorly will be flagged as unable to block harmful content, The Washington Post reports.

“Standards and ratings are commonplace today, except in the online technology space,” said Dr. Dan Reidenberg, Founder and Director of Safe Online Standards at the Mental Health Coalition, a nonprofit founded by the designer Kenneth Cole. “The public and advertisers want to know what is a safer place for youth and young adults in their online activities, and now we can help them with that through the S.O.S. program.”

While the testing is not a replacement for protective legislation, Reidenberg told The Post that the initiative will be a helpful way for parents and teens to decide how to engage with certain apps. 

What the platforms are saying

Meta, Snap and TikTok all shared statements on their participation, expressing support for the Mental Health Coalition’s mission. 

“We are humbled to learn from the Safe Online Standards and the expertise of Dr. Reidenberg and the Mental Health Coalition as we continue our daily efforts to ensure teens have a safe experience online,” Suzy Loftus, Head of Trust & Safety at TikTok, said.

“We are committed to doing our part to carefully and responsibly address suicide and self-harm content on our platform and across the ecosystem, and to provide tools, resources, and materials to those who may be struggling,” Snap said in a statement.

Ongoing lawsuits

The testing comes as social media platforms face growing scrutiny and lawsuits over transparency and harm to children and teens. Opening arguments were heard Monday in a trial in Los Angeles against major social media platforms, with plaintiffs alleging the platforms were designed specifically to be addictive. 

Snap and TikTok were named in the lawsuit but settled out of court for undisclosed amounts. However, Meta and Google will defend their platforms before a jury, and their chief executives — Mark Zuckerberg and Neal Mohan, respectively — are expected to testify.

Congress is weighing bills designed to protect children online, something that’s become increasingly common across the globe. In Australia, the government implemented a social media ban for anyone under 16.

Similarly, French lawmakers approved a bill banning social media for children under 15. 


SAN provides
Unbiased. Straight Facts.

Don’t just take our word for it.


Certified balanced reporting

According to media bias experts at AllSides

AllSides Certified Balanced May 2025

Transparent and credible

Awarded a perfect reliability rating from NewsGuard

100/100

Welcome back to trustworthy journalism.

Find out more

Why this story matters

Three major social media companies have agreed to independent testing of their teen safety features, marking a voluntary industry step toward transparency as platforms face mounting legal pressure and global regulatory action over youth mental health concerns.

Mental health standards

The Safe Online Standards initiative introduces the first formal rating system for social media platforms' youth protection features, potentially influencing how parents and teens choose which apps to use.

Platform accountability

Social media companies face simultaneous legal challenges and regulatory scrutiny worldwide, with trials alleging addictive design and governments implementing age restrictions to protect children online.

Voluntary compliance

TikTok, Snap and Meta's participation represents a self-regulatory approach to addressing youth safety concerns, though two companies already settled related lawsuits for undisclosed amounts.

