
Meta, facing heat from all sides, announces safety measures for minors

Jan 12



A bipartisan group of state attorneys general filed a joint lawsuit against Meta, alleging that Facebook’s parent company knowingly built addictive features into its apps that harm children’s health. Lawmakers who often disagree have united to demand answers from Meta’s leadership about the company’s impact on minors.

“Look, we have a tremendous amount of evidence and information that’s been developed that shows that Meta knowingly has designed its products in a way to maximize its ad revenue by addicting young teenagers onto its products,” District of Columbia Attorney General Brian Schwalb said.

“They hid from this committee and from all of Congress evidence of the harms that they knew was credible,” Connecticut Sen. Richard Blumenthal said.

“They are deliberately misleading parents about what’s on their platform,” Missouri Sen. Josh Hawley said. “They are deliberately misleading parents about the safety of their children online.”

Now Meta has announced plans to expand safety measures for children and teens on its social media platforms. The goal is to make it harder for young users to encounter sensitive content.

The company will apply restrictive settings to the accounts of teens and children, preventing those users from searching for “sensitive topics” and prompting teens to update their privacy settings.

In a blog post, Meta said Facebook and Instagram will hide search results for content related to suicide, self-harm, eating disorders, and nudity. Teens can still make posts on these subjects but won’t see them in their feed or stories, even if shared by someone they follow.

Meta aims to automatically place all teens under its most restrictive content control setting. The changes follow a whistleblower’s testimony to a Senate panel in November, in which he said Meta knew harmful content was present on its platforms and that company executives were doing nothing about it.

“As a parent, I took the work personally,” Arturo Bejar, a former Meta employee, said. “By the time I left in 2015, I thought the work was going in the right direction. A few years later, my 14-year-old daughter joined Instagram. She and her friends began having awful experiences, including repeated unwanted sexual advances and harassment. She reported the incidents to the company, and it did nothing.”

Meta says the update should be complete within a couple of weeks, just in time for CEO Mark Zuckerberg’s child safety testimony on Capitol Hill.


[BRIAN SCHWALB / D.C. ATTORNEY GENERAL]
“look, we have a tremendous amount of evidence and information that’s been developed that shows that meta knowingly has designed its products in a way to maximize its ad revenue by addicting young teenagers onto its products.”

[SEN. RICHARD BLUMENTHAL / D-CT]

“they hid from this committee and from all of congress evidence of the harms that they knew was credible.”

[SEN. JOSH HAWLEY / R-MO]

“they are deliberately misleading parents about what’s on their platform. they are deliberately misleading parents about the safety of their children online.”

[LAUREN TAYLOR]

META HAS BEEN ON THE RECEIVING END OF A LOT OF CRITICISM FOR NOT DOING ENOUGH TO PROTECT KIDS WHO MIGHT BE USING ITS PRODUCTS.

A BIPARTISAN GROUP OF STATE ATTORNEYS GENERAL FILED A JOINT LAWSUIT LAST FALL SAYING FACEBOOK’S PARENT COMPANY KNOWINGLY USED ADDICTIVE FEATURES IN ITS APPS THAT HAVE NEGATIVE IMPACTS ON KIDS’ HEALTH.

LAWMAKERS WHO OFTEN CAN’T AGREE ON MUCH HAVE COME TOGETHER TO DEMAND ANSWERS FROM META’S LEADERSHIP ON THE COMPANY’S IMPACT ON MINORS.

NOW META IS SAYING IT WILL EXPAND SAFETY MEASURES FOR CHILDREN AND TEENS ON ITS SOCIAL MEDIA PLATFORMS.

THE GOAL: TO MAKE IT HARDER FOR YOUNG USERS TO COME ACROSS SENSITIVE CONTENT.

THE ANNOUNCEMENT COMING THREE WEEKS BEFORE META CEO MARK ZUCKERBERG TESTIFIES IN THE SENATE ABOUT CHILD SAFETY.

THE COMPANY SAYS IT WILL PLACE RESTRICTIVE SETTINGS ON TEEN AND CHILD ACCOUNTS… BY PREVENTING USERS FROM SEARCHING “SENSITIVE TOPICS” AND PROMPTING TEENS TO UPDATE THEIR PRIVACY SETTINGS.

AND FACEBOOK AND INSTAGRAM WILL HIDE SEARCH RESULTS FOR CONTENT RELATED TO SUICIDE, SELF-HARM, EATING DISORDERS AND NUDITY.

TEENS WILL STILL BE ALLOWED TO MAKE POSTS ON THESE SUBJECTS – BUT WON’T SEE THEM IN THEIR FEED OR ON STORIES – “EVEN IF IT’S SHARED BY SOMEONE THEY FOLLOW.”

META SAYS IT AIMS TO AUTOMATICALLY PLACE ALL TEENS UNDER THE MOST RESTRICTIVE CONTENT CONTROL SETTING.

THE CHANGES COME AFTER A META WHISTLEBLOWER TOLD A SENATE PANEL IN NOVEMBER THAT META KNEW CONTENT ON ITS PLATFORMS WAS HARMFUL TO YOUNG USERS – AND THAT COMPANY EXECS WERE DOING NOTHING ABOUT IT.

[ARTURO BEJAR / FORMER META ENGINEER]

“as a parent, i took the work personally. and i worked hard to help create a safer environment. by the time i left in 2015, i thought the work was going in the right direction. a few years later, my 14-year-old daughter joined instagram. she and her friends began having awful experiences including repeated unwanted sexual advances, harassment. she reported the incidents to the company and it did nothing.”

[LAUREN TAYLOR]
META SAYS THE NEW UPDATE SHOULD BE COMPLETE IN A COUPLE OF WEEKS… JUST IN TIME FOR CEO MARK ZUCKERBERG’S TESTIMONY ON CAPITOL HILL.