
Meta apologizes after Instagram users subjected to graphic content
By Craig Nigrelli (Anchor/Reporter), Brock Koller (Senior Producer), Jack Henry (Video Editor)
- Many Instagram users were shown unwanted violent or graphic content on their Reels page. Some users realized others had the same experience after comparing notes on other social media platforms.
- Meta apologized for the mistake.
- The company says it has 15,000 reviewers and advanced technology to help detect, prioritize and remove disturbing content.
Full Story
Instagram users recently received more than they signed up for when logging into the app. Meta, which operates Instagram, admitted that an error subjected users to violent and graphic content on their Reels page.
Media Landscape
See how news outlets across the political spectrum are covering this story.
Bias Summary
- Meta apologized on Feb. 27 after an error exposed Instagram users to graphic and violent content on their Reels page.
- Instagram users reported seeing inappropriate content despite having the Sensitive Content Control set to its highest level.
- Meta's policy aims to remove violent content to protect users, but this error led to disturbing imagery being shown.
- A Meta spokesperson stated, "We apologize for the mistake," as reported by CNBC.
- Meta admitted an "error" caused Instagram users to view violent and pornographic content on their Reels pages, including school shootings and murders.
- The company apologized for the issue and stated that it has been fixed, though details were not provided on the cause of the error.
- Many users reported seeing extreme content despite having activated Sensitive Content Control, violating Meta's policies.
- Meta will remove graphic content from the platform and has implemented measures to protect underage accounts.
- Meta apologized for an "error" that caused Instagram to recommend disturbing and violent videos to users’ Reels feeds, including minors.
- Even after claiming the problem was fixed, a Wall Street Journal reporter still saw disturbing videos late Wednesday night.
- According to a company spokesperson, Meta removed over 10 million pieces of violent content from Instagram from July to September of last year.
- An Instagram spokesman stated, "We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended."
How has Meta responded to the complaints?
Meta said it has fixed an error that caused some users to see explicit, unwanted content in their Instagram Reels feed. The social media company apologized for the mistake.

Instagram users took to other social media platforms to report an uptick in violent content and recommendations dubbed "not safe for work."
The explicit reels appeared even though some users had the Sensitive Content Control feature set to its highest level.
What is Meta’s policy about graphic content?
According to Meta’s own policy, it works to protect users from disturbing images and removes especially graphic content. This content includes videos that show dismemberment, innards and charred bodies, as well as sadistic remarks centered around images that depict human or animal suffering.
What did Instagram users specifically see?
CNBC reported that the images Instagram users unexpectedly received included dead people, graphic injuries and violent attacks.
Meta said it has a team of more than 15,000 reviewers and advanced technology to help detect, prioritize and remove disturbing content before users encounter it.
The latest error comes less than two months after Meta, which also operates Facebook, announced plans to change the way it enforces content rules in order to reduce mistakes that had led to censorship of some users and to better promote free expression.