Meta Platforms announced on Thursday that it has resolved an error that caused violent and graphic videos to appear in Instagram users’ Reels feeds worldwide.
The company did not disclose the reason for the glitch or the number of affected users.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” a Meta spokesperson said.
Users had reported seeing inappropriate content despite enabling Instagram’s “sensitive content control” setting, which is designed to filter such material.
Meta’s moderation policies have been under scrutiny after it discontinued its U.S. fact-checking program on Facebook, Instagram, and Threads last month. The company prohibits violent content but allows exceptions for videos addressing human rights abuses and conflicts.
Meta has increasingly relied on automated moderation tools, a trend expected to accelerate following its decision to move away from fact-checking in the U.S. The company has previously faced criticism for its content moderation practices, including the spread of violent material during the Myanmar genocide, promotion of harmful content to teens, and misinformation during the COVID-19 pandemic.