Amid sweeping shifts in Meta’s policies, the company said on Thursday that it had fixed an error that flooded some users’ Instagram Reels feeds with violent and graphic content.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” a Meta spokesperson told CNBC.
The aforementioned content, according to one Reddit user, was a stream of Reels videos “full of street fights, school shootings, murder, and gore accidents.” Another user on Reddit described a feed that switched from “planes, watches, miniature painting, and cats” to “body horror and videos with descriptions in Russian.”
In a different Reddit post about Meta’s apology, the comments section is flooded with posts blaming AI, Meta’s recent layoffs, and misguided policy changes. One user commented, “Okay Meta but I saw a guy get executed.” Another user said they were done with Instagram altogether after the platform’s error. “I’m still not going back after their ‘mistake’ decided to show me child p*rn as soon as I opened the app. Never again.”
While Meta allows some graphic content on its site, its guidelines say it protects users from prohibited content like “videos depicting dismemberment, visible innards or charred bodies” and “imagery depicting the suffering of humans and animals.”
This comes just a few weeks after Meta CEO Mark Zuckerberg eliminated fact-checkers in favor of community notes, lifted prohibitions on certain forms of hate speech, scrapped DEI initiatives, removed trans-inclusive features from its apps, and reinstated political content recommendations. As CNBC pointed out, Meta cut 21,000 employees in 2022 and 2023, many of whom were part of its civic integrity and trust and safety teams.
Read more: https://mashable.com/article/meta-instagram-reels-violence-porn-error