Indy100 Staff
Feb 27, 2023
Middle-school hustlers trying to juice Instagram’s algorithm are suspected to be behind a stomach-churning surge of gore videos on the social network.
Instagram launched Reels in response to the rise of TikTok, and for a while, accounts used nudity to draw viewers and engagement.
But that attempt to game the algorithm has taken an altogether darker turn, with videos of gore, animal cruelty, and other horrors filling people’s feeds.
A report from Taylor Lorenz at the Washington Post explored what’s driving the problem, and how Meta is struggling to tackle the issue.
According to the Post, a video of a pig being fed into a meat grinder got 223,000 views, while a clip of a woman about to be beheaded with a knife got tens of thousands of views.
People who rapidly build these meme pages can make big money by selling sponsored posts, often to agencies that promote OnlyFans models.
Meme accounts are super popular with young people, and 43 per cent of 13- to 17-year-olds follow one.
Figures in the meme community said those running the accounts are often teenagers themselves.
George Locke, 20, who runs meme accounts but has never posted gore, said: “I’d say over 70 percent of meme accounts are [run by kids] under the age of 18. Usually when you start a meme account, you’re in middle school, maybe a freshman in high school. That’s the main demographic for meme pages, those younger teens. It’s super easy to get into, especially with the culture right now where it’s the grind and clout culture. There’s YouTube tutorials on it.”
Sarah Roberts, an associate professor at the University of California, Los Angeles, said: “Of course, the meme accounts are culpable, but what’s fundamentally culpable is an ecosystem that provides such fertile ground for these metrics to have such intrinsic economic value.”
In a statement, a Meta spokesman said: “This content is not eligible to be recommended and we remove content that breaks our rules. This is an adversarial space so we’re always proactively monitoring and improving how we prevent bad actors from using new tactics to avoid detection and evade our enforcement.”