Comments

jampapi t1_ja4014h wrote

This substantially fucked up my vibe just the other day!

4

QuestionableAI t1_ja41z9w wrote

And does Instagram even fucking care?

Corporate Overlords because the fucking government cannot be fucking bothered to do anything but obstruct with all the fucking republicans in office.

One last time, fuck'em with a spoon.

16

ex_banker t1_ja427i9 wrote

Meanwhile I spend hours searching for this shit and never find anything remotely interesting

−26

gullydowny t1_ja42lan wrote

LOS ANGELES — Kristoffer Reinman, a 32-year-old music producer and investor, was scrolling through Instagram last fall when he began to encounter violent videos — videos of people being shot and mutilated, posted by accounts he said he doesn’t follow.

“It was gory stuff, torture videos, stuff you just don’t want to see,” Reinman said. “Violent videos, they just started showing up. I was like, what is this? It’s nothing that I follow myself.” Feeling disturbed and disgusted, he immediately logged onto chat app Discord to tell his friends what was happening.

His friends replied that it wasn’t just him. They too were receiving violent videos in their feed. Twitter users also began posting about the phenomenon. “Hey @instagram,” one Twitter user posted in September, “why was the first thing on my feed today a beheading video from an account i don’t even follow? Thx!” Mitchell, an Instagram user in his early 20s who asked to be referred to solely by his first name because of security concerns, said that “It started with a video of a car crash, or an animal getting hit by a train. I just scrolled past it. Then I started to see people get shot.”

Since Instagram launched Reels, the platform’s TikTok competitor, in 2020, it has taken aggressive steps to grow the feature. It rewarded accounts that posted Reels videos with increased views and began paying monthly bonuses to creators whose Reels content performed well on the app.

Instagram also announced last year it would be leaning harder into algorithmic recommendation of content. On Meta’s second-quarter earnings call, CEO Mark Zuckerberg noted that Reels videos accounted for 20 percent of the time people spent on Instagram, saying that Reels engagement was “growing quickly” and that the company saw a 30 percent increase in the amount of time people spent engaging with Reels.

But at least part of that engagement has come from the kinds of videos Reinman and other users have raised concerns about, a result that shows how Meta’s Instagram has failed to contain harmful content on its platform as it seeks to regain audience lost to TikTok.

Meta acknowledged the existence of the violent videos but a spokesperson said they were a small percentage of the platform’s total content. According to the company’s most recent community standards enforcement report, an estimated three out of every 10,000 content views contained graphic violence, an increase from the previous quarter.

The spokesperson said Meta was conducting a review of the content in question, adding that the platform removes millions of offensive videos and takes other steps to try to limit who can see them. “This content is not eligible to be recommended and we remove content that breaks our rules,” the spokesperson said in a statement. “This is an adversarial space so we’re always proactively monitoring and improving how we prevent bad actors from using new tactics to avoid detection and evade our enforcement.”

Meme pages are some of Instagram’s most popular destinations, amassing millions of followers by posting videos, photos and memes designed to make viewers laugh or feel a connection. They account for tens of millions of Instagram followers, and their audiences often skew very young — according to a survey from marketing firm YPulse, 43 percent of 13- to 17-year-olds follow a meme account, an age group whose safety online is one of the few things Democrats and Republicans in Congress agree on. To add to the concern, the majority of people running the accounts are young, often teenagers themselves, those in the meme community say.

While the majority of meme pages don’t engage in such tactics, a sprawling underbelly of accounts competing for views have begun posting increasingly violent content.

The videos are truly horrific. In one video, a bloody pig is fed into a meat grinder. It amassed over 223,000 views. Other Reels videos that amassed tens of thousands of views show a woman about to be beheaded with a knife, a man being strung up in a basement and tortured, a woman being sexually assaulted. Several videos show men getting run over by cars and trains, and dozens show people getting shot. Other Reels videos contain footage of animals being shot, beaten and dismembered.

“#WATCH: 16-year-old girl beaten and burned to death by vigilante mob,” the caption on one video reads, showing a bloody young woman being beaten and burned alive. The video was shared to an Instagram meme page with over 567,000 followers.

One day last week, four large meme pages, two with over 1 million followers, posted a video of a young child being shot in the head. The video amassed over 83,000 views in under three hours on just one of those pages (the analytics for the other three pages weren’t available). “Opened Insta up and boom first post wtf,” one user commented.

Large meme accounts post the graphic content to Reels in an effort to boost engagement, meme administrators and marketers said. They then monetize that engagement by selling sponsored posts, primarily to agencies that promote OnlyFans models. The higher a meme page’s engagement rate, the more it can charge for such posts. These efforts have escalated in recent months as marketers pour more money into meme pages in an effort to reach a young, highly engaged audience of teenagers, marketers said.

Sarah Roberts, an assistant professor at University of California, Los Angeles, specializing in social media and content moderation, said that while what the meme accounts are doing is unethical, ultimately Instagram has created this environment and must shoulder the blame for facilitating a toxic ecosystem.

“The buck has to stop with Instagram and Meta,” she said, referring to Instagram’s parent company. “Of course, the meme accounts are culpable, but what’s fundamentally culpable is an ecosystem that provides such fertile ground for these metrics to have such intrinsic economic value. … [W]ithout Instagram providing the framework, it wouldn’t enter into someone’s mind, ‘let’s put a rape video up because it boosts engagement.’ They’re willing to do anything to boost those numbers, and that should disturb everyone.”

Some meme pages create original content, but many primarily republish media from around the web. Meme pages like @thefatjewish and an account whose name is too profane to print were some of the most powerful early influencers on Instagram, building huge marketing businesses around their millions of followers.

In recent years, some successful meme pages have expanded to become media empires. IMGN Media, which operates several popular Instagram meme pages including @Daquan, which has over 16.3 million followers, raised $6 million in funding in 2018 to grow its business before being acquired by Warner Music Group in 2020 for just under $100 million. Doing Things Media, which owns a slate of viral meme pages, raised $21.5 million in venture capital funding earlier this year. None of these companies or the accounts they manage have posted violent videos of the nature discussed here.

39

gullydowny t1_ja42mp7 wrote

More children are seeking to leverage the internet early for financial and social gain, so many meme account administrators are young. George Locke, 20, a college student who began running meme accounts at age 13, the youngest age at which Instagram permits a user to have an account, said he has never posted gore, but has seen many other young people turn to those methods.

“I’d say over 70 percent of meme accounts are [run by kids] under the age of 18,” he said. “Usually when you start a meme account, you’re in middle school, maybe a freshman in high school. That’s the main demographic for meme pages, those younger teens. It’s super easy to get into, especially with the culture right now where it’s the grind and clout culture. There’s YouTube tutorials on it.”

Meta says it puts warning screens and age restrictions on disturbing content. “I don’t think there’s a world where all [meme pages and their followers] are 18-year-olds,” Locke said.

Jackson Weimer, 24, a meme creator in New York, said he began to notice more graphic content on meme pages last year, when Instagram began to push Reels content heavily in his Instagram feed. At first, meme pages were posting sexually explicit videos, he said. Then the videos became darker.

“Originally, these pages would use sexual content to grow,” he said, “but they soon transitioned to use gore content to grow their accounts even quicker. These gore Reels have very high engagement, there’s a lot of people commenting.”

Commenting on an Instagram video generates engagement. “People die on my page,” one user commented on a video posted by a meme page of a man and a woman simulating sex, hoping to draw viewers. Other comments below graphic videos promoted child porn groups on the messaging app Telegram.

In 2021, Weimer and 40 other meme creators reached out to the platform to complain about sexually explicit videos shared by meme pages, warning the platform that pages were posting increasingly violative content. “I am a little worried that some of your co-workers at Instagram aren’t fully grasping how huge and widespread of an issue this is,” Weimer said in an email to a representative from the company, which he shared with The Post.

Instagram declined to meet with the creators about their concerns. The content shared by many large pages has only become more graphic and violent. “If I opened Instagram right now, and scrolled for five seconds there’s a 50 per cent chance I’ll see a gore post from a meme account,” Weimer said. “It’s beheadings, children getting run over by cars. Videos of the most terrible things on the internet are being used by Instagram accounts to grow an audience and monetize that audience.”

A Meta spokesperson said that, since 2021, the company has rolled out a suite of controls and safety features for sensitive content, including demoting posts that contain nudity and sexual themes.

The rise in gore on Instagram appears to be organized. In Telegram chats viewed by The Post, the administrators for large meme accounts traded explicit material and coordinated with advertisers seeking to run ads on the pages posting graphic content. “Buying ads from nature/gore pages only,” read a post from one advertiser. “Buying gore & model ads!!” said another post by a user with the name BUYING ADS (#1 buyer), adding a moneybag emoji.

In one Telegram group with 7,300 members, viewed by The Post, the administrators of Instagram meme pages with millions of followers shared violent videos with each other. “Five Sinola [Sinaloa] cartel sicarios [hired killers] are beheaded on camera,” one user posted including the beheading video. “ … Follow the IG,” and included a link to his Instagram page.

Sam Betesh, an influencer marketing consultant, said that the primary way these sorts of meme accounts monetize is by selling sponsored posts to OnlyFans marketing agencies which act as middlemen between meme pages and OnlyFans models, who generate revenue by posting pornographic content behind a paywall to subscribers. An OnlyFans representative declined to comment but noted that these agencies are not directly affiliated with OnlyFans.

Meme accounts are fertile ground for this type of advertising because of their often young male audience. OnlyFans models’ advertising options are limited on the broader web because of the sexual nature of their services. The higher the meme page’s engagement rate is, the more the page can charge the OnlyFans agencies for ads.

“The only place you can put one dollar in and get three dollars out is Instagram meme accounts,” Betesh said. “These agencies are buying so many meme account promos they’re not doing due diligence on all the accounts.”

OnlyFans models whose images were promoted in advertisements on meme pages said they were unaware that ads with their image were being promoted alongside violent content. Nick Almonte, who runs an OnlyFans management company, said that he does not purchase ads from any accounts that post gore, but he has seen gore videos pop up in his Instagram feed.

“We’ve had [OnlyFans] girls come to us and say ‘Hey, these guys are doing these absurd things to advertise me, I don’t want to be involved with the type of people they’re associated with,’” Almonte said. “This happens on a weekly basis.”

Meme accounts are potentially raking in millions by posting the violence, said Liz Hagelthorn, a meme creator who formerly ran the largest meme network on Instagram, consisting of 127 pages and a collective 300 million followers. Hagelthorn said none of her pages ever posted violence. But young, often teenage, meme account administrators see gore as a way to cash in, she said.

“With gore, the more extreme the content is, is what the algorithm is optimizing for,” she said. “Overall what you see is when people hate the content or disagree with the content they’re spending 8 to 10 percent longer on the post and it’s performing 8 to 10 percent better.”

Some pages posting graphic violence are making over $2 million a year, she estimated. “The meme industry is an extension of the advertising and influencer industry,” she said, “and it is a very lucrative industry. If you have a million followers, you make at a base $3,000 to $5,000 per post. Bigger meme pages can make millions a year.”

“This is organized,” said Weimer. “It’s not two people posting gore videos, it’s hundreds of people in group chats coordinating posting and account growth.”

The administrators for several accounts posting gore appear to be young men, which Hagelthorn said is expected because most meme administrators are in their teens or early 20s. “These meme page audiences are 13- to 17-year-olds, so the people who run the page are young,” Hagelthorn said.

Roberts, the assistant professor at UCLA, said that she worries about the effect this content and ecosystem is having on young people’s notions of morality.

“It seems like we’re raising a generation of adolescent grifters who will grow up having a totally skewed relationship of how to be ethical and make a living at the same time,” she said. “This is not normal and it’s not okay for young people to be exposed to it, much less be profiting from it.”

19

whatweshouldcallyou t1_ja45rl0 wrote

>In recent years, some successful meme pages have expanded to become media empires. IMGN Media, which operates several popular Instagram meme pages including @Daquan, which has over 16.3 million followers, raised $6 million in funding in 2018 to grow its business before being acquired by Warner Music Group in 2020 for just under $100 million. Doing Things Media, which owns a slate of viral meme pages, raised $21.5 million in venture capital funding earlier this year. None of these companies or the accounts they manage have posted violent videos of the nature discussed here.

So basically the article tries to conflate objectionable content, which is impossible to completely prevent from being uploaded, with general meme accounts, but at least it thankfully ends with an acknowledgement that B has nothing to do with A.

Combine with the pointless fulminating of a useless academic who has no clue about the technology of which she writes and bingo, you have an article.

−17

Hilppari t1_ja46wll wrote

And this is why automated feeds are stupid. They should just show stuff you manually follow, like Twitter used to.

342

CandidEstablishment0 t1_ja49c93 wrote

I can remember this happening years ago on Fb. Every day for like a week, it was just awful images of dead African children, dying children, and gruesome animal cruelty, such as a dog owner skinning the entire dog alive and putting it in a salt field, and another who cut both his dog's ears off and laid them in front of it while taking photos as it looked down at its own ears. Truly horrific stuff that messed me up. And that was nearly 10 years ago.

9

throwaway92715 t1_ja4a2r1 wrote

Sorry to everyone who's had to see horrible images... but I have to say it's kinda hilarious to watch Meta's flagship product malfunction in such a glorious way.

6

Columbus43219 t1_ja4aeji wrote

I just had to mute a reddit sub of exactly this. Video just started playing in my feed.

63

Personal_Problems_99 t1_ja4hujh wrote

Instagram users could probably think better with a little reality shoved in their face.

−13

krum t1_ja4jqv3 wrote

Yup one day a while back I was scrolling through my usual feed of sexy Asian women and boom beheading video. Horrific stuff.

1

Martholomeow t1_ja4l2ok wrote

hey i just thought of a great way to solve all the problems with instagram.

Don’t use instagram!

5

DefreShalloodner t1_ja4nb1o wrote

Non-cock-&-ball torture does not belong in the public sphere. It's simply inappropriate.

3

Ace_Ranger t1_ja4om0g wrote

It's not just instagram. I had to mute a few subreddits because of the content. My adult children have reported terrible shit on Tiktok too.

4

SkylorBeck t1_ja4or91 wrote

If only that actually worked. Youtube has this and it does basically nothing. Recently they even made it so you can't dismiss music videos from your feed. Yesterday, out of the first six videos on my feed, 4 were music playlists. It's nice that the feature is there but it really only removes that single post.

4

Quietech t1_ja4qisj wrote

With one you're choosing and know what you're getting; the other is a bad surprise. It's like parents sending their kids to church to learn about the Bible, and how parental incest is ok because of Lot's daughters having sex with him. Context matters.

0

trc2017 t1_ja4t7xk wrote

I just deleted my Instagram a few days ago because it wouldn’t stop suggesting car crash videos to me, like brutal crashes that people probably died in. I kept hitting “stop showing me this kind of content” but it just kept showing me that. So I deleted it.

2

stacecom t1_ja4vlu6 wrote

Back in the day, I remember when there weren't subreddits, just one feed. Then the subs came, and the concept of defaults and home feeds.

In this day and age, I can't imagine following /r/all -- I imagine it would be horrifying. I only see subs I subscribe to.

7

Ok-Gate6899 t1_ja4w1ih wrote

lol they got shocked by reality like babies, algorithm is broken it burst their bubble oh no

−16

Hunterdivision t1_ja4ydrd wrote

They don’t, and if they did they would remove such content immediately; as big companies they definitely have the resources to do it. It’s why IG, FB etc. have content moderators who work to not approve these types of posts, but their conditions aren’t very good, judging from the documentaries that have been made, and there are too few of them to remove this kind of content (and other more vile content).

Instagram unfortunately has only one goal: making money by keeping users engaged and spending time on the platform. The fact that these videos were shown in users’ algorithmic feeds, even though they hadn’t previously watched such content, simply because they were popular, just goes to show where Instagram’s interest lies, despite the platform also having underage/teenage users exposed to such content.

2

RamsesA t1_ja4ypor wrote

I had this happen to me multiple times and have learned how to successfully fix it. Clicking "do not show me this" doesn't work, because you've already engaged with the content at that point.

The correct way to fix this is to only engage with positive content (e.g. pictures of nature, expensive cars, cute animals, etc). Engagement includes not just clicking, but also gazing. If you stop scrolling to look at it, Instagram counts that as engaging, and you'll see more of it.

This makes sense if you understand that Instagram's goal is to monopolize your attention. The fact that you're having a miserable time is not part of the model. Could this be considered bad design? Probably, but here we are.
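The mechanism described here, where pausing to look counts as engagement even without a click, can be sketched as a toy model. The field names, weights, and dwell threshold below are illustrative assumptions, not Instagram's actual ranking code:

```python
from dataclasses import dataclass

@dataclass
class PostInteraction:
    clicked: bool
    liked: bool
    dwell_seconds: float  # how long the user paused on the post while scrolling

def engagement_score(ix: PostInteraction, median_dwell: float = 2.0) -> float:
    """Toy engagement score: explicit actions plus implicit dwell time."""
    score = 0.0
    if ix.clicked:
        score += 1.0
    if ix.liked:
        score += 2.0
    # Dwell beyond a typical (median) pause counts as implicit engagement,
    # so stopping to stare at a disturbing post still boosts it.
    score += max(0.0, ix.dwell_seconds - median_dwell) * 0.5
    return score

# A post the user hated but stared at outranks one they scrolled straight past.
stared = PostInteraction(clicked=False, liked=False, dwell_seconds=8.0)
scrolled = PostInteraction(clicked=False, liked=False, dwell_seconds=1.0)
assert engagement_score(stared) > engagement_score(scrolled)
```

Under a model like this, hitting "do not show me this" after staring at a video comes too late: the dwell signal has already been recorded, which matches the behavior described above.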

52

SkylorBeck t1_ja528id wrote

Dude I seriously spend more time curating my feed than I do enjoying it. Even here on reddit I had to disable the setting that makes your home feed into the curated feed. I wish that we had more control over it. At least on twitter you can block entire keywords and topics. Although... twitter is known as one of the bigger echo chambers. God.

2

_Jam_Solo_ t1_ja54b5r wrote

Before the internet it worked a little differently. That's what record labels did, and tv promoted things, people went to live shows, and stuff like that. It's a new world now.

−2

_Jam_Solo_ t1_ja556z3 wrote

Not really. It really doesn't spread much at all, and who is even gonna see it in the first place?

Imagine some amazing artist or YouTuber, creates amazing content.

They post it, and they get 0 views. No hashtags or anything shows their content to anyone. How is word of mouth going to help that?

Although, granted, hashtags can be something you follow.

−6

TheAppleFallsUp t1_ja55h5l wrote

Eh. What do you expect? The stuff gets clicks, it works, the data shows it and the algorithm follows suit.

−1

UsernameJonesHere t1_ja55prb wrote

This point is brought up a lot and my simple counter to it is: I don't fucking care. If I want to use Instagram to just look at my friend's photos of a bag of rusty nails Instagram shouldn't care either. No one cared about how they'd find new content before all these social media companies began introducing these intrusive algorithms to keep us on their platforms for hours and hours. It's a completely made up problem that doesn't need to exist at all.

21

_Jam_Solo_ t1_ja56brr wrote

I get it you might not care, and that's fair. I think instagram was like that, but some social media should be geared towards a more global audience, and some should be more local.

Or Instagram should allow the choice. I know a lot of people want to use Instagram as a way of just connecting with friends and family, and they don't care about all the other shit. And that's fair.

But the artists need a way to connect with their fans.

−8

backroundagain t1_ja59mjc wrote

Does anyone remember when this happened on Facebook? About 9 years ago, I remember a ton of really disturbing content came and went out of no where.

2

Typical_Cat_9987 t1_ja5aczi wrote

Why do people still use any of Meta’s products knowing what they do with all of our information?

0

_Jam_Solo_ t1_ja5lkfm wrote

There are of course multiple social medias, but an artist isn't only valuable if they are the preference of the masses.

Many great artists are the preference of niche groups. The internet reaches far and wide and brings these artists, these masters of their craft, to their fans all over the world. Whereas if they only exist locally, they may only come across a handful of people who enjoy their work, and their talents will go unnoticed; people who would love it will never experience it.

0

kakapoopoopeepeeshir t1_ja5okaj wrote

I follow #hipflexorstrength on Instagram because I’ve really been trying to strengthen mine and I want to see cool exercises. The last few days I go in to look at them when I have free time at work and there are photos of girls with their asses hanging out in thongs doing sexual things, and I’m like, why is this on this page!

1

MochiMochiMochi t1_ja5y6i3 wrote

>These efforts have escalated in recent months as marketers pour more money into meme pages in an effort to reach a young, highly engaged audience of teenagers

Teenagers. Why am I not surprised.

1

Zokrar t1_ja64y3r wrote

I think it's a bad take to compare a user's desires from a product to the goals of a CEO. The fact that Instagram is a $400 billion company doesn't automatically mean that their algorithm is flawless or that it always serves the user's interests.

Consider the impact of algorithmic bias and how it can influence what content users see or don't see. Companies prioritize profits over user satisfaction or privacy, which is why it's important for users to be aware of how their data is being used and to demand transparency from these companies.

Users should have the right to use Instagram in the way that they choose, without being forced to see certain content or having their data manipulated for the benefit of a company's bottom line.

9

happy_snowy_owl t1_ja6ahps wrote

>Instagram also announced last year it would be leaning harder into algorithmic recommendation of content. On Meta’s second-quarter earnings call, CEO Mark Zuckerberg noted that Reels videos accounted for 20 percent of the time people spent on Instagram, saying that Reels engagement was “growing quickly” and that the company saw a 30 percent increase in the amount of time people spent engaging with Reels.

No, Mr. Zuckerberg, it's not because people are actually interested in reels. It's because people go on Instagram to see photos posted by their friends, and their feed gets spammed with reels that are loosely related to something else they searched for or clicked on somewhere on the internet.

Pro-tip: Just because I bought my kids sneakers last week doesn't mean I want to scroll through endless videos of strangers doing basketball tricks.

3

NormalDevice3462 t1_ja6g4rv wrote

There were subs that were banned for posting death videos, and now, after Russia attacked Ukraine, suddenly none of the videos showing death or gore are removed. Reddit is no different from Twitter or Facebook, making money by showing such videos.

4

Direct-Ad3796 t1_ja6ql7h wrote

I constantly report disturbing violence against animals and/or graphically sexual content on Facebook, as well as scams and links to shady porn sites (which probably contain malware-type ads). And my reports are almost always ignored. Maybe it really is about the money.

2

Fpscharles t1_ja6rrnl wrote

Yeah, I just watched a video of a guy on a motorcycle run into a car at high speed. I watch stuff like that on Reddit, but I know what I’m watching and it’s not at random.

1

MammothTankDriver t1_ja6wzuf wrote

I have never seen nor been suggested gore videos on Instagram.

2

MonsterHunter6353 t1_ja6ynhj wrote

Any idea how to prevent the feed suggesting videos in completely different languages? I don't know any other languages aside from English but Instagram shows me a ton of memes in other languages and I can't understand any of them

2

potatodrinker t1_ja6zype wrote

Got served a vertical story of a dog getting whacked repeatedly with a baseball bat while chained to a railing. Fking horrible. Didn't even bother hunting for the report button, just closed it and tried to forget.

2

Daedelous2k t1_ja75hh8 wrote

Sites like CJ and Ogrish are now coming to Instagram?

1

Own-Philosophy-5356 t1_ja7jkom wrote

Does anyone keep getting two midget girls dancing in a bikini for clout???

1

RabidWolf-1 t1_ja7lpxu wrote

This isn’t news; literally every outlet has that problem. I saw Taliban members beheading people on Facebook like ten years ago.

1

smorfer t1_ja7ss4n wrote

Instagram is not a necessity of life. If you want to actually have an impact as a customer in capitalism, not using a service when you disagree with its actions is one of the ways of exercising your power in a market. There is no reason luxury goods like this should be handled any other way, while necessary goods should be regulated to a degree that secures customer access to them.

7

squirrelnuts46 t1_ja7xqet wrote

>The fact that you're having a miserable time is not part of the model

As long as you keep watching, they don't care. Capitalism is always about short-term gains, shoving long-term issues under the rug.

1

Spinanator t1_ja81m0s wrote

YEP! There was like a one-month period where the first thing on my feed was some decapitation, drive-by shooting, or other horrific shit like scat porn (pun somewhat intended). It started up on Instagram for me about a month ago, sometimes with the same videos.

1