Brown spider on web. Photo by Oleg Didenko. The BFD.

Some years ago, I read an account of what it was like to be a Facebook moderator. This was, of course, back in the day when they spent less time policing WrongThink and actually tried, at least, to contain the sewer of graphic violence and child abuse. It was a sobering read about a deeply unpleasant job and not one that I’d envy anyone having to do.

As a content moderator for Facebook, it was literally Dublin resident Chris Gray’s job to watch child abuse, animal torture, and executions — disturbing imagery that left his mental health in shambles.

“You would wake up and you’re remembering the video of someone machine-gunning people in the Middle East somewhere,” he told The Guardian, “trying to think whether there was an ISIS flag, and so whether it should be marked as terrorism-related or not.”

Futurism

It’s no wonder that a group of former moderators launched a suit against Facebook, for giving them PTSD.

It wasn’t just Facebook. A former moderator at MySpace said that, “When I left, I didn’t shake anyone’s hand for three years. I’d seen what people do and how disgusting they are. I didn’t want to touch anyone. I was disgusted by humanity.”

But while politicians and the media obsess over people saying unapproved things about the pandemic or the government, the depravity hasn’t gone away. Unlike healthy food, crafts or girls with braided hair, the real darkness oozes on, practically undisturbed.

Now, the groomers have found a whole new playground in which to slither up to their prey.

Over the course of a 22-month study, a researcher claims, almost 2,000 “predatory users” on Twitch targeted over 250,000 children and teens.

Bloomberg reported on the findings of a researcher who requested anonymity. They did so fearing “potential career repercussions from being associated with such a disturbing topic,” despite specializing in internet harassment and extremism. Nonetheless, the outlet cited the research conducted from October 2020 to August 2022, which utilized Twitch profile data, screenshots and videos.

They claim at least 1,976 “predatory users” had follower lists of at least 70% children or young teens. In turn, these users targeted a collective 279,016 children and teens. Their targets were reportedly asked or dared to perform handstands or dances, expose their chests, or perform outright explicit sexual acts […]

While Bloomberg admits there was potential for “some” predatory accounts to be children themselves, “the behavior exhibited by many members of this group follows typical grooming techniques.”

Much of this predatory behaviour goes on only because platforms allow it. Twitter failed for years to take action on well-known hashtags used to exchange child abuse material. Within days of Elon Musk’s takeover, most of those hashtags had been banned.

It doesn’t help, either, that platforms make themselves easily available to children, even when, as with Facebook parent company Meta, leaked internal research shows they are well aware of the risk their products pose to children. Knowing this, Meta continues to promote products like Instagram to children.

While Twitch users must be at least 13 years of age, several examples given by Bloomberg reference users as young as 8. At one time, the researcher noted 1,200 accounts belonging to children. After a Twitch update in March to make it easier to report underage streamers, 41% of these accounts were deleted. Even so, creating alternate accounts to evade bans is reportedly a simple matter […]

The outlet also notes how easy Twitch makes it to set up a new account and livestream, much easier for children than on other platforms. YouTube, Facebook, and TikTok, for example, all impose age, contact number, subscriber count, or waiting period requirements.

In their defence, it must be admitted that playing an endless game of whack-a-mole is a Sisyphean task.

Another inhibiting factor is the sheer volume: 2.5 million hours of live content to moderate every day, as it happens. Twitch relies on user reports and live moderation, but there is only so much those can offer. The Buffalo shooting livestream in May 2022 was seen by 22 viewers on Twitch and halted after 25 minutes; Twitch states the stream was pulled “less than two minutes after the violence started.”

Twitch claims to be working on ways to address the child grooming rampant on its platform.

A Twitch spokesperson […] declared there were “numerous additional updates in development” to find and ban both child streamers and predators. They couldn’t reveal “much of the work in this area,” as doing so could result in methods being developed to evade the unspecified plans […]

Twitch reportedly terminated three times more underage streamers than in 2021, and Chief Product Officer Tom Verrilli stated that user reports and an “automated solution” are used to find users exhibiting the behavior of a child or a predator.

Bounding Into Comics

In the meantime, the worst people imaginable will continue to ruin everything. Because that’s what they do.
