If you want to see the best of Reddit, go to r/WhitePolitics, r/faggots, or r/FatPussy. You will not find sickos and slurs there—not anymore, anyway. Instead, these subreddits are obsessed with, respectively: the color white, bundles of sticks, and rather large cats.
In the past few years, Reddit communities once devoted to outré subjects, everything from InfoWars’ Alex Jones to objectifying tall women, have been hijacked and replaced with tame, often literal-minded send-ups of their original purpose. It’s very funny, it’s very Reddit, and it’s all the doing of a band of counter-trolling moderators who take their duties very seriously. Their antics are easy to root for. What’s less understood is how these individuals operate—and at what cost. Trolling the trolls, it turns out, takes a toll.
First, consider classic trolling: the icky kind. It takes time and organization, and those are its practitioners’ biggest weaknesses. If you’re going to pop up like a hobgoblin sowing discord in comments sections across the internet, you can’t spend too many resources in any one place. Far-right trolls, in particular, are notoriously disorganized and prone to infighting. That means there are an awful lot of hateful but poorly moderated subreddits out there, ripe for the hijacking.
The work of converting those swamps into arable webland is hard—and often deeply personal. (Though some mods say it’s “just for shits and giggles.”) The first task is identifying hateful subreddits, tracked at places like r/AgainstHateSubs. Once you find one, lurk on the threads for a while, getting a sense of who’s there and what they’re saying. But stay quiet: You might share an identity with the group the sub is designed to attack, so it’s best not to attract attention. Then, you wait. Make sure the previous moderators have been asleep at the switch for 60 days or more and submit a request to administrators to be made the new mod. Voilà. The subreddit is yours.
Drewie, who has been a redditor for more than seven years and moderates more than 50 subreddits, has always kept an eye on subs that sting her most. She’s trans and knows the harms and humiliations the internet can bring. She has a string of successful hijackings to her name—communities like r/Trannys (now about car transmissions) and r/Dykes (you know, like water-retaining embankments) make up her kingdom. Her most recent conquest was r/AlexJones, but the one she remembers best is r/faggots. She’d been eyeing the community from a distance for a while, but then there was rumbling among Reddit’s moderators: The deposed alt-right troll king Milo Yiannopoulos had just requested control of the sub.
Controversy ensued, but with no opposition, the administrators would likely grant it to him. Drewie wouldn’t have it. Yiannopoulos is a Gamergate alum, expert at trolling the people she wanted to protect; r/faggots, she feared, would become another platform to rally his troops. She submitted a counter request, telling the admins she planned to make the hate sub into something fun and satirical. They gave her the keys instead.
Since harassment and bullying are verboten, Drewie promptly scrubbed r/faggots of hateful posts and posters, turning the subreddit into an empty space for her and her allies to fill in. She thinks she won over the admins—who are notorious for leaving moderators to scrap out problems on their own, no matter what kind of hate speech redditors are spewing—by making them laugh. For Drewie, though, replacing homophobia with piles of sticks is about more than the lulz. “I’m Asian, I have a middle-class job, I pass, but a lot of the people I know don’t,” she says. “I have to use that to do something. I can’t just sit down and say, ‘Oh, I’ve got it good.’ I have no choice.”
Not everyone sees it so unambiguously. Drewie’s vigilante tactics might be in the service of good, but they were pioneered by the very trolls she’s trying to fight. To Whitney Phillips, author of multiple books on online trolling, the anti-hate hijacking of subreddits most closely resembles the “RIP trolling” of circa 2011. “If, for example, a woman had been murdered by her husband, the troll would make a seemingly legit RIP page designed to get the most responses and followers possible,” Phillips says. Then, once the honeypot had attracted enough flies, the troll was free to flip the tone of the page entirely, using it to mock the deceased and threaten those who sympathized with them.
Over time, this tone-flipping tactic became less randomly malicious and more targeted. Before being banned from Reddit, r/antifa wasn’t a stronghold of the controversial antifascist movement but one of alt-right trolls pretending to be antifa in order to make the movement look bad—and maybe entice some naifs to a fake protest or two. It’s also the same school of trolling that’s given us Russian operatives posing as black activists on Facebook.
Little about this history suggests these trolling techniques would ever be used for social justice on Reddit. They rely, in their original conception, on mockery and exclusion, not acceptance and bonhomie. As Phillips puts it: “Why would any progressive aspire to something so regressive? It very quickly becomes a conversation about whether the master’s tools can ever be used to dismantle the master’s house.”
There are two ways to think about that. The first is, do the tools work? This master’s house is built from bricks of bigotry. According to Casey Fiesler, who studies online governance at the University of Colorado, ousting homophobes and white supremacists and purveyors of nonconsensual pornography from their strongholds is unlikely to change their minds—and some researchers worry that pushing these people into ever-more-fringe regions of the internet will only increase their extremism. Many of the hijacked communities have indeed regrouped elsewhere under a different spelling or a different name. However, you have to be deep in the world in order to find those new locales. Perhaps the casual searcher—the 14-year-old kid typing slurs into Reddit—won’t expend the effort.
The second part of the counter-trolling moral conundrum is more existential: What do these tools do to the person who wields them? “It’s exhausting!” Drewie says. “I would love not to do this. You get no benefit, you just lose time.” On top of a full-time job, she spends at least five hours a day scouring Reddit for hate and monitoring the subreddits she’s hijacked, and that doesn’t even include peeking between meetings, during lunch, or on her commute. She thinks of doxing and death threats—inevitable in this line of activism—as “the cost of doing business.” The alternative is worse. “I get through the day by telling myself it’s just internet trolls. But I’ve lost so many friends who can’t take the hate,” she says. “I don’t want to go to another memorial.”
At times, the goal of these moderators seems unattainable. Hate on the internet is a hydra: Cut off one source and three more rear up. Nazis still post; cyberbullying still happens. And because these mods prefer to stay anonymous, they’re rarely recognized for their work. (Most of the mods I reached out to never returned my request for an interview.)
Observers like Fiesler say the difference they make is subtler, incremental shifts in Reddit’s tone and ethos. “These people are signposting what it means to be Reddit,” Fiesler says. “It shows people the power you can have as a moderator to change the culture of a community. Even if you’re not taking over subreddits yourself, you might be emboldened to say, ‘Hey, maybe we should ban hate speech.'” These trolls won’t be able to dismantle the master’s house entirely, but maybe their efforts can clear space to lay new foundations.