The 'Gamergate' controversy involved groups of men organizing online to severely harass women in the video game industry. (Teodros Hailye/KQED Science)

What It's Like to Be Targeted by an Online Mob


Zoë Quinn is a video game developer who was one of several women in the industry targeted by online harassment campaigns using the hashtag #Gamergate. The harassment started after an ex-boyfriend published a disparaging blog post about Quinn, and as it spread online, her detractors multiplied, leading to the public posting of her address, the hacking of her internet accounts, and numerous rape and death threats. 

This edited excerpt is from Quinn's book "CRASH OVERRIDE: How Gamergate (Nearly) Destroyed My Life, and How We Can Win the Fight Against Online Hate," published in September 2017 by PublicAffairs, an imprint of the Hachette Book Group. 

As anyone who has ever expressed an opinion on the internet knows, it’s not all sunshine and rainbows. Every rose has its thorn, and every news story has its comments section. As the internet has graduated out of nerds’ basements and into the mainstream, its formerly separate communities have come in closer and closer contact. For years, the people who preferred hanging out in small subcultural message boards and interest-based communities stayed pretty isolated, but with the advent of social media, the people who wind up on "To Catch a Predator" now have accounts on Twitter, Facebook et al., alongside your sweet grandma — assuming your grandmother hasn’t been caught trying to lure kids into a van.

Even if you stick mainly to mainstream sites, you’ve probably seen glimpses of the internet’s underbelly in the notorious comments sections at the bottom of news articles. The article could be about a local man saving a box of kittens from a burning building, but no matter: The comments will accuse him of hating dogs, setting the building on fire in the first place, and secretly being Barack Obama’s Kenyan uncle.

You’ve probably wondered two things: Who are these people, and what the hell is going on here?

My teenaged obsession with shock sites like Rotten.com started a lifelong hobby of spelunking through the weird pockets of the internet. This exploration taught me a lot (and, uh . . . showed me a lot) and exposed the fact that internet culture is essentially a magnificent patchwork of specific subcultures — good, bad and strange as hell. For every harmless community of users into really specific sexual kinks, there is a place like Bareback Exchange, a forum for people who get off on transmitting STDs to as many people as possible, often without consent. For every community of angsty kids who pretend they are secretly vampires, there are seven different forums of white nationalists who sincerely believe that Jewish people are secretly vampires. For every geeky and silly toy collector’s community, there are forums full of dudes collecting upskirt photos of random women and girls who had no idea they were about to become porn.

Attempting to explain anonymous message-board culture to the uninitiated is a lot like trying to explain an inside joke — you can lay out the particulars, but it won’t carry the same weight or meaning. It’s complicated and difficult to parse, like most things about internet culture, but here’s a brief overview. Opened in 1999, a Japanese site called 2channel was the first board in this genre. An anonymous board where admins are virtually nonexistent, this site has enabled corporate whistleblowing as well as frank, open exchanges about taboo subjects like mental health and sexuality. Alongside these generally positive discussions, the boards are teeming with slander, hate speech, porn, nationalism, and general unchecked terribleness.

2channel’s American counterpart is 4chan, an image board launched by a 15-year-old boy in 2003. Fourteen years later, 4chan is a hugely influential force on the internet: “the ground zero of Western web culture,” as one journalist put it. Most of the memes you see on social media were invented there — everything from LOLCats to rickrolling. It’s also a breeding ground for not-so-cute things, including a hoax hashtag with the goal of getting young girls to #CutForBieber, campaigns to troll the social media of dead teenagers, and murderers occasionally posting pictures of their victims.

Reddit has an even larger version of this problem, in both size and scope. Calling itself the “front page of the internet” and clocking in at 36 million user accounts, Reddit allows anyone to create a “subreddit,” a discussion area dedicated to any subculture or interest on its site. This model has allowed mentally ill people to find community without stigma, locals to exchange highly specific information about what’s good in their neighborhoods, and even President Obama to hop online and answer readers’ questions. But, like 4chan, it’s also been a hotbed for communities founded on hatred.

Reddit isn’t as anonymous as 4chan — users must create accounts and can be banned — but the site was created with a sort of free-speech absolutism in mind. Before it was shut down, the subreddit /r/Jailbait was a board for sharing sexualized pictures of underaged girls, and “jailbait” was the second-most-popular search term leading people to Reddit. Reddit users voted Jailbait the Best Subreddit of 2008, with double the number of votes received by the runner-up. It took six years and multiple public scandals to finally close it. Reddit has only recently started banning other repugnant subreddits, including hateful and blatantly racist forums, though many of them live on and new ones spring up constantly.

Poorly moderated anonymous communities can have the capricious morality of any mob. In 2009, when two videos featuring the physical abuse of a domestic cat named Dusty by a person calling himself “Timmy” were posted on YouTube, the 4chan community tracked down the originator of the videos and passed his details on to the local police department. The suspect was arrested, and the cat was treated by a veterinarian and taken to a safe place. This kind of “internet detectivery” has been banned from many traditional online forums outside 4chan. It’s invasive, it’s sometimes used simply to intimidate or harass people, and the mob is often wrong, with very real consequences.

When you consider how a tendency for vigilante action might manifest itself in a community founded on hating people together, you can see how the results might turn scary. Stormfront, a message board for white supremacists, was founded by former Ku Klux Klan leader Don Black in 1995 and had more than 300,000 users as of May 2015. Calling it “the Web’s first and best-known hate site,” the Southern Poverty Law Center’s March 2014 intelligence report stated, “Stormfront users have been disproportionately responsible for some of the most lethal hate crimes and mass killings since the site was put up in 1995. In the past five years alone, Stormfront members have murdered close to 100 people.”

This escalation from hate speech to real action isn’t unique to Stormfront’s user base. Before embarking on a shooting spree that killed six and injured 14, Elliot Rodger posted a video on several internet forums dedicated to hating women, discussing the deeply misogynist and racist motives for his rampage. He namechecked one of these sites in his manifesto, saying he had discovered “a forum full of men who are starved of sex, just like me.” The forum had “confirmed many of the theories I had about how wicked and degenerate women really are.”

Zoe Quinn (right) and Anita Sarkeesian, also a #Gamergate target, were guest speakers at the introduction of a UN report called 'Cyber Violence Against Women and Girls: A World-Wide Wake-Up Call.' (UN Women/Ryan Brown)

Bad Advice

There’s one piece of advice that most often gets passed around to anyone who experiences harassment or abuse on the internet: “Don’t feed the trolls.” This maxim is passed off as gospel and is applied across the board, whether you’re a kid getting into your very first Facebook argument or an experienced developer dealing with death threats.

This advice is wrong. Pretty much everything we’ve been told about dealing with online abuse is wrong, but the misconception that "trolls" will just go away if they’re ignored is possibly the most damaging.

This kind of behavior is not just about terrorizing you; it’s about control. It’s about making you want to disappear, instilling fear and limiting your possibilities. It’s about punishing you for stepping out of line. It’s about isolating and hurting you in specific ways to provoke a reaction.

In my case, it became obvious that my attackers’ dream was to get me to stop “feeding the trolls” and shut up. They didn’t want to tease me; they wanted me gone. There were countless forms of harassment — the same channels that I had used to talk with friends, grow my business, and share weird videos were now full of threats, slurs, and all manner of nastiness. It escalated to include sexual or violent images with my face Photoshopped into them. My inbox started to fill up with pictures of women being raped. As strangers stalked through everything I had said or done online since I was 12, looking for more ammo, lies and conspiracy theories about me snowballed into weirder and more extreme accusations. These would then be blasted out widely, and also directly to my colleagues in games. Anyone to whom I had public ties began to receive nude photos of me and pressure to publicly denounce me or become their next target.

Tools of Abuse

It’s one thing to have a single person going after you with all of the above tactics; it’s something else entirely when a community forms around doing so. The networked nature of the internet doesn’t just make it easier for stalkers to find you; it also makes it easier for them to find each other. These tools of abuse serve both as an attack in and of themselves and as a rallying cry. They’re meant to be shared.

When you’re the target of abuse like this, you’re basically screwed. Not only does the scope of abuse that you face increase exponentially with every single signal boost from a new member of the mob, but all of the good things about the web’s ability to bring people together are turned against you. The same techniques that people have used to organize important grassroots movements can be used by people trying to destroy someone.

Attacking you becomes a participatory game in which people try to one-up each other in terms of who can get to you the most. In my case, I was struck by how many of the threats or disgusting remarks sent my way were made so publicly, usually while tagging other people. The ones that were especially vicious were rewarded with likes, shares, and people joining in on the abuse.

This phenomenon is often referred to as “dogpiling.” The cool remix culture that facilitates the spread of fanart and memes suddenly becomes a powerful tool to hurt someone. Photos and videos of you are Photoshopped to label you a whore or to make you look uglier or fatter and then shared the same way cute pictures of cats are. Memes are easily co-opted; in my case, people made reams of almost propaganda-like images with my face Photoshopped onto them. It wasn’t really about me anymore. The mob was engaging in a performative group activity.

This type of community building is quite deliberate and direct. As the 4chan threads kept growing in size and the mob gained momentum, I noticed that a chatroom had sprung up in the original posts. The chatroom participants worked as a team to try to discover personal information about everyone connected to me, referring to it as “digging” and sharing form letters and tactics on how to best alert anyone in my life that I was a horrible slut. They were highly organized, discussing how to divide their ranks into specialized groups: one dedicated to getting me in legal trouble, one dedicated to turning all of my friends against me, and another dedicated to pushing me to kill myself.

They shared elaborate fantasies about raping and murdering me, discussing the pros and cons of each. They talked about how to break into all of my accounts to try to find more ways to invade my privacy. They bragged about victories like flooding my game’s page with hatred and nude photos of me and went so far as to create guides to share tactics on how best to ruin my life. They even orchestrated plans to donate to various charities specifically to make themselves look like concerned citizens and not a mob of people trying to get me killed. They built friendships and bonded with each other by reinforcing their dedication to the righteous cause of taking me down, reminding themselves at every turn that they were the good guys.

A mob has more tools at its disposal than individual actors do. Popularity — the quantity of clicks or views on any given page — is tracked and exploited by algorithms online, and a mob is a critical mass. If thousands of people are linking to something about you, that will quickly become the first thing people see when they Google your name, regardless of whether it’s a fact-checked news article or a video about what a bitch you are. Many sites allow their user base to vote on what is good content and what’s garbage, and mobs manipulate these systems to their targets’ detriment. There are also services that direct people away from sketchy websites that contain viruses, and the mob flooded such services with false reports to make my websites and social media accounts inaccessible. My cohorts and I call this “brigading” — when people manipulate online systems to silence or hurt their target. Mass false reporting is a common tool to try to make the legitimate sites belonging to targets of online abuse vanish, as many systems are automated to react to a large volume of reports. Law enforcement agencies and government bodies like the IRS have online reporting systems that can also be manipulated this way by a mob.
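
The vulnerability that brigading exploits is easy to see in miniature. Below is a minimal sketch, in Python, of the kind of volume-based auto-moderation rule described above; the threshold, function names, and content IDs are all hypothetical, since no real platform publishes its rules.

```python
# Hypothetical sketch of a volume-based auto-moderation rule.
# Real platforms' thresholds and logic are not public; every name
# and number here is illustrative, not any actual site's API.

from collections import Counter

REPORT_THRESHOLD = 50  # assumed cutoff: auto-hide after this many reports

reports: Counter = Counter()  # content_id -> number of reports received


def hide_content(content_id: str) -> None:
    """Stand-in for whatever takedown action the platform automates."""
    print(f"{content_id} hidden pending review")


def handle_report(content_id: str) -> None:
    """Record one user report; hide the content once reports pile up.

    Note what is missing: no check of whether the reporters are
    distinct people acting in good faith. Volume alone decides.
    """
    reports[content_id] += 1
    if reports[content_id] == REPORT_THRESHOLD:
        hide_content(content_id)


# A brigade of THRESHOLD coordinated accounts is enough to silence a
# legitimate page, because the rule counts reports without weighing them.
for _ in range(REPORT_THRESHOLD):
    handle_report("targets-game-page")
```

The design flaw is the same one mass false reporting exploits everywhere: an automated system keyed purely to report volume cannot distinguish a coordinated mob from a crowd of genuine complainants.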

As a game designer, I can spot a game being played. And the more people who join in on the “fun,” the faster you become an abstract concept for your aggressors to hate. This might sound sort of comforting or like a way to defang the attacks, but in reality, it’s the opposite — this “game” is another way that you are dehumanized, and it makes it easier for a mob to grow its ranks and escalate its attacks. You’re just data, and data doesn’t bleed. You’re a symbol, and hating you can become part of someone’s identity, just as any other hobby might. Just as they would in a game, they are always trying to make their numbers go up. And plenty of the witch hunters advance from amateur to professional.
