I started studying trolls in the summer of 2008, just before I began my Ph.D. program at the University of Oregon. I was primarily focused on trolls on and around 4chan’s /b/ board, one of the Internet’s most active trolling hotspots. I was especially interested in trolls’ highly stylized language and behavior and the ways in which trolls would signal (and/or police) their mastery of trolling language, memes, and raids to other self-identifying trolls.
And what an identity it was. Known for their explicitly racist, explicitly sexist, and explicitly, well, explicit exploits, the trolls I studied reveled in mischief, vulgarity, and things that cannot be unseen. They also reveled in disrupting as many people’s days as possible. As a result, trolls were regarded by many as the primary obstacle to a kinder, more equitable Internet. The more colloquial explanation (or lamentation) was that trolls were why we can’t have nice things online (a play on “arguecat,” a meme featuring a hissing gray cat, often posted to forum discussions that had spiraled out of control).
Initially, this seemed like a reasonable conclusion to draw. But as I continued researching, I came to realize that trolls actually weren’t the stumbling block to a kinder, gentler Internet. They might have been a stumbling block. But what the trolls were doing—reveling in racism and misogyny, exploiting tragedy for their own amusement, and generally enforcing what assistant professor Ryan Milner describes as a “white male centrality,” the presumption that whiteness and maleness are the norm from which everything else necessarily deviates—was often simply a more extreme version of things people who aren’t trolls unthinkingly do all the time.
It certainly was easier to think of trolls on /b/ as monstrous outsiders. But they weren’t the only people engaging in problem behavior. The overlap between trolling and “normal” (or at least, socially acceptable) behaviors was the real reason we can’t have nice things online. This is the point to which the title of my book gestures: the fact that trolling behaviors are born of and embedded within existing—and at times even celebrated—cultural logics. These logics (which I discuss here and here) include media sensationalism predicated on spectacle and exploitation, the logics of “good” social media use, and the logic of masculine domination, in which female-gendered behaviors or rhetorical styles are pathologized and actively policed against.
Referring to nasty online behaviors as “trolling” frames online antagonism as a game only the aggressor can win.
In the intervening years, the term “troll” has come to subsume all kinds of antagonistic online behaviors, regardless of whether the participants would describe themselves as trolls. I am wary of this new framing (in my research I was exploring a very specific, subcultural sense of the term), and whenever possible I avoid using the term as a behavioral catch-all. Instead, I prefer to describe online antagonism in terms of the impact it has on its targets. So, if someone is engaging in violently misogynistic behavior, I call them a violent misogynist, as “troll” implies a level of playfulness that tends to minimize their antagonistic behaviors, or at least establish a firewall between the embodied person and their digitally mediated actions. (“I’m not really a racist, I just play one on the Internet” doesn’t account for the fact that, regardless of what might be in someone’s heart, their actions have a real and demonstrable impact on those forced to read yet another racist statement online.)
Just as problematically, the “troll” framing—which is so often used with either the implied or explicit caveat “just trolling,” i.e., “not a big deal/stop being so sensitive/learn how to Internet”—also casts aspersions on those who do not want to constantly deal with identity-based antagonism online. In short, referring to nasty online behaviors as “trolling” frames online antagonism as a game only the aggressor can win, a framing most apparent in the phrase “don’t feed the trolls” (which I critique here). In the process, using “trolling” as a behavioral catch-all privileges the aggressor’s needs, interests, and right to free expression over those of the people they target. It’s the troll’s world in this model. Everyone else is just living in it. And that gives these “trolls” far more credit than they actually deserve.
Still, the basic argument—that the biggest jerks on the Internet are often surprisingly similar to people who would consider themselves pretty normal—holds just as strongly when referring to online aggression more broadly conceived.
Women are simply not as visible, or as valued, within the technology sector.
Consider the sustained, coordinated, and yes, violently misogynistic attacks against Zoe Quinn, Brianna Wu, and Anita Sarkeesian during the unmitigated shitstorm that was Gamergate. (Note that I am refusing to call these behaviors “trolling,” for precisely the reasons outlined above.) In these cases, the violent sexism exhibited by participants is easy to identify, and for non-MRA types anyway, easy to condemn.
If these were isolated events, if the technology and gaming industries were otherwise equitable spaces for women, then the conversation would be much simpler: Just ban the misogynists and move on. But as it stands, the kinds of behaviors on display during Gamergate represent the grotesque extreme of behaviors that too many women in the technology sector encounter on a daily basis while navigating what are often banal—or seem like they should be banal—circumstances: Claire Cain Miller describes how many women end up feeling pushed out of the tech space, or are never welcomed to begin with. Shanley Kane discusses how rarely women in Silicon Valley are given promotions and raises compared to their male colleagues. S.E. Smith chronicles the “casual misogyny” that plagues many tech startups. Annie Lowrey echoes this point by highlighting the “soft sexism” that is so pervasive within the tech space and yet can be so difficult to prove (as Lowrey notes of the discrimination Ellen Pao experienced while employed by venture-capital firm Kleiner Perkins Caufield & Byers).
Beyond having to navigate institutional or otherwise subtle sexism, women in tech also have to deal with behaviors that are just, in a word, gross. Consider what happened when MIT computer scientists Elena Glassman, Neha Narula, and Jean Yang started a reddit AMA (“Ask Me Anything”) to answer questions about programming, their academic research, and what it was like to work in a male-dominated field. In response, the women were asked to make sandwiches, were flirted with grotesquely, and were bombarded with demands to know why they had decided to disclose the fact that they were women—because what does gender have to do with anything—all of which resulted in what one reddit commenter described as “a parody of what it’s actually like to be a woman working in a STEM field.”
The line between deviance and what passes as the norm is often much fuzzier than we might like to believe.
And this is just the tip of the iceberg; these are the stories that have been publicized. There are far more women whose stories and voices are not heard. Women are simply not as visible, or as valued, within the technology sector.
Some, particularly those guilty of, sympathetic to, or unconcerned by the kinds of behaviors described above, will likely bristle at the suggestion that what they do—or merely think is funny, not a big deal, etc.—is comparable to the kinds of violently misogynist behaviors so visible during Gamergate. And to an extent that’s true; generally speaking, individual instances of everyday sexism aren’t as frightening or traumatic as coordinated misogynist attacks (though it should be noted that women who have experienced extreme gender-based harassment or violence might respond more viscerally to “mere” sexism, serving as a reminder that “mere” is often in the eye of the beholder).
But on both ends of the spectrum of sexism and misogyny, the operating assumption is that women’s thoughts, experiences, and basic sense of safety (or at least comfort) don’t matter, or simply matter less. It’s much easier to see why this is a problem when the behaviors in question are outrageous, coordinated, and extreme. But they are just as damaging when they are defended or minimized and in the process normalized as something women should expect—and therefore accept—by virtue of their gender.
It might not be the most intuitive lesson, but this is what we can learn from the more extreme forms of online antagonism, whatever we might be inclined to call it: The line between deviance and what passes as the norm is often much fuzzier than we might like to believe. And if we really want to do something about the worst perpetrators, we would be well-advised to consider just how deep the problem actually goes.
Whitney Phillips holds a Ph.D. in English (folklore structured emphasis with a digital culture focus) from the University of Oregon and an MFA in Creative Writing (fiction) from Emerson College. Her book, This is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture, was published by The MIT Press in March 2015. She is currently co-authoring a second book on ambiguous online play and occasionally tweets at @wphillips49.
Illustration by J. Longo