THE STATE OF REDDIT
The week of November 2, 2014

Is Reddit broken beyond repair?

By Aaron Sankin

One week after it inadvertently became the focal point for the most high-profile Internet story of the year, Reddit elected to shut off the biggest single firehose of traffic the site had seen in recent memory.

In late August, a trove of celebrity nude photos was released on the anarchic image-sharing board 4chan. For a variety of reasons, none of them especially comforting, the pictures quickly became the most sought-after images on the Internet—and most of them found their way to Reddit, specifically the subreddit r/TheFappening.

Interest in the subreddit was overwhelming, nearly crashing a site that regularly handles more than 2 million unique pageviews per day. But the attention it brought to Reddit as an entity was largely negative. Critics slammed the site for promoting extremely sensitive stolen content. When it was revealed that some of the nude photos depicted subjects who were under the age of 18 at the time the shots were taken, r/TheFappening freaked out. Posters on the subreddit gave instructions on how to wipe evidence of the child porn off users’ hard drives, arguably aiding and abetting a crime.

Reddit’s administrators eventually gave in and banned r/TheFappening. The shutdown was accompanied by a widely criticized statement from company CEO Yishan Wong that led The Verge to label the site a “failed state.” Wong’s post, coupled with a number of other statements from Reddit employees, painted a picture of a company engulfed in chaos—unable to effectively deal with the actions of its increasingly large and unpredictable userbase. Then, earlier this month, the site’s community manager and public face, Erik Martin, abruptly quit.

For those who have been watching the Reddit community closely of late, none of what happened with r/TheFappening should have come as a surprise. The posting of celebrities’ stolen amateur porn is only a symptom of a much larger and far more fundamental problem that Reddit has, thus far, been patently unwilling to deal with—a rapidly metastasizing culture of omnipresent harassment.

An open letter, ignored

Earlier this year, a group of Reddit moderators published an open letter to the site’s administrators demanding that something be done about the trolls flooding their communities with racist, sexist, and violent hate speech.

Moderators sit at the very core of Reddit’s structure. Any Reddit user can create a community, called a subreddit, into which others post content. Subreddit moderators oversee posts and comments in their specific forums. These moderators are all unpaid volunteers who can spend hours every day helping to organize their subreddits into places that foster healthy, productive discussions.

The problem is that fostering those types of discussions is difficult when a seemingly endless torrent of users floods many subreddits—especially those dedicated to the issues of women, gay people, and racial minorities—with inflammatory, mean-spirited trolling.

R/BlackLadies, the subreddit that originated the open letter, regularly saw posts with inflammatory headlines like “Do you feel really dirty for being a black woman? Are you ashamed of being an entity that is fucked up and an evolutionary failure?” Likewise, r/Rape, a subreddit dedicated to helping survivors of sexual assault and their families, has been turned from a safe haven into a toxic environment. Moderators told The Kernel that one out of every five comments or posts was a trolling attack on rape victims, insisting that they were liars or were “just asking for it.” As a result, the moderators of r/Rape have been forced to go through every single piece of content posted to their subreddit and personally ensure that it meets their community standards.

“As someone who has been there, made powerless by someone taking advantage of their body without their consent, and does this because I want to protect these people, sometimes modding here makes me feel incredibly powerless,” an r/Rape moderator going by the handle scooooot explained, “and frankly I feel that is intentional.”

The sad part is that all of this harassment is perfectly fine with Reddit administrators.

The r/BlackLadies mods asked Reddit to look into an uptick in trolls that immediately preceded the posting of the open letter, and the response was basically a shrug. The only options were for the moderators to spend even more of their own time manually deleting the destructive comments, for free, or to take the subreddit into private mode, where people without prior knowledge of the community would be unable to simply stumble into it—a move that effectively negates the entire point of a site geared around discovery.

Moderators can ban users from their subreddits, but those users can easily create another anonymous account and continue their harassment unabated. The much stronger solution is to have Reddit admins block an entire IP address from accessing the site, forcing a troll to either hide their IP address with a virtual private network (VPN) or physically relocate in order to get back in. Either way, throwing up a roadblock might be enough to deter trolls.
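
To make the distinction concrete, here is a minimal sketch of the difference between the two kinds of bans. The names and data structures below are hypothetical, invented purely for illustration; they are not Reddit’s actual code.

```python
# Hypothetical sketch: why account bans are weaker than IP bans.
# A banned account is trivially replaced with a throwaway; a banned
# IP forces the troll onto a VPN or a different network entirely.

BANNED_ACCOUNTS = {"troll_account_1"}  # enforced per subreddit by mods
BANNED_IPS = {"203.0.113.42"}          # enforceable sitewide only by admins

def may_post(username: str, ip_address: str) -> bool:
    """Return True if this request should be allowed to post."""
    if ip_address in BANNED_IPS:
        return False  # blocks the person, not just one account
    if username in BANNED_ACCOUNTS:
        return False  # blocks one account; a new throwaway evades it
    return True

# A troll banned by account name just registers a new account:
assert may_post("troll_account_2", "198.51.100.7")
# The same troll banned by IP is stopped regardless of account name:
assert not may_post("troll_account_2", "203.0.113.42")
```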

However, Reddit has been unwilling to consider such a measure.

The open letter, cosigned by the moderators of nearly 50 subreddits, charged:

Since this community was created, individuals have been invading this space to post hateful, racist messages and links to racist content, which are visible until a moderator individually removes the content and manually bans the user account. All of these individuals are anonymous, many of them are on easily-created and disposable (throwaway) accounts, and they are relentless, coming in barrages. Hostile racist users are also anonymously ‘downvoting’ community members to discourage them from participating. reddit admins have explained to us that as long as users are not breaking sitewide rules, they will take no action.

The resulting situation is extremely damaging to our community members who have the misfortune of seeing this intentionally upsetting content, to other people who are interested in what black women have to say, as well as moderators, who are the only ones capable of removing content, and are thus required to view and evaluate every single post and comment. Moderators volunteer to protect the community, and the constant vigilance required to do so takes an unnecessary toll.

The moderators of r/BlackLadies insist that Reddit management has in no way responded to any of the issues raised in the open letter. The site’s administrators did not return numerous requests for comment from The Kernel.

Doxing as self-defense

Trolling isn’t just limited to Reddit’s public forums. Harassment often continues through Reddit’s private messaging system, with trolls abusing the function to deliver personal threats.

“We had a survivor who was being constantly harassed by someone threatening to rape and possibly murder her,” recalled r/Rape moderator u/waitwhatnow, adding that she notified Reddit admins but did not receive a response. “I had about 4 or 5 users since then reporting similar problems, with screenshots, and I just had to tell them to block the person and that I’d ban the perpetrator from r/Rape, but that was the best I could do.”

Waitwhatnow added that her current account is actually her second Reddit identity. She had to give up her first after another user revealed her personal information without her permission, a process called doxing. “The person who did that threatened to kill my cat,” she said. “I also reported this to the admins and nothing was done.”

With Reddit’s admins refusing to take an active role in site governance, an atmosphere has developed in which the vast majority of users feel they can operate with total impunity. The worst that can happen is a comment getting downvoted into oblivion. The people who started r/TheFappening thought they would be able to create a forum to trade stolen celebrity nude pictures without any negative consequences from Reddit itself.

In the absence of leadership from the top, other Reddit users decided to take matters into their own hands. Shortly before r/TheFappening was banned, a Daily Dot reporter received a trove of personal information about the people who were running it—their names, photographs, employment histories, social media profiles, everything.

We decided to shelve the material. Violating someone’s privacy didn’t seem like the right way to respond to a group violating someone else’s privacy. The Washington Post ran an exposé on the man at the center of the subreddit, but refused to identify him. Later, the man revealed his identity to Wired as John Menese, a call center employee.

If nothing else, doxing on Reddit exists primarily as a mechanism to police what community members perceive as bad behavior in their midst. Someone believed that the mods of r/TheFappening were being jerks and didn’t think the site’s admins were doing enough to crack down on them. Reddit’s management created a leadership vacuum and, without any other guidance, some users took it upon themselves to fill it.

The solution?

In a sense, the problems facing Reddit are the same as those facing the Internet as a whole. People can be assholes, and when assholes interact with human beings, bad things happen. When it comes to the entire Internet, there’s not much anyone can do—other than hope that some stranger won’t feel compelled to threaten to kill someone’s cat over a disagreement about something posted online.

Don’t hold your breath on that one.

But Reddit is not the Internet. Reddit is its own little fiefdom. It’s free to establish comprehensive and binding rules and processes to make it function better for everyone.

Reddit could easily improve itself—but first, it has to actually want to change.

The most obvious suggestion would be for the company to hire more community managers to deal harshly with any direct threats against users, possibly by issuing IP bans. A zero-tolerance policy on direct, personal harassment would likely go a long way toward changing Reddit’s culture, and do so relatively quickly. Right now, it seems like no one is watching what happens in private messages, and there are no consequences for threatening others with sexual violence. That needs to change.

The site could also introduce a degree of transparency into the process through which it shadowbans users. (Shadowbanning is a setting that Reddit admins can apply to someone’s account that makes everything appear normal from the user’s perspective, but hides all of that person’s actions from other users.) Reddit could take a page from the book of prominent esports forum Team Liquid and centrally list every banned user along with the reason for his or her ban. Putting some sunlight into an opaque process would prevent users from assuming that every shadowban is political. (Recently, redditors briefly got up in arms when they incorrectly believed administrators had banned a user for asking WikiLeaks mastermind Julian Assange about perceived censorship on the site.)
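
For illustration, here is a minimal sketch of the visibility logic a shadowban implies. The data structures are hypothetical, assumed for the example; this is not Reddit’s actual implementation.

```python
# Hypothetical sketch of shadowban visibility: the shadowbanned user
# sees their own posts as normal, while everyone else sees nothing.

SHADOWBANNED = {"spammer99"}

def visible_posts(posts: list, viewer: str) -> list:
    """Filter a feed for a given viewer."""
    return [
        p for p in posts
        if p["author"] not in SHADOWBANNED or p["author"] == viewer
    ]

feed = [
    {"author": "spammer99", "text": "buy my stuff"},
    {"author": "regular_user", "text": "hello"},
]

# The shadowbanned user sees both posts and suspects nothing:
assert len(visible_posts(feed, viewer="spammer99")) == 2
# Everyone else sees only the legitimate post:
assert len(visible_posts(feed, viewer="regular_user")) == 1
```

The opacity the article describes falls out of this design: nothing in the banned user’s view ever signals that the ban exists, which is exactly why a public ban list would help.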

Another suggestion is to allow individual subreddits to opt into an approved commenter system. It could be similar to how Jezebel solved its “rape GIF problem.” Under this system, the moderators of a subreddit could make it so only the comments and posts of approved commenters would be visible to everyone. To view content from unapproved users, visitors to the subreddit would have to click a button warning them that they may be about to venture into rough territory. A moderated commenter system would allow Reddit to adhere to its ideal of not censoring speech based on its content while letting the users of subreddits vulnerable to trolling easily ignore the trolls.
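
A rough sketch of how such a filter might behave, with hypothetical names throughout and no claim about how Jezebel or Reddit actually implement anything:

```python
# Hypothetical sketch of an approved-commenter system: approved users'
# comments show by default; everyone else's sit behind a click-through
# warning instead of being deleted outright.

APPROVED = {"longtime_member", "trusted_poster"}

def render_comment(comment: dict, show_unapproved: bool = False) -> str:
    """Render a comment, collapsing unapproved authors behind a warning."""
    if comment["author"] in APPROVED or show_unapproved:
        return comment["text"]
    return "[comment from an unapproved user hidden -- click to view]"

c = {"author": "brand_new_account", "text": "inflammatory trolling"}
print(render_comment(c))                        # hidden behind a warning
print(render_comment(c, show_unapproved=True))  # visible once opted in
```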

Whether or not this system would work is debatable, but the specifics of any solution proposed here don’t really matter. What matters is the attempt to deal with Reddit’s biggest problem—that the site’s management seems either unable or unwilling to come to terms with the actions of its users. If a workable solution does come, it will come from Reddit itself, because the people who built the site’s architecture know best what’s doable and what isn’t.

Until that happens, Reddit users affected by the litany of problems have only a few options before them. They can continue to beg Reddit to do something, which will likely be futile. They can boycott Reddit’s advertisers or write letters to the company’s majority shareholder, Advance Publications, and ask if one of the largest media companies in the world is OK with a firm ostensibly under its umbrella ignoring rape threats en masse.

There is, however, one more option: They can just leave. There is nothing special about Reddit; the site’s power lies in its ability to attract a critical mass of users. By ignoring the flood of harassment, Reddit’s management has planted that idea in many users’ heads in recent weeks.

The top comments on the open letter post pretty much say it all:

[Screenshot: the top comments on the open letter post]

Illustration by Jason Reed