Everybody forgets. What we don’t forget, we misremember. We distort and exaggerate. We hold isolated details close to the surface, obscuring all the rest. Sometimes we just lie.
The Internet does all this as well. We continue to deny it, perhaps because our parents warned us that the Web is written in ink, not pencil—as if “permanent” were a synonym for “incontrovertible.” In one sense, it is; reversing the impact of a humiliating viral post or negative article is nigh impossible, much the way your drunken outburst last Thanksgiving won’t be wiped from family lore. But despite privacy advocates’ ongoing battle for the so-called “right to be forgotten” in the European Union, the power to massage, trim, or revise the record already exists, albeit in shadowy, rarefied form—which makes it ripe for abuse.
Deciding whether to give Joe and Jane Mousepad a taste of that power, or shedding light on those who already wield it, feels vital at a time when Internet diets are becoming more homogenized, with content accordingly taken at face value. As Facebook’s algorithms draw more users into a cascade of undifferentiated information, fewer will bother to distinguish between one source and another, especially as media sites rush to cover the same stories.
Of course, you can’t believe everything you see in your news feed any more than you might trust everything you watch on TV, or read in a book, or seem to recall in the privacy of your own mind. The fight for the right to be forgotten, then, isn’t just about erasing select gaffes or embarrassments; it’s about determining who controls the way we appear online, and why.
When the Internet fails to convey the reality just beyond the screen—and it’s constantly doing this—you can be sure that the misconception will linger.
I’m no psychotherapist—but Google thought I was
The problems of Internet memory are nothing new, but it wasn’t until May of this year, when privacy advocates scored a major win in the European Union Court of Justice, that we began to recognize their confluence. Mugshot blackmail, revenge porn, defamatory blog posts, industrial espionage, and astroturfing have deceived and damaged countless people, and attempts to suppress offending content can run up against the Streisand Effect, which is as prevalent online as it has been in courtrooms: When a judge tells jurors to disregard inadmissible testimony or evidence, they become more likely to fold it into their verdict. Even simple misinformation takes a toll.
“I switched my phone number to a San Francisco local number about two years ago,” sportswriter Lindsey Adler told The Kernel in an email. Afterward, she wrote, calls from unrecognized numbers started pouring in. Googling her own number, she encountered a business listing for a psychotherapist named Kenneth, whose office had evidently used the number until it was redirected to her. “I realized I was receiving calls that were probably sensitive in nature, which quickly overshadowed the annoyance I’d felt. I remembered the day I called my most recent therapist’s office, scared, sad, and really in need of professional help.”
Should we desire a flat, transparent record of global information, then perhaps the right to be forgotten is a wrinkle we ought to avoid.
Adler continued: “It sickened me to realize I could be fielding those sorts of calls, so after finding a new, 650-area-code number, I called and explained the situation to his bored secretary. She told me I should just contact Google, which is of course an impossible task. I tried to report the inaccuracy through their business pages, but it told me I had to be the business owner to make changes to the listing.”
Indeed, there are ample resources available for a business owner who wants to modify her Google listing, but an accidental victim of outdated or inaccurate search results has no formal route toward correcting the error. “I got nowhere with either of the other parties, and figured it’d resolve itself when a client complained about it to Kenneth’s people,” Adler said. “It did, mostly.” Kenneth’s new number made it onto Google, but other online business pages still list Adler’s digits, meaning strangers continue to dial her, possibly in times of acute crisis.
“I always elect to ignore, but I wince at the thought of it being a tragically misdirected call.”
You’ll never scrub the Internet clean
Adler’s case, though alarming, is straightforward: The damage was done, if indirectly, by Google itself. Given her difficulties in forcing a minor edit to the tech giant’s own database, you can imagine the complications that arise when a third party posts the misleading content. The European right-to-be-forgotten mandate is controversial in part because the data people want expunged is usually hosted elsewhere, with search engines merely pointing to it.
Consider the travails of John B. Wender, a New York architect who sued Louise Silberling, an editor at Cornell University’s Philosophical Review, for a slew of blog posts that accused him of being an abusive BDSM fetishist with an STD-riddled “micro penis.” Despite the vituperative attacks, the $1.25 million defamation suit crumbled this summer before Justice Anil C. Singh of State Supreme Court in Manhattan. “Opinions cannot be proven untrue,” Singh wrote in his dismissal, drawing on precedent to add that “loose, figurative, or hyperbolic statements, even if deprecating the plaintiff, are not actionable.”
If we’re moving books around in the reference library, we can put some of our own volumes in places where they’re bound to get noticed.
So Wender can’t simply nuke damaging text allegedly distributed by a jilted paramour, but were he a citizen of the European Union, he could petition Google to demote any links to it that show up in a search for his name—and the company would oblige, provided the employee reviewing the request found that the revenge blogs met the European court’s criteria of “irrelevant or outdated” info. (That this judgment falls to a company employee is grist for further debate—why trust legal interpretations to a private organization?) The material itself would remain available, though harder to find. Silberling, or whoever, would receive a notice that her posts had been flagged and removed from search results, much as in a copyright claim.
By fighting to openly express her opinions—U.S. opposition to the right to be forgotten typically frames it as a First Amendment issue—Silberling may represent a staunch minority. The “micro penis” comments that made her vendetta headline-worthy reminded me of an incident months prior, when a message board I frequent was suddenly consumed with hatred for a wealthy young man whose online presence had become fodder for a thread about “Facebook Disasters.” Aside from speculating on the tininess of his genitalia, users ridiculed his nascent rap career and fumed that he had wriggled free from the major consequences of a brutal car accident caused by his reckless driving. Because these insults were often appended to the man’s name, he soon got wind of the public flaying and issued legal threats.
Hours after his cease-and-desist came down, all posts maligning him disappeared from the website, which couldn’t afford to lawyer up for a counterpunch. (A board administrator was not forthcoming with a comment for this article, either.) Could the sort of name-calling at issue—or, sure, let’s call it trolling—receive an adequate defense in court? The Wender lawsuit suggests it can. More often, however, all it takes is a flash of money and a lawyer’s letterhead to put the matter to rest. Few of us are willing to take a stand for vicious, sophomoric humor, least of all when it targets someone angry, vengeful, idle, and rich.
Casting a long shadow
Jeffrey Rosen, a law professor at the George Washington University, has written in the Stanford Law Review that the European concept of the right to be forgotten has deep “intellectual roots” in “le droit à l’oubli—or the ‘right of oblivion’—a right that allows a convicted criminal who has served his time and been rehabilitated to object to the publication of the facts of his conviction and incarceration.” In the United States, by comparison, “publication of someone’s criminal history is protected by the First Amendment,” leaving people vulnerable to one of the Web’s most pernicious scams: mugshot blackmail, a method for “monetizing humiliation.”
You needn’t be convicted of a crime for a mugshot to haunt your existence. The fact that you were ever arrested and photographed for police records—also available on sites like Mugshots.com, which can charge anywhere from $30 to $400 for photo removal—is enough to ensure you’ll never hear back from landlords and employers who run a cursory search on you. And while one service promises to wipe your mugshot from all such sites for $899, the prospective clients are typically those without the capital to invest in a rehabilitated image, struggling to find minimum-wage work that would enable them to pay that fee in the first place.
Google, thanks to journalistic pressure, has begun to crack down on the exploitative practice by burying mugshot results, with companies like PayPal, American Express, and Discover cutting ties with the industry—two developments certain to make this form of extortion far less lucrative.
This free-market realignment also encapsulates the ad hoc U.S. approach to protecting digital privacy, which doesn’t seem to extend to self-incriminating statements. In June, for example, the NYPD predicated its largest-ever gang raid on analysis of a million Facebook posts, many of which bore only a passing resemblance to evidence of a crime.
Online objectivity is a myth.
Twitter user @matt_T, an insurance risk analyst based in Atlanta, has a story that likewise suggests the potential for bureaucratic distortion of less-than-earnest remarks.
“I got fired from my last job [because] a guy I did fantasy football with searched my email and found my Twitter and had HR drug-test me,” he explained in a DM. “I thought we were friends, the company was downsizing, and it was convenient. I had no idea you could search by email until then.” What was @matt_T’s unforgivable offense? “Basically I was doing a series of tweets with a hashtag of #businessbro and made jokes about smoking weed.”
Getting fired for something you said online is hardly a new phenomenon, but knowing that a victimless Twitter joke can result in an internal corporate investigation (“HR tried to help but they had a random drug test rule,” @matt_T wrote) that leads to repercussions worse than joblessness (“And then they blocked my unemployment [because] I failed a drug test”) ought to give us pause.
It’s easy to blame @matt_T for his own misfortune, just as it’s inevitable that the hoi polloi will chastise revenge porn victims, or the entertainers exposed in Celebgate, for taking nude selfies in the first place. But do we really want to live in a world where self-censorship is the price of stability and success, where compromising photos can only be deleted through a stern application of the Digital Millennium Copyright Act, and potential lovers lie about their names to forestall a Google search that might end the flirtation?
More importantly, would such a world be altogether different from one that predates the Web?
Are we censoring history?
It’s time to acknowledge a breakdown in the analogy that opens this article: However fallible the human memory may be, we cannot pick and choose which parts of it will fade. The Internet, meanwhile, is a repository of information that can be regulated—in theory. “That,” according to Wikipedia founder Jimmy Wales, “is a very dangerous path to go down, and certainly if we want to go down a path where we are going to be censoring history, there is no way we should leave a private company like Google in charge of making those decisions.”
Yet “censoring history,” long a strategy of brutal dictatorships and Holocaust deniers, clearly predates Wikipedia, which has evolved a few ways of circumventing the age-old issue: The Twitter bot @congressedits reports any anonymous changes to an article made from a congressional IP address, and at least for a while, there existed a Wikipedia page titled “List of People who have Petitioned for the Right to be Forgotten [sic],” which predictably undermined that effort. Wikimedia also maintains an archive of the notices that search companies send when they “intend to indefinitely remove links to Wikimedia projects from their results,” and Hidden From Google collects all the links so far affected by the EUCJ ruling, leaving it up to readers “to decide whether our liberties are being upheld or violated” by the mandate.
Even simple misinformation takes a toll.
Most recently delisted were several old articles from the BBC. There’s a compilation of reader replies to a threatening video shot by Al Qaeda’s Ayman al-Zawahri, a report on the conviction of a woman who ran a major prostitution ring, and an account of a dispute over a lost dog. In light of these exiles, there’s definitely something to the Index on Censorship’s insistence that right-to-be-forgotten enforcement is tantamount to “marching into a library and forcing it to pulp books.”
Well, almost: Fulfilled requests don’t mean the destruction of data, just the reshelving of it in some obscure corner of the library where a casual reader is unlikely to discover it. And in the U.S., where there are no official processes for these rearrangements, let alone legal guidelines, the wealthy elite have unsurprisingly gone about crafting their own.
Hide the dirt—or get it washed away
“It’s called reputation management,” said Joshua Ziering, founder of drone delivery startup QuiQui, in a Facebook chat with The Kernel. “I did it for years. The idea is you either make the dirt really, really hard to find, or you get it washed away with money or other tactics. I did some really creative things to help wealthy people deal with haters online.”
“Someone bought this [thisguysname].com, and obviously it ranked. They put up a small website, and it wasn’t terribly flattering. So I went and looked at other stuff this guy owned. I approached him as a domain broker ‘looking to hedge some investments.’ I ended up buying a bunch of domains from this guy to obscure my true intentions, and recover this one important domain.”
That was the least of it, Ziering said, mentioning that his clients—“high-powered executives, celebrities, very private people, centers of controversy, political figures,” and “people who have axes to grind with any of the above”—aptly demonstrated the “wildly imbalanced” opportunities for editing the Internet and closely curating a professional image. One paid him a monthly bonus if a link to Ripoff Report, a supposedly democratizing site that in fact works on the same basic extortionist principle as the mugshot industry, didn’t show up in his first page of search results. Others required what’s called “social stuffing,” which involved creating enough innocuous profiles and keyword-rich puff sites to dilute any unfavorable coverage.
Do we really want to live in a world where self-censorship is the price of stability and success?
Ziering, too, had once thrown in the towel rather than face a costly copyright battle that might otherwise have come down to an argument for freedom of expression:
“I was hosting a site that showed pictures of pilots who crossed a picket line. American Airlines used the DMCA to make me remove it, claiming that because I used employee photos, they own the copyright to them. I had 24 hours to remove them or face a lawsuit plus automatic unplugging of my site at the server level. So it wasn’t even a fight; they just went over my head and someone pulls the cord in the data center.”
Moreover, Ziering explained, “after repeated copyright claims, Google will use that as a signal that the offender’s site shouldn’t rank as well in the future,” meaning that even spurious complaints will damage your site’s credibility. The list of dirty tactics seems endless.
Is there any corner of the Web, in the continued absence of a right-to-be-forgotten mandate for the U.S., that rich people and corporations can’t already manipulate to sell a chosen narrative?
“For one project, we anticipated that there might be some backlash,” Ziering responded. “In advance, we created our own opposition, [and] when push came to shove, we could craft the image of: ‘They reached out to us to address our concerns…’”
Another astroturfing move leverages the power of Google Autocomplete. If typing “Miles Klee” conjures the suggestion “Miles Klee con artist,” Ziering might create a website that appears high in the results for that phrase because it happens to be a softball interview in which I refute the characterization.
Ultimately, Ziering said, it’s the “borderless nature of the Web” that makes it easy to skirt any extant national laws. He recalled an attempted lawsuit by a multilevel marketing (that is, pyramid selling) firm that noticed he “was linking to where you could buy their product on eBay instead of joining their downline.” Furious, he sold the site to “someone in India” for $300. “[T]hey can’t sue him, and have no control over where he ends up sending their potential customers.”
“If we want to go down a path where we are going to be censoring history, there is no way we should leave a private company like Google in charge of making those decisions.” —Jimmy Wales
A small victory, then, for the right to remember. But the fact is, individuals have for years cultivated amnesia in an Internet audience that rarely notices the gaps.
“I can look at a set of [search] results and guess how much someone is paying monthly,” Ziering said, likening the reputation manager’s task to a puzzle portrait: “What can I fit where, and how big do the pieces need to be?”
If we’re moving books around in the reference library, we can put some of our own volumes in places where they’re bound to get noticed. Move that tome about my tax evasion to the basement, and leave this hagiography about my charity work in its place.
Don’t trust Google
Should we desire a flat, transparent record of global information, then perhaps the right to be forgotten is a wrinkle we ought to avoid, as Wales and others have argued. Yet the current landscape is far from unspoiled. Those with the requisite connections or cash can shape it (or alter any map that shows where the bodies are buried). Online objectivity is a myth; laissez-faire reasoning has allowed elites to consolidate control in the past, and it will again.
It doesn’t help that the rest of us place blind faith in Google. Whatever it serves up, we harbor little doubt that we’re looking at the most relevant, timely results, even when this mosaic cuts against expectations. Training ourselves to question that assumption of baseline neutrality could recalibrate the discussion about the right to be forgotten, which currently revolves around the duty to preserve a transcript that has already had entire pages redacted, with the table of contents lost, and no record of who has checked it out over the years. New legislation can’t hope to improve circumstances we haven’t bothered to fully apprehend.
That is to say: Everybody forgets—but others sometimes do the forgetting for us.