These Trying Times
Thinking About Online Hate
August 5, 2019
As we have experienced all too often, an angry white man who traveled hundreds of miles to the border to kill brown people posted a manifesto to his chums at 8chan, an internet hangout created for people who want no curbs on their speech and who like to socialize with others who enjoy pushing the limits. (Google stopped linking to the site in 2015 on the grounds that it allows child pornography. Because freedom.) Racist mass murder is rehearsed there as part of an extremist reality media, created by supposed “lone wolves” who form packs online to encourage violence.
What has happened to 8chan this week illustrates the gnarly issues we face in dealing with the consequences of online hate. What should we do?
How about deplatforming? What you see on the internet has to be hosted somewhere, and the domain has to be registered. These functions are typically handled by companies separate from the sites they host. The companies that provide those services can pull the plug, and they do. Back in 2010, Amazon stopped hosting WikiLeaks after leaked US embassy cables were posted there. Though Amazon faced pressure from US government officials, it claimed the takedown was due to WikiLeaks violating its terms of service. It’s kind of troubling that expression can be so easily turned off by companies without any recourse available, though I totally understand a private business not wanting to be party to mass murder. In any case, deplatforming isn’t a solution: 8chan has already found another host.
What about government regulation? This is, like everything related to the First Amendment, tricky. Most speech is not criminal under U.S. law, and it’s a good thing, too. There’s a clear risk the government might use regulations to suppress dissent. Besides, it’s not clear to me how U.S. regulations could easily apply to a site that doesn’t use persuasion algorithms to shape what you see, or how U.S. authority would reach the Philippines, where the owner of 8chan is based. I’m in favor of devising regulations to curb the power of tech corporations to scoop up personal information with very little restriction, and I would like to see Section 230 revisited, if only to encourage the largest, most profitable platforms to take more responsibility for the harms they cause. But none of that will solve the 8chan issue.
We need better algorithms. That seems to be an article of faith among technologists, but so far the ability of artificial intelligence and clever code to sort through speech and identify actual hate speech or threats of violence is limited. In any case, the people who share mass-murder manifestos on 8chan are not hobbled by algorithms, yet they have an uncanny knack for manipulating us by manipulating other media.
We need to meet hate speech with more speech. This was a more persuasive argument before the rise of the attention economy and the vast increase in speech enabled and encouraged by tech companies. There seems to be little evidence that more speech works to combat the rise of violent hate online. If anything, the ways social platforms encourage more speech seem to be making us more divided, angrier, and less thoughtful.
We need better mental health treatment. Yes, we do, but it’s completely irrelevant to this situation – white supremacy isn’t in the DSM.
We could . . . maybe . . . do something about guns? Yup. We must. Even small steps toward regulation would reduce the opportunity for mass murder – but they wouldn’t stop 8chan’s denizens from planning and celebrating it.
There is no easy answer. There are approaches we can explore, including encouraging corporate responsibility, sensible regulation, better algorithms, making it harder to kill so many people in a matter of seconds, reducing harm by promoting racial literacy in tech, and no doubt things I haven’t thought of. But in the end it may come down to people, face to face and person to person, in city hall and in the voting booth, facing our racism, embracing anti-racism, and acting on it – doing what it takes IRL to make it less easy for racist radicalization to happen.