Even the language used to describe the attack before the fact framed it as an act of internet activism. In a post on 8chan, the shooting was referred to as a “real life effort post.” An image was titled “screw your optics,” a reference to a line posted by the man accused in the Pittsburgh synagogue shooting that later became a kind of catchphrase among neo-Nazis. And the manifesto — a wordy mixture of white nationalist boilerplate, fascist declarations and references to obscure internet jokes — seems to have been written from the bottom of an algorithmic rabbit hole.
It would be unfair to blame the internet for this. Motives are complex, lives are complicated, and we don’t yet know all the details about the shooting. Anti-Muslim violence is not an online phenomenon, and white nationalist hatred long predates 4chan and Reddit.
But we do know that the design of internet platforms can create and reinforce extremist beliefs. Their recommendation algorithms often steer users toward edgier content, a loop that results in more time spent on the platform, and more advertising revenue for the company. Their hate speech policies are weakly enforced. And their practices for removing graphic videos — like the ones that circulated on social media for hours after the Christchurch shooting, despite the companies’ attempts to remove them — are inconsistent at best.
We also know that many recent acts of offline violence bear the internet’s imprint. Robert Bowers, the man charged with killing 11 people and wounding six others at the Tree of Life synagogue in Pittsburgh, was a frequent user of Gab, a social media platform beloved by extremists. Cesar Sayoc, the man charged with sending explosives to prominent critics of President Trump last year, was immersed in a cesspool of right-wing Facebook and Twitter memes.
People used to conceive of “online extremism” as distinct from the extremism that took form in the physical world. If anything, the racism and bigotry on internet message boards felt a little less dangerous than the prospect of Ku Klux Klan marches or skinhead rallies.
Now, online extremism is just regular extremism on steroids. There is no offline equivalent of the experience of being algorithmically nudged toward a more strident version of your existing beliefs, or having an invisible hand steer you from gaming videos to neo-Nazism. The internet is now the place where the seeds of extremism are planted and watered, where platform incentives guide creators toward the ideological poles, and where people with hateful and violent beliefs can find and feed off one another.
So the pattern continues. People become fluent in the culture of online extremism, they make and consume edgy memes, they cluster and harden. And once in a while, one of them erupts.