20 March 2019

A Mass Murder of, and for, the Internet

By Kevin Roose

Before entering a mosque in Christchurch, New Zealand, the site of one of the deadliest mass murders in the country’s history, a gunman paused to endorse a YouTube star in a video that appeared to capture the shooting.

“Remember, lads, subscribe to PewDiePie,” he said.

To an untrained eye, this would have seemed like a bizarre detour.

But the people watching the video stream recognized it as something entirely different: a meme.

Like many of the things the gunman did before the attack on Friday — including posting a 74-page manifesto that named a specific internet figure — the PewDiePie endorsement served two purposes. For followers of the killer’s video stream, it was a kind of satirical Easter egg. “Subscribe to PewDiePie,” which began as a grass-roots online attempt to keep the popular YouTube entertainer from being dethroned as the site’s most-followed account, has morphed into a kind of all-purpose cultural bat signal for the young and internet-absorbed.


For everyone else, it was a booby trap, a joke designed to ensnare unsuspecting people and members of the media into taking it too literally. The goal, if there was one, may have been to pull a popular internet figure into a fractious blame game and inflame political tensions everywhere.

In a tweet early Friday morning, PewDiePie, whose real name is Felix Kjellberg, said, “I feel absolutely sickened having my name uttered by this person.”

New Zealand authorities have identified the accused gunman as Brenton Harrison Tarrant, 28, but it remains unclear whether he acted alone.

The details that have emerged about the Christchurch shooting — at least 49 people were killed in attacks on two mosques — are horrifying. But a surprising thing about it is how unmistakably online the violence was, and how aware the shooter on the video stream appears to have been of how his act would be viewed and interpreted by distinct internet subcultures.

In some ways, it felt like a first — an internet-native mass shooting, conceived and produced entirely within the irony-soaked discourse of modern extremism.

The attack was teased on Twitter, announced on the online message board 8chan and broadcast live on Facebook. The footage was then replayed endlessly on YouTube, Twitter and Reddit, as the platforms scrambled to take down the clips nearly as fast as new copies popped up to replace them. In a statement on Twitter, Facebook said it had “quickly removed both the shooter’s Facebook and Instagram accounts and the video,” and was taking down instances of praise or support for the shooting. YouTube said it was “working vigilantly to remove any violent footage” of the attack. Reddit said in a statement that it was taking down “content containing links to the video stream or manifesto.”

Hosted by Michael Barbaro, produced by Annie Brown, Paige Cowett, Michael Simon Johnson and Jonathan Wolfe, and edited by Lisa Tobin

One of the deadliest mass shootings in the country’s history bore the stamp of online extremism.

Michael Barbaro

From The New York Times, I’m Michael Barbaro. This is “The Daily.” Today: The death toll from a mass shooting targeting Muslims in New Zealand rose from 49 to 50 over the weekend, after officials found another body at the Al Noor Mosque, where most of the deaths occurred. Kevin Roose on why this attack was made by and for the internet. It’s Monday, March 18.

Archived Recording 1

Farid, would you mind just telling us one more time of what happened in the mosque?

Archived Recording (Farid Ahmed)

When shooting started, it started from the hallway. So I could hear, so [IMITATING GUNFIRE] Then magazine is finished. Then he refill again and came back again.

Archived Recording 2

I saw all the plastering coming down from the wall and the ceiling.

Archived Recording 3

O.K.

Archived Recording 4

And that was when I knew it was from the gun shot. So immediately —

Archived Recording 5

When he was shooting at me — and there’s a fence here. He was shooting at me. I ducked here and I come here. And I find the gun somewhere here and a dead body here as well.

Archived Recording 6

I feel now — I repeated the story a lot. But this is a good idea to say that —

Michael Barbaro

Over the weekend, through dozens of interviews with survivors, a story began to emerge of what happened on Friday inside the mosques in Christchurch.

Archived Recording 1

Was that your regular mosque or were you visiting that mosque?

Archived Recording (Farid Ahmed)

Regular mosque. Yeah.

Michael Barbaro

The shooting began at the Al Noor Mosque, where Farid Ahmed and his wife, Husna, who had moved to New Zealand from Bangladesh, were attending afternoon prayer.

Archived Recording (Farid Ahmed)

The ladies’ room was on the right-hand side. So all the ladies were there. And my wife is always a leading person for ladies. She had a philosophy. She always used to tell me, I don’t want to hold any position. And I want to prove that you don’t need to have any position to help people. She was like a magnet. And exactly the same thing happened. The shooting started. She started instructing several ladies and children to get out. And she was screaming, “Come this way, hurry up,” this and that. You know, she was doing all these things. And then she took many children and ladies into a safe garden. Then she was coming back, checking about me, because I was in the wheelchair.

Archived Recording

Do you mind me asking why you’re in a wheelchair?

Archived Recording (Farid Ahmed)

I was run over by a car. He was the drunk driver. And it was 1998 and it happened.

Archived Recording

I’m sorry.

Archived Recording (Farid Ahmed)

It’s O.K.

Archived Recording

So she went out of the mosque and then she came back in?

Archived Recording (Farid Ahmed)

Yeah. She was coming back. And once she was approaching the gate, then she was shot.

Archived Recording

She came back in to fetch you?

Archived Recording (Farid Ahmed)

Yes. Yes.

Michael Barbaro

Farid learned hours later that Husna was one of the 42 people police say were killed at the mosque.

Archived Recording (Farid Ahmed)

So she was busy with saving lives, you know, forgetting about herself. And that’s what she is. She always has been like this.

Michael Barbaro

Six minutes after firing the first shot, and as police raced toward the Al Noor Mosque, the shooter drove to a second mosque, the Linwood Mosque, four miles east.

Damien Cave

And tell me — I’m sorry, what was your name?

Abdul Aziz

Abdul Aziz.

Damien Cave

And the mosque that you go to, is it mixed, Pakistani? I mean, that mosque — who was there that day?

Abdul Aziz

That mosque, we got from every race, from Malaysia, from Philippines, from Afghanistan, from every sort of country.

Michael Barbaro

My colleague Damien Cave spoke with Abdul Aziz, who was praying at the Linwood Mosque with his four sons when he heard gunshots. Aziz ran toward the shots, grabbing the first thing he could find, a credit card machine, which he flung at the attacker. The shooter dropped a gun, and Aziz picked it up.

Abdul Aziz

And I pick up the gun and I checked that it had no bullets. And I was screaming to the guy, “Come here! I’m here.” I just wanted to put more focus on me than go inside the masjid. But unfortunately he just got himself to the masjid, and I heard more shooting sound. And I see the shooting inside the masjid.

Michael Barbaro

Moments later, when the gunman went to his car to retrieve more weapons, Aziz followed him.

Abdul Aziz

This guy tried to get more gun from his car. When he see me, I’m chasing with the gun. He sat on his car. And I just got that gun and throw in his window like an arrow and blast his window. And he thought I probably shot him or something. And the guns come back and just, he drives off.

Michael Barbaro

Aziz used the gun to shatter the gunman’s car window, which many witnesses believe is what prompted the attacker to speed away rather than re-enter the mosque and kill more people.

Abdul Aziz

Any brother would do the same thing. So if you was there, you would do the same thing.

Damien Cave

Have you — can I ask you —

Michael Barbaro

Minutes later, video shows the suspect being pulled by police from his car two and a half miles down the road, where two more guns and homemade explosives were also found.

Archived Recording (Jacinda Ardern)

I want to speak specifically about the firearms used in this terrorist act. They were two semi-automatic weapons and two shotguns.

Michael Barbaro

On Sunday, New Zealand’s prime minister, Jacinda Ardern, said that the suspect, an Australian citizen, would be tried in New Zealand, and that her government would meet today to discuss the country’s gun laws.

Archived Recording (Jacinda Ardern)

I can tell you one thing right now. Our gun laws will change.

Michael Barbaro

Funerals for all 50 victims are expected to be held in the coming days.

Archived Recording (Jacinda Ardern)

As the police commissioner confirmed this morning, 50 people have been killed and 34 people remain in Christchurch Hospital, 12 of them in the intensive care unit in critical condition. A 4-year-old girl remains in critical condition at Starship Hospital in Auckland.

Michael Barbaro

Islamic burial rituals typically require bodies to be buried as soon as possible, usually within 24 hours. But New Zealand authorities say that the process of identifying the victims and returning them to their families could take several more days.

Archived Recording (Jacinda Ardern)

It is the expectation that all bodies will be returned to families by Wednesday. I want to finish by saying that while the nation grapples with a form of grief and anger that we have not experienced before, we are seeking answers.

Michael Barbaro

Kevin, I want to talk to you about the moments before this mass shooting began. What do you know about those?

Kevin Roose

Well, what we know comes from a video that the gunman live-streamed on Facebook while this was all happening. He taped himself in the car on his way over to the mosque, listening to music, talking. And right before he gets out of the car and goes into the mosque, he pauses and says, “Remember, lads, subscribe to PewDiePie.” And when I heard that, I just, like, I knew, oh, this is something different than we’re used to.

Michael Barbaro

What do you mean? What is PewDiePie, and why does that reference matter?

Kevin Roose

So PewDiePie is this really popular YouTube personality. He has the most subscribers of anyone on YouTube. Some people think he’s offensive. Some people really like him. He’s got this whole fan base. And a few months ago, his fans started sort of spamming this phrase, “subscribe to PewDiePie,” in an attempt to kind of keep him from being eclipsed by another account that was going to have more followers than him.

Michael Barbaro

O.K.

Kevin Roose

So it sort of became this competition, and then it became this joke. And now “subscribe to PewDiePie” is just kind of like a thing that people say on certain parts of the internet. It’s just kind of like a signifier, like, I understand the internet, you understand the internet. This is how we’re going to signal to each other that we understand the internet.

Michael Barbaro

O.K. And this is what he’s signaling in saying that?

Kevin Roose

Yeah. So I have that in my head. And then I see all these other signs that something is weirdly kind of internet-y about all of this. Like, there’s this post on 8chan, which is kind of like a scummy message board that lots of extremists and weirdos go on. And in the post, the gunman links to the Facebook stream before it happens.

Michael Barbaro

The Facebook stream that he will record of the massacre itself.

Kevin Roose

Exactly. And then he pastes a bunch of links to copies of his manifesto. He has a 74-page manifesto that he wrote. And some of this stuff was fairly standard hard-right ideology, very fascist, very white nationalist. Muslims are kind of like the primary target for white nationalists around the world, calling them invaders, saying they’re taking over. You know, this is a sort of classic white nationalist trope. And then there was all this kind of meta-humor, saying that he was radicalized by video games, which is another thing that internet extremists love to sort of troll the media with. Like, you know, he posted previews of his gun on Twitter. The whole thing just kind of felt like it just set this shooting up as, like, almost an internet performance, like it was native to the internet, and it was born out of and aimed into this culture of extremely concentrated internet radicalism.

Michael Barbaro

But underneath it all is white nationalism, white supremacy, whatever you want to call it, a kind of racism that has always existed. So why does the internet’s role in this feel especially different to you?

Kevin Roose

I want to make clear that this is not just a tech story, right. There’s a real core of anti-Muslim violence here, Islamophobia, far-right ideology. That’s all very, very important, and we should focus there. But I think there’s this other piece that we really need to start grappling with as a society, which is that there’s an entire generation of people who have been exposed to radical extremist politics online, who have been fed a steady diet of this stuff. It’s transformed by the tools that the internet provides. So I’ve talked to a lot of white nationalists, unfortunately. And when I asked them how they got into this, a lot of them will say, I found a couple of videos on YouTube. And then I found some more videos on YouTube, and it kind of started opening my eyes to this ideology. And pretty soon, you’re a white nationalist. And that’s different from, historically, how extremism has been born. I mean —

Michael Barbaro

What do you mean?

Kevin Roose

You know, if you go to the library and you take out a book about World War II, right as you’re about to finish it, the librarian doesn’t say, here, here’s a copy of “Mein Kampf.” You might like this. There’s not this kind of algorithmic nudge toward the extremes that really exists on social media and has a demonstrated effect on people.

Michael Barbaro

Walk me through this algorithmic nudge. I want to make sure I understand what you’re referring to.

Kevin Roose

This is pretty specific to YouTube, but that’s where a lot of this stuff happens. So on YouTube there’s this recommendations bar. And after a video plays, another one follows it. And historically, the way that this algorithm that chose which video came next worked is it would try to keep you on the site for as long as possible, try to maximize the number of videos you watch, the amount of time you spent, which would maximize the ad revenue. Right. It would maximize lots of things. And so it turned out that what kept people on the site for longer and longer periods of time was gradually moving them toward more extreme content. You start at a video about spaceships, and you’d end on something that was questioning whether the moon landing was a hoax. Or you’d start at a video about some piece of U.S. history and, you know, five videos later you’re at kind of a 9/11 conspiracy theory video. Just these kind of gradual tugs toward the stuff that the algorithm decides is going to keep you hooked. And in a lot of cases, that means making it a little more extreme.

Michael Barbaro

And what’s the white nationalist version of this nudge?

Kevin Roose

There’s a ton of white nationalism on YouTube. YouTube, from the conversations I’ve had with people in this movement, is sort of central to how these ideas spread. Like, you start watching some videos about politics. Maybe they’re about Trump. Then you start watching some videos by sort of more fringey, kind of far-right characters, and all of a sudden you are watching someone’s video who is espousing open white nationalism. And you’re not exactly sure how you got there, but you keep watching. And for some percentage of people, you internalize that.

Michael Barbaro

So it’s a kind of computer-driven on-ramp or onboarding.

Kevin Roose

Yeah. And this has been studied. Like, this is a well-documented phenomenon, and YouTube has done some things to try to fix the algorithm and make it so that it’s not sending you down these rabbit holes. But it’s still a pretty observable effect.

Michael Barbaro

And what’s your understanding of why these platforms didn’t act years ago to police and delete these hate-filled videos — this content that, through the algorithmic nudges you described, directs people further and further toward extremism?

Kevin Roose

They had no reason to. I mean, they were making a lot of money. They saw their responsibility as providing a platform for free speech. They were very hesitant to kind of seem like they were censoring certain political views. They were committed to free speech. And I think that’s kind of the original sin that’s baked into all of this. It’s like, part of how this was born is this idea that we just provide the platform, and if people signal to us that they like something, we’ll show them more of it. And maybe we’ll show them something that pushes the envelope a little bit more. And we’re not optimizing for truth. We’re not optimizing for things that we think are healthy for people. We’re just giving them what they want. And they’re trying to change that now, some of them. There’s a reckoning now where these platforms have come to understand that this is the role that they’ve played and that they’re trying to correct it. But there’s a lot of people who have already been sucked up into this world, who have been radicalized, and who may not be coming back. It’s going to be very, very tricky to slow the thing that has been set into motion. And I don’t even know if it’s possible.

Michael Barbaro

At this point.

Kevin Roose

Yeah. These platforms played a pivotal role, have played, are playing a pivotal role in how these extremist groups gather momentum and share their ideas and coalesce into real movements and grow. And, like, that’s the part that I don’t think they’ve completely reckoned with, and I don’t think we’ve completely reckoned with. I think we’re still sort of coming to terms with the fact that there’s this pipeline for extremism. And we know how it runs. We know where it happens. We know who’s involved. And we know that sometimes it has these devastating, tragic consequences. What I’ve been thinking is just how inevitable this feels.

Michael Barbaro

What do you mean?

Kevin Roose

I’ve been watching these people in these kind of dark corners of the internet multiplying and hardening and becoming more extreme, and, like, it was inevitable. This is the nightmare. Right? This is the worst possible version of something that could happen and be broadcast on the internet. And it’s not getting better. And it’s going to be with us for a long time.

Michael Barbaro

But it also strikes me that in a way, and in a pretty awful way, this gunman and the way he has approached this massacre is kind of reflecting back how the internet functions. Because I’m thinking about him making a video of this attack, which, in a sense, means he’s making content that feeds that loop that we’re discussing, perhaps feeds this algorithm that possibly fed him — that he’s basically putting something back into the system.

Kevin Roose

Yeah. And I saw this happening on these platforms, like, in real time. So —

Michael Barbaro

You mean on Friday?

Kevin Roose

Yeah. So if you went onto 8chan, which is the website where all the stuff was posted, the comments below this post were all about, let’s save these videos so that we can re-upload them somewhere else —

Michael Barbaro

Wow.

Kevin Roose

— in case 8chan gets taken down. Let’s spread this. Let’s seed this all over the internet. I mean, there’s no doubt in my mind that this guy was very aware of how his video and his manifesto would kind of filter through the internet and get refracted and picked up and analyzed. This was a very deliberate act, not only of murder and violence, but also of media creation. I mean, this was, in a way, engineered for internet virality.

Michael Barbaro

And then it did go viral.

Kevin Roose

Yes. Twitter and Facebook and YouTube, all the platforms, tried to take down the video as soon as it popped up, but it just kept popping back up. It’s very hard to contain. So it’s still out there. I mean, yeah, I’m looking at, right now, something posted, you know, six hours ago. It’s the video of the shooting, and it’s still up. And I don’t think it’ll ever fully disappear.

Michael Barbaro

Kevin, thank you very much.

Kevin Roose

Thank you.

Even the language used to describe the attack before the fact framed it as an act of internet activism. In a post on 8chan, the shooting was referred to as a “real life effort post.” An image was titled “screw your optics,” a reference to a line posted by the man accused in the Pittsburgh synagogue shooting that later became a kind of catchphrase among neo-Nazis. And the manifesto — a wordy mixture of white nationalist boilerplate, fascist declarations and references to obscure internet jokes — seems to have been written from the bottom of an algorithmic rabbit hole.

It would be unfair to blame the internet for this. Motives are complex, lives are complicated, and we don’t yet know all the details about the shooting. Anti-Muslim violence is not an online phenomenon, and white nationalist hatred long predates 4chan and Reddit.

But we do know that the design of internet platforms can create and reinforce extremist beliefs. Their recommendation algorithms often steer users toward edgier content, a loop that results in more time spent on the app, and more advertising revenue for the company. Their hate speech policies are weakly enforced. And their practices for removing graphic videos — like the ones that circulated on social media for hours after the Christchurch shooting, despite the companies’ attempts to remove them — are inconsistent at best.

We also know that many recent acts of offline violence bear the internet’s imprint. Robert Bowers, the man charged with killing 11 people and wounding six others at the Tree of Life synagogue in Pittsburgh, was a frequent user of Gab, a social media platform beloved by extremists. Cesar Sayoc, the man charged with sending explosives to prominent critics of President Trump last year, was immersed in a cesspool of right-wing Facebook and Twitter memes.

People used to conceive of “online extremism” as distinct from the extremism that took form in the physical world. If anything, the racism and bigotry on internet message boards felt a little less dangerous than the prospect of Ku Klux Klan marches or skinhead rallies.

Now, online extremism is just regular extremism on steroids. There is no offline equivalent of the experience of being algorithmically nudged toward a more strident version of your existing beliefs, or having an invisible hand steer you from gaming videos to neo-Nazism. The internet is now the place where the seeds of extremism are planted and watered, where platform incentives guide creators toward the ideological poles, and where people with hateful and violent beliefs can find and feed off one another.

So the pattern continues. People become fluent in the culture of online extremism, they make and consume edgy memes, they cluster and harden. And once in a while, one of them erupts.

In the coming days, we should attempt to find meaning in the lives of the victims of the Christchurch attack, and not glorify the attention-grabbing tactics of the gunman. We should also address the specific horror of anti-Muslim violence.

At the same time, we need to understand and address the poisonous pipeline of extremism that has emerged over the past several years, whose ultimate effects are impossible to quantify but clearly far too big to ignore. It’s not going away, and it’s not particularly getting better. We will feel it for years to come.
