Livestreams of Mass Shootings: From Buffalo to New Zealand

Many sites tried to take down the video as copies were uploaded, but they were overwhelmed. Facebook said it removed 1.5 million videos in the 24 hours following the attack, yet many copies still evaded detection. On Reddit, a post featuring the video was viewed more than one million times before it was removed. The video spread faster than in any previous incident of its kind, according to a New Zealand government report.

Over the next few days, some people began discussing ways to evade the platforms’ automated systems and keep the Christchurch video online. In discussions on Telegram on March 16, 2019, seen by The Times, members of a group associated with white supremacy swapped techniques for altering the video so that it would not be detected and removed.

“Just change the opening,” one user wrote. “Speed it up 2x and they can’t [expletive] find it.”

Within days, clips of the shooting were posted on 4chan, a fringe online message board. In July 2019, a 24-second clip of the killings appeared on Rumble, according to a review by The Times.

In the months that followed, the New Zealand government identified more than 800 variations of the original video. Officials asked Facebook, Twitter, Reddit and other sites to dedicate more resources to removing them, according to a government report.

New copies of the video, or links to it, surfaced online whenever the Christchurch shooting was in the news or on anniversaries of the attack. In March 2020, nearly a year after the shooting, about a dozen tweets on Twitter linked to various copies of the video. More copies surfaced in August 2020, when the gunman was sentenced to life in prison.

Other groups also pressed tech companies to delete the video. Tech Against Terrorism, a United Nations-supported initiative that develops technology to detect extremist content, sent 59 alerts about Christchurch content to tech companies and file-hosting services between December 2020 and November 2021, said Adam Headley, the group’s founder and director. That represented about 51 percent of the far-right terrorist material the group was trying to get removed online, he said.
