One Year After Gunman Livestreamed New Zealand Massacre, Videos of the Attack Still Appear on Facebook
Facebook Promised to Remove the Videos from its Platform
FOR IMMEDIATE RELEASE: March 12, 2020
Contact: Bryan Dewan, firstname.lastname@example.org, 202.780.5750
WASHINGTON, D.C. – Today, Campaign for Accountability (CfA), a nonprofit watchdog group that runs the Tech Transparency Project (TTP), released a report identifying eight videos of the gruesome Christchurch, New Zealand, massacre that are still on Facebook despite the company’s promises to remove them.
CfA Executive Director Daniel Stevens said, “Facebook pledged to remove these videos and rid its platform of this horrific content. But a year later, they’re still posted and easily available. Facebook’s promises are about as reliable as a lot of news posted on the platform.”
On March 15, 2019, a heavily armed gunman used Facebook to livestream a mass shooting in Christchurch, New Zealand, that left 51 people dead and sparked an international outcry over violence on social media. Facebook removed the livestream after it ended, but not before it was copied many times. Other users subsequently uploaded the video to Facebook and other platforms, including YouTube, Twitter, and Reddit.
Three days after the mass murder, Facebook said it had removed 1.5 million copies of the video in the first 24 hours after the massacre. The company later said it had removed 4.5 million pieces of content related to the attack. Furthermore, Facebook said it “hashed” the video in order to speed up its efforts to automatically detect and delete the video.
Nevertheless, a year after the Christchurch shooting, TTP was able to identify eight videos of the massacre still available on Facebook. All of the videos are clips of the livestream, and some were uploaded by accounts posting in languages other than English, including Polish, Arabic, and Turkish. Two of the videos are preceded by an advisory warning, but Facebook allows users to view them anyway. Many of the videos appear to be manipulated or recorded from a screen, which experts have said is a way for people to evade existing content moderation algorithms.
Mr. Stevens continued, “Time and time again, Facebook promises to fix its platform, yet the very worst content humanity can produce is widely available on its site. The only way for Facebook to actually prevent videos like this from spreading on its platform is to change its business model, which the company is not willing to do. It’s time for regulators to confront the reality that Facebook cannot control its own platform.”
Campaign for Accountability is a nonpartisan, nonprofit watchdog organization that uses research, litigation, and aggressive communications to expose misconduct and malfeasance in public life and hold those who act at the expense of the public good accountable for their actions.