In the first 24 hours after the fatal mass shooting in New Zealand, Facebook says it removed 1.5 million uploaded videos of the attack and blocked another 1.2 million at the point of upload.
The company made the announcement in a tweet, following an earlier statement that it had been alerted by authorities and had removed the alleged shooter's Facebook and Instagram accounts. Facebook spokesperson Mia Garlick says the company is also working "to remove all edited versions of the video that do not show graphic content."
In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload …
– Facebook Newsroom (@fbnewsroom) March 17, 2019

We have reached out to Facebook for further comment and will update this post if we hear back.
The terrorist attack appears to have been designed to go viral: the alleged shooter released a manifesto referencing figures such as YouTuber Felix Kjellberg and Candace Owens, along with white supremacist conspiracy theories. A 17-minute video of the attack was also posted to Facebook, Instagram, Twitter, and YouTube, where it spread rapidly even as those companies worked to stop it.
The attack has prompted social media platforms to respond to such content: Facebook, Twitter, and YouTube have been working to remove clips of the video. Reddit banned the subreddit r/watchpeopledie, while Valve began removing tributes to the alleged shooter posted on user profiles.
Still, Facebook's deletion of more than a million copies (and edited versions) of the video speaks to the enormous challenge the company faces in moderating the site. In its pursuit of rapid growth, its efforts to strengthen its ability to monitor and remove offensive, illegal, or disruptive content have lagged behind, allowing bad actors to use the platform to spread their message quickly. There have been other high-profile instances of murders or terrorist attacks being streamed on the platform. And as Facebook has worked to address the problem, it has relied on third-party contractors to take down such content, some of whom have been radicalized or traumatized by the work itself.
In the wake of the attack, several world leaders have called out Facebook for its role in spreading this type of content. According to Reuters, New Zealand Prime Minister Jacinda Ardern said she wanted to talk to the company about live streaming, while British Labour leader Jeremy Corbyn said such platforms must act, raising the issue of regulation.