Facebook Inc said it removed 1.5 million videos of the New Zealand mosque attack globally in the first 24 hours after the shooting.
“In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload…,” Facebook said in a tweet bit.ly/2HDJtPM late Saturday.
The company said it is also removing all edited versions of the video, even those that do not show graphic content, out of respect for the people affected by the mosque shooting and the concerns of local authorities.
The death toll in the New Zealand mosque shootings rose to 50 on Sunday. The gunman who attacked two mosques on Friday live-streamed the attacks on Facebook for 17 minutes using an app designed for extreme sports enthusiasts, with copies still being shared on social media hours later.
New Zealand Prime Minister Jacinda Ardern has said she wants to discuss live streaming with Facebook.
The suspected gunman broadcast live footage on Facebook of the attack on one mosque in the city of Christchurch, mirroring the carnage played out in video games, after publishing a “manifesto” in which he denounced immigrants.
The video footage, posted online live as the attack unfolded, appeared to show him driving to one mosque, entering it and shooting randomly at people inside.