Toronto, March 18:  After coming in for criticism over its delayed response to the Christchurch mosque attacks, Facebook said it had worked over the past 24 hours to remove some 1.5 million videos of the New Zealand mosque attack. "In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload...," Facebook said in a tweet late Saturday.

Facebook said it was able to block those uploads because its systems automatically recognised footage of the shooting; a further 300,000 clips were removed by moderators after they had gone live.

The social network said it was removing any clip featuring the gunman’s footage, even if that would not normally break its rules. “Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content,” said Mia Garlick of Facebook New Zealand.

Christchurch mosque shooter Brenton Tarrant live-streamed his attack on two mosques in New Zealand, shooting dead 50 worshippers who had gathered for Friday prayers. The live stream ran for 17 minutes and was broadcast using an app designed for extreme sports enthusiasts. As news of the attack spread, the video went viral across the globe and was still being shared a day later.

New Zealand’s Prime Minister Jacinda Ardern has said she will seek talks with Facebook on its efforts to stop circulation of the video and the way it manages its live-stream service. “This is an issue that goes well beyond New Zealand but that doesn’t mean we can’t play an active role in seeing it resolved,” said Ardern. “This is an issue I will look to be discussing directly with Facebook.”
