Horrific video of the mosque massacre in Christchurch was viewed live fewer than 200 times on Facebook, but that was enough to unleash it across the internet.

Now New Zealand, other governments and business leaders are calling for Facebook, Google and Twitter to do much more to rid their platforms of extremist content.

Vodafone and two other telecommunications operators, which provide internet access for most New Zealanders, said on Tuesday they want Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey and Google CEO Sundar Pichai to take part in an “urgent discussion” on how to keep harmful content off their platforms.

The three US tech companies have faced heavy criticism after they failed to identify and stop the spread of a video of Friday’s attack, in which 50 people were killed at two mosques.

The CEOs of Vodafone New Zealand, Spark and 2degrees said they had taken the unprecedented step of jointly identifying and suspending access to sites that were hosting video footage taken by the attacker. They called on authorities to require tech companies to take down terrorist-linked content within a specific period of time and fine them if they fail to do so.

“Although we recognize the speed with which social network companies sought to remove Friday’s video once they were made aware of it, this was still a response to material that was rapidly spreading globally and should never have been made available online,” they said in an open letter published on their company websites.

Germany introduced a law in 2018 that gives authorities the power to fine social media platforms if they fail to quickly remove hate speech. And the European Commission is considering rules that would require platforms to remove terror content within an hour of it being flagged by authorities or risk fines up to 4% of global revenue.

‘Publisher not postman’

More world leaders are demanding that the tech companies step up their game.

New Zealand Prime Minister Jacinda Ardern said Tuesday that her government will investigate the role social media played in the deadly attack.

“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published,” Ardern said in a speech to parliament.

“They are the publisher. Not just the postman,” she said. “There cannot be a case of all profit no responsibility.”

Australian Prime Minister Scott Morrison on Tuesday criticized the “continuing and unrestricted role” played by internet technology in the New Zealand shooting and other terrorist attacks. He laid out his concerns in an open letter to Japanese Prime Minister Shinzo Abe, who this year holds the presidency of the G20, an organization that brings together the world’s biggest economies.

“It is unacceptable to treat the internet as an ungoverned space,” Morrison said in the letter, calling for the issue to be discussed at the G20 summit in Osaka in June. Governments around the world need to ensure that tech firms filter out and remove content linked to terrorism, and are transparent about how they do so, he said.

Initial video viewed 4,000 times on Facebook

The tech companies are still scrambling to take down footage of the attack.

YouTube said Monday that it removed tens of thousands of videos and terminated hundreds of accounts “created to promote or glorify the shooter.”

The volume of related videos was “unprecedented both in scale and speed,” the Google-owned platform said in a statement.

YouTube said it took a number of steps including automatically rejecting footage of the violence and temporarily suspending the ability to filter searches by upload date.

Facebook said in a blog post Monday that the first user report of the violent livestream came 29 minutes after it had started, about 12 minutes after the live broadcast had ended. The video was viewed fewer than 200 times live, but it was viewed about 4,000 times before it was taken down.

That was enough for the graphic video to be copied and then re-uploaded millions of times to multiple platforms, including Facebook.

Twitter declined to provide details on steps it is taking to remove the video from its platform.