"Facebook cannot be trusted. They are morally bankrupt pathological liars who enable genocide (Myanmar), facilitate foreign undermining of democratic institutions," NZ Privacy Commissioner John Edwards posted to Twitter last night, in his most pointed attack on the social network yet.
"[They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video, allow advertisers to target 'Jew haters' and other hateful market segments, and refuse to accept any responsibility for any content or harm. They #DontGiveAZuck," Edwards said in a follow-up tweet.
In his first post-Christchurch shootings interview on Friday NZT, Facebook chief executive Mark Zuckerberg poured cold water on even a slight delay for Facebook Live, saying it would "break" the service, which is often used for two-way communication on birthdays and other occasions (the Herald pointed out that video chat confined to a set group of people covers such events fine, no public broadcast required).
In an interview with RNZ this morning, Edwards said this "greater good" argument was "disingenuous" because "he [Zuckerberg] can't tell us - or won't tell us - how many suicides are livestreamed, how many murders, how many sexual assaults.
"I've asked Facebook exactly that last week and they simply don't have those figures or won't give them to us."
Edwards also asked Facebook to hand over names of people who shared the alleged gunman's video to NZ Police. Facebook refused. The clip has been banned by NZ's Chief Censor, making it illegal to view or share at any point since it was released.
"The legal protection they have - the reason they have been able to launch an unsafe product and escape any liability is the Communications Decency Act in the US which says if you are a platform, a carrier, you have no liability for the content, but I think what we're seeing around the world is a push-back on that," Edwards said.
In May last year, Facebook quietly changed its terms of service to move its NZ users from being under Irish privacy law (which was about to fall under tough new EU privacy regulations) to lighter US privacy law. Edwards says its NZ operation should fall under NZ law.
"I think it would be very difficult for NZ to act [alone]," Edwards said.
"This is a global problem. The events that were livestreamed in Christchurch could happen anywhere in the world. Governments need to come together and force the platforms to find a solution.
"It may be that regulating - as Australia has done just in the last week - could be a good interim way to get their attention."
Australia's tough new law threatens social media companies with fines of up to 10 per cent of their revenue, and their executives with up to three years' jail, if they fail to remove "abhorrent violent material" expeditiously.
Asked for reaction to Edwards' comments last night and this morning, a Facebook Australia-NZ spokesman referred the Herald to a transcript of Zuckerberg's ABC News interview, and COO Sheryl Sandberg's March 30 open letter in which she detailed efforts to stamp out copies of the gunman's video, and a clamp-down on hate content.
Facebook founder Mark Zuckerberg during his Friday NZT interview with ABC News' George Stephanopoulos. Photo / Getty.
The process is ongoing. On Friday NZT, New York-based researcher Eric Feinberg told the Herald he had found seven copies of the alleged gunman's clip on Facebook and five on the Facebook-owned Instagram.
A Facebook spokesman acknowledged the copies, but said they had been deleted the same day. Feinberg told the Herald he found more copies on Saturday NZT and this morning NZT.
Facebook's most recent Community Standards Enforcement Report, covering October 2017 to September 2018, says, "We took action on a total of 15.4 million pieces of content between July and September 2018, 97% of which we proactively found and took action on. The report also includes measures on how many times violating content was seen on Facebook. We estimate that 0.23% to 0.27% of content views were of content that violated our standards for graphic violence between July and September 2018. In other words, of every 10,000 content views, an estimated 23 to 27 contained graphic violence."
On terrorist content, it says, "We removed 14.3 million pieces of terrorist content in the first three quarters of 2018. 99.5% of this content we surfaced and removed ourselves, before any user had flagged it to us."