Facebook are 'morally bankrupt, pathological liars' - NZ Privacy Commissioner

Authored by nzherald.co.nz and submitted by kidneydamage

"Facebook cannot be trusted. They are morally bankrupt pathological liars who enable genocide (Myanmar), facilitate foreign undermining of democratic institutions," NZ Privacy Commissioner John Edwards posted to Twitter last night, in his most pointed attack on the social network yet.

"[They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video, allow advertisers to target 'Jew haters' and other hateful market segments, and refuse to accept any responsibility for any content or harm. They #DontGiveAZuck," Edwards said in a follow-up tweet.

NZ Privacy Commissioner John Edwards. Photo / File.

In his first interview since the Christchurch shootings, given on Friday NZT, Facebook chief executive Mark Zuckerberg poured cold water on even a slight delay for Facebook Live, saying it would "break" the service, which is often used for two-way communication on birthdays and other occasions (the Herald pointed out that video chat confined to a set group of people covers such events fine, with no public broadcast required).

In an interview with RNZ this morning, Edwards said this "greater good" argument was "disingenuous" because "he [Zuckerberg] can't tell us - or won't tell us - how many suicides are livestreamed, how many murders, how many sexual assaults.

"I've asked Facebook exactly that last week and they simply don't have those figures or won't give them to us."

Challenged about his comments on social media, Edwards said media had asked him to comment and he had done so. "I've got no personal agenda and I'm not grandstanding," he said. A spokesman for his office said there was no further comment on the tone of the Commissioner's remarks.

Edwards also asked Facebook to hand over to NZ Police the names of people who shared the alleged gunman's video. Facebook refused. The clip has been banned by NZ's Chief Censor, making it illegal to view or share at any point since its release.

"The legal protection they have - the reason they have been able to launch an unsafe product and escape any liability is the Communications Decency Act in the US which says if you are a platform, a carrier, you have no liability for the content, but I think what we're seeing around the world is a push-back on that," Edwards said.

In May last year, Facebook quietly changed its terms of service to move its NZ users from Irish privacy law (which was about to fall under tough new EU privacy regulations) to lighter US privacy law. Edwards says its NZ operation should fall under NZ law.

"I think it would be very difficult for NZ to act [alone]," Edwards said.

"This is a global problem. The events that were livestreamed in Christchurch could happen anywhere in the world. Governments need to come together and force the platforms to find a solution

"It may be that regulating - as Australia has done just in the last week - could be a good interim way to get their attention."

Australia's tough new law threatens social media companies with fines of up to 10 per cent of their revenue, and their executives with up to three years' jail, if they fail to remove "abhorrent violent material expeditiously."

UK lawmakers are poised to follow suit.

In New Zealand, the focus has so far been on gun control rather than social media regulation.

"[Facebook] actually didn't have any systems to check for the events in Christchurch," said Edwards, who relayed that he met with Facebook reps last week and they answered "no" to his question about whether any safeguards had been put in place that would prevent a repeat of the March 15 livestream.

"Maybe a delay on livestreaming would be a good thing as an interim measure until they can sort out their AI. It could be that they just need to turn it off altogether. It is a technology that's capable of causing great harm," the Privacy Commmissioner said.

Asked for reaction to Edwards' comments last night and this morning, a Facebook Australia-NZ spokesman referred the Herald to a transcript of Zuckerberg's ABC News interview and to COO Sheryl Sandberg's March 30 open letter, in which she detailed efforts to stamp out copies of the gunman's video and a clampdown on hate content.

Facebook founder Mark Zuckerberg during his Friday NZT interview with ABC News' George Stephanopoulos. Photo / Getty.

The process is ongoing. On Friday NZT, New York-based researcher Eric Feinberg told the Herald he had found seven copies of the alleged gunman's clip on Facebook and five on Facebook-owned Instagram.

A Facebook spokesman acknowledged the copies, but said they had been deleted the same day. Feinberg told the Herald he found more copies on Saturday NZT and this morning NZT.


Facebook's most recent Community Standards Enforcement Report, covering October 2017 to September 2018, says, "We took action on a total of 15.4 million pieces of content between July and September 2018; 97% of which we proactively found and took action on. The Report also includes measures on how many times violating content was seen on Facebook. We estimate that 0.23% to 0.27% of content views were of content that violated our standards for graphic violence between July and September 2018. In other words, of every 10,000 content views, an estimate of 23 to 27 contained graphic violence."

On terrorist content, it says, "We removed 14.3 million pieces of terrorist content in the first three quarters of 2018. 99.5% of this content we surfaced and removed ourselves, before any user had flagged it to us."

GWinterborn on April 8th, 2019 at 02:51 UTC »

Once again, for people who don’t read the articles [Edit for people who feel I’m misleading: My interpretation of the intention of the individuals discussed in the article]:

Facebook’s being asked to abolish the Live Streaming feature, because some people have used it to showcase bad things.

In essence, they’re being asked to abolish a media platform on which ordinary people can stream content live, wholesale. [While options are suggested, the strongest language leans toward removal of the service, or altering the service to the point it’s not exactly live-streaming]. Their refusal to do that is what got the commentary featured in the headline.

Edit:

For clarity.

Several things in the article suggest that the NZ (a country I have no strong opinions about) politicians (not being used derogatorily) would rather the service just be removed, even going so far in some cases as to call into question its usefulness to the public.

While the people involved might have many reasons to dislike Mark Zuckerberg (a person I don’t care about in any profound way) or Facebook (a company I don’t care about in any profound way), the comment in the title was contextually and subtextually framed in such a way as to suggest it was made regarding Facebook’s lack of movement regarding the existence of a live-streaming service.

Final Edit:

I made this comment in response to the 80-90% of the comments in this thread, including highly voted comments, suggesting a belief that the article is about the sharing of private user information. That isn't a subject of this article, which seems to suggest those people, and likely those upvoting them, never looked at the article.

If you’ve got issues with my interpretation, that’s cool. We disagree. I’m not accusing people who disagree with that interpretation of not having read the article.

We can all agree that the article isn’t about the sharing of private information.

1ngebot on April 8th, 2019 at 01:24 UTC »

Not to defend Facebook for the other shitty things it does, but in the NZ shooting case, there was nothing they could do. For the hour-plus the original video was up, thousands of people watched it, but not one reported it to Facebook. Eventually someone did, over an hour later. It was quickly taken down after it was reported. I assume many reporters and others didn't report this, or deliberately misconstrued what happened, in order to blame the big corporation Facebook instead of having to deal with the fact that there are many, many shitty random people on the internet, and there is no way to condemn them or even know who they are.

Edit: a word

Further edit: please continue down the thread for link and clarification of details.

Further further edit: thanks for the gold kind stranger!

Douglasracer on April 7th, 2019 at 23:36 UTC »

Yep. Pretty much hit the nail on the head.