Facebook blocks and bans users for sharing Guardian article showing Aboriginal men in chains

Authored by theguardian.com and submitted by wambatu

Social media site incorrectly removed historical photo on grounds of nudity, then for three days blocked and even banned users who posted a link to the article

Facebook has blocked and in some cases banned users who tried to share a Guardian article about the site incorrectly blocking an image of Aboriginal men in chains.

On Saturday, Guardian Australia reported that Facebook had apologised for incorrectly preventing an Australian user from sharing the photo from the 1890s.

The post was made in the context of the Australian prime minister, Scott Morrison, claiming there was no slavery in Australia, before he backed down on those comments a day later.


The post was removed by Facebook and the man had his account restricted, with Facebook claiming the photo contained nudity and was in breach of the social media site’s community standards.

The post was restored after Guardian Australia asked Facebook whether the photo had been flagged in error, and Facebook apologised to the user late on Friday.

A spokeswoman for Facebook said the photo was removed by the automated system in error.

“We apologise for this mistake,” she said.

However, dozens of Guardian readers have since reported that when they tried to post a link to the article on their profiles they received a message saying the post violated the same community standards.

The man who first posted the image to his profile was among those unable to share the news article.

Several readers were even banned from posting on Facebook for up to 30 days for attempting to share the article.

Hacklock - The Console Cowboy of Cyberspace (@hacklocked) I just tried to post this article, it immediately blocked the post, gave me no chance to appeal or dispute, case closed and put my account on restricted use for 24hours. If you look at my “history of not following standards.” It’s just this incident shown twice https://t.co/CzweUNCwL6 pic.twitter.com/lGmQzSBAGd

After further inquiries from Guardian Australia, Facebook appeared to allow the article to be posted on Monday afternoon.

Facebook typically allows users to request a review of any takedowns. Between January and March it restored 613,000 pieces of content after 2.1m requests for review.

However, users reported that they were told the ban on the historical image might not be reviewed because “we have fewer reviewers at the moment because of the coronavirus (Covid-19) outbreak [and] we are trying to prioritise reviewing content with the most potential for harm”.


Several people pointed out the disparity between Facebook quickly (and incorrectly) censoring content it believed to contain nudity and its reluctance to take action against Donald Trump’s inflammatory posts.

The journalist and activist Cory Doctorow said on Twitter that the incident showed Facebook could not moderate at scale, and that the fallout from automated moderation was not evenly distributed, with some minority groups more likely to have their discussions censored.

Cory Doctorow #BLM (@doctorow) And the fallout from this overblocking is not evenly distributed. Not only are some disfavored minorities (sex workers, queer people, people of color) more likely to have their discussions censored.

Doctorow argued Facebook should not be given additional duties in censoring content, as some have called for in recent weeks over Trump’s posts on Twitter and Facebook. Instead, he said, Facebook should be cut down to a “size, to a scale where communities can set and enforce norms”.

“Because the problem with [Facebook] isn’t merely that [CEO] Mark Zuckerberg is uniquely unsuited to making decisions about the social lives and political discourse of 2.6 billion people … it’s that NO ONE is capable of doing that job. That job should not exist.”

Richie4422 on June 15th, 2020 at 10:33 UTC »

That's just Facebook "moderation". If you have any experience with publishers or you know any social media managers working with publishers, they will all tell you how FB's moderation is fucked up. Especially when it comes to "nudity".

Personal experience: a few months ago I shared an article about the movie The Hunt. The thumbnail was a gagged Emma Roberts, wearing a shirt and a vest.

Oops, flagged for nudity. Never mind, gonna dispute it. Probably an accident.

25 minutes later: a human mod decided to uphold the claim of depicted nudity.

This is a recurring experience. It's not Facebook being malicious, it's just Facebook being completely shit.

TipTop9903 on June 15th, 2020 at 09:58 UTC »

Cory Doctorow's point in the article is incredibly important. Facebook, and presumably all big social media, simply cannot handle the moderation that having so many users requires. No company should have this much power.

"The problem with [Facebook] isn’t merely that [CEO] Mark Zuckerberg is uniquely unsuited to making decisions about the social lives and political discourse of 2.6 billion people … it’s that NO ONE is capable of doing that job. That job should not exist.”

Centralredditfan on June 15th, 2020 at 08:49 UTC »

Why do Americans, and by extension advertisers have such a problem with nudity?