BBC finds Facebook failed to remove child porn. Facebook reports BBC to authorities

Authored by wired.co.uk and submitted by Wonderful_Mood3204

A BBC investigation has revealed that 80 per cent of child abuse images it reported to Facebook were not removed. Facebook responded to the allegations by requesting the BBC send examples of the material to it, then reporting the team to the authorities for sending them.

Facebook is regularly in the headlines for its inconsistent moderation practices - removing photos of breastfeeding mothers, for instance, while allowing violent footage of beheadings. In this case, however, the BBC is suggesting the question of whether the content belongs on the site is black and white.

The BBC described the content as “images from groups where men were discussing swapping what appeared to be child abuse material”, including: “pages explicitly for men with a sexual interest in children; images of under-16s in highly sexualised poses, with obscene comments posted beside them; groups with names such as ‘hot xxxx schoolgirls’ containing stolen images of real children; an image that appeared to be a still from a video of child abuse, with a request below it to share ‘child pornography’.” It also claimed to have identified five convicted paedophiles with profiles on the platform, none of whom were removed after the BBC reported them.

The BBC specifically launched its investigation to test Facebook’s moderation procedures, after its 2016 investigation found the network was being used by groups of paedophiles to meet and exchange content. Of the 100 images reported by the BBC in its current investigation, using Facebook’s standard reporting tools, 18 were removed and the rest were found not to breach its terms.


Simon Milner, Facebook’s UK policy director, told WIRED in a statement: “We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards. This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures. Facebook has been recognised as one of the best platforms on the internet for child safety.


“It is against the law for anyone to distribute images of child exploitation. When the BBC sent us such images we followed our industry’s standard practice and reported them to CEOP. We also reported the child exploitation images that had been shared on our own platform. This matter is now in the hands of the authorities.”

Although the statement appears to suggest all 100 images have been removed, this is not, in fact, what it says. Milner says all items deemed illegal or against its standards have now been removed, but fails to clarify whether Facebook considered all 100 as falling into this category, just the 18 originally removed, or a figure somewhere in between. WIRED has sent two requests to Facebook for clarification on this matter but has not heard back. We have also asked the BBC if it can confirm whether all 100 images have been removed. WIRED has not seen the material; however, if Facebook has in fact removed all 100 pieces of content originally reported to it by the BBC team, it suggests the material was illegal or in breach of its terms, and calls into question the quality of its moderation procedures.

Speaking to the BBC on this point, the Children's Commissioner for England, Anne Longfield, said: “The moderation clearly isn't being effective, I would question whether humans are moderating this, are looking at this, and also I think it is failing to take account of the context of these images.”


"We are never complacent, the reality is that new images and videos of children being sexually abused appear every day" Susie Hargreaves, Internet Watch Foundation

Facebook has been a partner since 2009 of the Internet Watch Foundation, which uses hashing technology to scour sites for known child abuse content so it can be removed and reported. WIRED contacted the group to ask if it would be investigating the alleged failure of Facebook’s moderating procedures, but the IWF says it cannot comment on images it has not seen. “Social networks are one of the least likely places we find child sexual abuse content,” a spokesperson did say. “Last year we just found 1 per cent of this imagery on social networking sites globally.”
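The article does not spell out how this hashing approach works, but the general idea is to compare a fingerprint of each uploaded file against a list of fingerprints of previously identified abuse images. The sketch below is illustrative only: it assumes a plain cryptographic hash and a hypothetical hash list, whereas production systems such as Microsoft's PhotoDNA use perceptual hashes that tolerate resizing and re-encoding. None of the names here correspond to any real IWF or Facebook API.

```python
# Minimal, illustrative sketch of hash-list matching (not the IWF's or
# Facebook's actual system). A plain SHA-256 only catches byte-identical
# copies; real deployments use perceptual hashes for that reason.
import hashlib

# Hypothetical set of hashes of known illegal images, supplied by a hotline.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: str) -> str:
    """Hash an uploaded file in chunks so large files don't fill memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block(path: str) -> bool:
    """True if the upload matches a known-bad hash and should be blocked
    and reported rather than published."""
    return sha256_of_file(path) in KNOWN_BAD_HASHES
```

The limitation of any such list is that it can only catch material that has already been identified and fingerprinted; newly shared images, of the kind the BBC reported, still depend on user reports and human review.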

IWF CEO Susie Hargreaves later told WIRED: “We are never complacent, the reality is that new images and videos of children being sexually abused appear every day and this is why industry, government, law enforcement and the IWF all work in partnership to do whatever we can to remove these images and give these children back some of the childhood that has been stolen from them.”

The BBC reports that it had been chasing Facebook for an interview about the social network's moderation system since 2015, and was promised an interview with Milner on the proviso that it send examples of the material Facebook had failed to remove. The examples were sent, the interview was then cancelled, and the BBC was reported to the National Crime Agency.


The BBC's director of editorial policy, David Jordan, said publicly: "The fact Facebook sent images that had been sent to them, that appear on their site, for their response about how Facebook deals with inappropriate images... the fact that they sent those on to the police seemed to me to be extraordinary. One can only assume that the Facebook executives were unwilling or certainly reluctant to engage in an interview or a debate about why these images are available on the Facebook site."

Chairman of the Commons media committee, Damian Collins, told the BBC: "I think it raises the question of how can users make effective complaints to Facebook about content that is disturbing, shouldn't be on the site, and have confidence that that will be acted upon."

According to recent statistics, around 300 million images are uploaded to Facebook every day. There is no possibility of human moderators presiding over every item; the social network instead relies on algorithms to categorise much of its reported content. This has repeatedly been shown not to be 100 per cent effective, whether through a photo of a mother breastfeeding her premature daughter being banned, or users being blocked for sharing an infamous 1970s war photo depicting a naked young girl in Vietnam. In the latter case, after the photo was repeatedly removed, Norway’s prime minister posted the same image to her Facebook page in protest at the censorship, and the editor of Norway's biggest newspaper accused Mark Zuckerberg of “abusing” his role as “the world’s most powerful editor”.

Facebook spent years amending its network to capitalise on being a top referral source for news publishers, working to keep that traffic on the site and transforming itself from a top exit portal for news to a top destination for it. When it has come under attack for the content on its site - most recently, in the case of fake news - it has always maintained it is a distributor and aggregator, not a publisher, and therefore not responsible for what is shared. However, it has also been pushed to take steps to weed out fake news, and in the past has taken a stance that could be described as editorial.

For instance, when attacked for allowing a video of a beheading to remain on its site in 2013, Facebook defended the decision on the grounds that the network is a place to share "experiences, particularly when they're connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events". As such, "people share videos of these events on Facebook to condemn them. If they were being celebrated, or the actions in them encouraged, our approach would be different." Like a newspaper, it made a conscious decision to allow controversial material to remain on the site for the sake of debate and freedom of information. Unlike a newspaper, it said it would remove the content if it did not approve of the ensuing debate.

Like all social networks, Facebook is impossible to categorise as one thing. It is not a pure aggregator, because it pays for content and its algorithm (built by human editors) prioritises specific items over others - just as a newspaper edits its homepage and urges readers to pay attention to specific stories. Nor, of course, is it a publisher in the traditional sense.

WIRED will update this article if we hear back from the BBC or Facebook.

ThinkOfANameHere on October 28th, 2020 at 07:25 UTC »

Oh it gets better. Facebook specifically requested these examples be sent if BBC wanted an interview with big execs. They did this to get out of that interview. Jfc.

AdvancedAdvance on October 28th, 2020 at 02:07 UTC »

To which Zuckerberg said, “Looks like I had the last jocular reaction typical of us humans.”

AvalieV on October 28th, 2020 at 01:38 UTC »

Yeppp. Facebook questioned the existence of the [underage sexual] content, so they requested examples from the BBC, and when the BBC sent reported photos that were not removed to Facebook as proof, Facebook reported them for sending the material, on their own site. Ridiculous. Got rid of Facebook in January, haven't missed it once.

Edit: words