Graphic videos showing children being abused remain on Facebook despite numerous requests to have them removed, an undercover film has suggested.
Moderators also fail to remove posts that violate hate-speech rules and routinely ignore users who may be under-age.
The allegations are made in a Channel 4 Dispatches documentary.
Facebook said mistakes had been made and the staff involved had been “retrained”.
“We haven’t seen the footage but have seen the transcripts. And there is quite a lot of it that is against our policies and we are investigating,” a spokeswoman told the BBC.
“We have retrained the trainers at the company involved,” she added, saying that all trainers in outsourced moderation centres around the world would be similarly retrained. She declined to say how many such centres Facebook used.
In the film, to be broadcast later on Tuesday, an undercover reporter goes to work as a content moderator in Facebook’s largest centre, in Dublin.
The work there is outsourced to a company called CPL Resources, which has worked with Facebook since 2010.
During his training, the reporter is shown a video of a man punching and stamping on a toddler.
The video has been online since 2012 and is used as an example of something that would be left up on the site and marked “disturbing content”.
One moderator tells the reporter: “If you start censoring too much, then people lose interest in the platform.
“It’s all about making money at the end of the day.”
Nicci Astin, a campaigner against child abuse, told the BBC’s Today programme that she had asked Facebook to remove this particular video in 2012 but had been told it did not violate its terms and conditions.
“There are lots of graphic videos of children being hurt on Facebook. And there is no need for them to be on there,” she said.
Of the particular video featured in the documentary, Ms Astin said it “was still online despite being reported many, many times”.
According to Facebook, the original video was removed but it had been re-edited and re-shared repeatedly since then.
‘Facebook entertainment’
The undercover reporter is also told not to delete a video showing two teenage girls fighting, despite the fact that both girls are clearly identifiable and the video has been shared more than a thousand times.
The mother of one of the girls involved told Dispatches that it should never have become “Facebook entertainment”.
“To wake up the next day and find out that literally the whole world is watching must have been horrifying,” she said.
“It was humiliating for her. It was devastating for her.”
Facebook’s vice-president of public policy, Richard Allan, said that if parents asked for such content to be taken down, it would be removed. But such videos were often posted by people wanting to make a point, who would argue that the social network “should not interfere with my ability to highlight a problem”.
Ms Astin said that it “shouldn’t be up to parents” to report such things, adding that if they saw such a video of their son or daughter on the platform it would be “heartbreaking”.
‘Crack cocaine’
The programme also features venture capitalist Roger McNamee, one of Facebook’s earliest investors and a mentor to chief executive Mark Zuckerberg, who told Dispatches that the company’s business model relied on extreme content.
“From Facebook’s point of view this is… the crack cocaine of their product,” he said.
“It’s the really extreme, really dangerous form of content that attracted the most highly engaged people on the platform.”
Mr Allan firmly denied this was the case.
“Shocking content does not make us more money – that’s just a misunderstanding of how the system works,” he said.
“People come to Facebook for a safe secure experience to share content with their family and friends.
“The vast majority of those two billion people would never dream of sharing content like that, to shock and offend people.
“And the vast majority of people don’t want to see it.”
Other revelations in the film include:
The undercover reporter is told that posts racially abusing immigrants are permitted
A post including a cartoon comment that describes drowning a girl if her first boyfriend is black is permitted, although Facebook later confirmed it violated its hate-speech standards
Pages belonging to jailed former English Defence League leader Tommy Robinson, who has more than 900,000 followers, are referred directly to Facebook for assessment. Facebook confirmed to the BBC this had happened but said it had been to provide “a second pair of eyes” for politically sensitive content
Far-right group Britain First’s pages are left up because they have a lot of followers, one moderator tells the undercover reporter
The reporter is told not to proactively take action regarding users who could be under-age, unless the user admits to being below the official joining age of 13
The film also revealed a significant backlog in dealing with content reported to Facebook as being in violation of its policies.
The company aims to assess all such content within 24 hours, but the film claims this regularly did not happen, with some content still waiting to be reviewed five days after being reported.
At one point, 15,000 reported posts were waiting to be dealt with, and staff said they could not keep up with the reports, which arrived at a rate of up to 7,000 a day.
Facebook confirmed to the BBC that there had been a backlog in April and May but said that had since been cleared.
The company plans to double the number of moderators it employs, from 10,000 to 20,000, this year.
Source: BBC