Facebook moderators ‘keep child abuse online’


Graphic videos showing victims of child abuse have remained on Facebook despite repeated requests for their removal, an undercover film has suggested.

Moderators also fail to remove posts that breach hate speech rules and routinely ignore posts suggesting that users may be under age, it is claimed.

The accusations are made in a Channel 4 Dispatches documentary.

Facebook has said mistakes were made and that the staff involved have been retrained.

“We have not seen the film, but we have seen the transcripts, and a lot of what is described is against our policies, and we are investigating,” a spokesman told the BBC.

“We have retrained the trainers at the company involved,” he added, saying that trainers at all of the outsourced moderation centres Facebook uses around the world would be similarly retrained. He declined to say how many such centres Facebook used.

In the film, which will be broadcast later on Tuesday, a reporter goes to work as a content moderator at Facebook’s largest moderation centre, in Dublin.

The work there is outsourced to a company called CPL Resources, which has worked with Facebook since 2010.

During his training, the reporter is shown a video of a man punching and stamping on a child.

The video has been online since 2012 and is used as an example of content that can be left on the site, marked as “strong content”.

One moderator tells the reporter: “If you start censoring too much, then people lose interest in the platform.

“It’s all for the money, at the end of the day.”

Nicci Astin, a campaigner against child abuse, told the BBC’s Today programme that she had asked Facebook to remove this particular video in 2012, but had been told that it did not violate its terms and conditions.

“There are a lot of graphic videos of children being injured on Facebook, and there is no need for them to be there,” she said.

Of the particular video featured in the documentary, Ms Astin said it was “still online, despite being reported many, many times”.

Facebook said the original video had been removed, but that it had been re-edited and re-shared many times since then.

‘Facebook entertainment’

The reporter is also told not to delete a video showing two girls fighting, despite the fact that both girls are clearly identifiable and the video has been shared more than a thousand times.

The mother of one of the girls involved said the post should never have become “Facebook entertainment”.

“To wake up the next day and find out that, literally, the whole world is watching must have been horrible,” she said.

“It was humiliating for her. It was devastating for her.”

Facebook’s vice-president of public policy, Richard Allan, said that if parents asked for such content to be taken down, it could be removed, but that such videos were often posted by people wanting to make a point, who argue that the social network “should not interfere with the ability to highlight a problem”.

Ms Astin said it should not be up to parents to report such content, adding that for them to see a video of their son or daughter on the platform would be “heartbreaking”.

‘Crack’

Venture capitalist Roger McNamee, one of Facebook’s early investors and a mentor to chief executive Mark Zuckerberg, also told the programme that the company’s business model relied on extreme content.

“From Facebook’s point of view this is… the crack of their product,” he said.

“It’s the really extreme, really dangerous form of content that attracts the most committed people on the platform.”

Mr Allan strongly denied this was the case.

“Shocking content does not make us more money, that’s just a misunderstanding of how the system works,” he said.

“People come to Facebook for a safe, secure experience to share content with their friends and family.

“The vast majority of those two billion people would never dream of sharing content like that, to shock and offend people.

“And the vast majority of people do not want to see it.”

Other revelations in the film include:

- The reporter is told that posts containing racist abuse of immigrants are allowed to remain
- A post including a cartoon comment describing the drowning of a girl if her first boyfriend is black is allowed to stay, although Facebook later confirmed it violated its hate speech standards
- Pages belonging to jailed former English Defence League leader Tommy Robinson, who has more than 900,000 followers, are said to be reviewed directly by Facebook. Facebook confirmed to the BBC that this had happened, but said it was to provide “a second pair of eyes” on politically sensitive content
- Pages belonging to the far-right group Britain First were left up because they had a lot of followers, a moderator tells the reporter
- The reporter is told not to take proactive action on users who may be under age, unless a user admits to being below the official joining age of 13

The film also revealed significant delays in the processing of content reported to Facebook as being in violation of its policies.

The company aims to assess all reported content within 24 hours, but the film alleges this regularly does not happen, with some content still waiting to be reviewed five days after being reported.

At one point, 15,000 reported items were waiting to be addressed, and staff claimed they could not keep up with the up to 7,000 reports the company received each day.

Facebook confirmed to the BBC that there had been a backlog in April and May, but said that it had since been cleared.

The company expects to double the number of moderators it employs this year, from 10,000 to 20,000.