This article contains descriptions of child sexual abuse and other material that readers may find upsetting.
“There was a lot of pornography,” says Sarah Katz, recalling her eight months working as a Facebook moderator.
“The agency was very upfront about what kind of content we would be seeing and how graphic it was, so we weren’t left in the dark.”
In 2016, Sarah was one of hundreds of human moderators working for a third-party agency in California.
Her job was to review complaints about inappropriate content flagged by Facebook users.
She shared her experience with BBC Radio 5 live’s Emma Barnett.
“We would end up spending about a minute on each post to decide whether it was spam and whether to remove the content,” she said.
“Sometimes we would also delete the associated account.
“Management preferred us not to work more than eight hours per day, and we would review an average of about 8,000 posts a day, roughly 1,000 posts per hour.
“You pretty much learn on the job, specifically on day one. If I had to describe the work in a single word, it would be ‘strenuous’.
“You definitely have to be prepared to see anything after just one click. You can be hit with things very quickly and without warning.
“The piece of content that sticks with me was a piece of child pornography.
“Two children – the boy was maybe about 12 and the girl about eight or nine – were standing one in front of the other.
“They were not wearing pants and they were touching each other. It really seemed like an adult was probably off camera telling them what to do. It was very disturbing, especially because you could tell it was real.

Reappearing posts
“A lot of these explicit posts kept circulating. You would often see them pop up from about six different users in a day, which made it very difficult to track down the original source.
“At the time there was nothing in the way of counselling services. There could be today, I’m not sure.”
Sarah says she probably would have taken up counselling if it had been offered.
“We were definitely warned, but being warned and actually seeing it are two different things.
“Some people think they can handle it, and it turns out they can’t, or it’s actually worse than they expected.”

Graphic violence
“You become rather desensitised over time. I would not say that it gets easier, but you definitely do get used to it.
“Obviously, there was also a large amount of generic pornography between consenting adults, which was less disturbing.
“There was some bestiality. There was one post involving a horse that kept circulating.
“There was a lot of graphic violence, such as one where a woman had her head blown off.
“Half of her body was on the ground and the torso up was still on the chair.
“The policy was stricter on removing pornography than it was on graphic violence.”

Fake news
“I think Facebook was caught off guard by fake news. In the run-up to the United States elections, it seemed very much under the radar, at least while I was working there.
“I really can’t remember ever hearing the term ‘fake news’.
“We saw a lot of news articles circulating and being reported by users, but I don’t recall management ever asking us to examine news articles to make sure all the facts were correct.
“It’s very monotonous, and you really get used to what is spam and what isn’t. It just becomes a lot of clicking.
“Would I recommend it? If you could do anything else, I would say no.”

Facebook responds
The BBC shared Sarah’s story with Facebook.
In response, a Facebook spokesman said: “Our reviewers play a crucial role in making Facebook a safe and open environment.
“This can be very challenging work, and we want to make sure they feel properly supported.
“That is why we offer regular training, counselling, and psychological support to all our employees and to everyone who works for us through our partners.
“While we use artificial intelligence where we can, there are now more than 7,000 people who review content on Facebook, and looking after their well-being is a real priority for us.”