Facebook moderator describes the horrors she had to take down

Monika Bickert, Facebook's head of product policy. Getty


A former Facebook moderator described to the BBC the horrors she was exposed to on the job and criticized the social network for not doing enough to support staff members handling disturbing imagery.

The content reviewer, who worked at a Facebook content-moderation center in Berlin, spoke to the BBC on the condition of anonymity.

She told the BBC that she would have seconds to decide whether to remove some disturbing photos and videos. Among the worst images, she said, were beheadings, animal abuse, terrorist attacks, and child pornography.

The woman suggested that the work affected her mental health, describing a vivid nightmare she had during her time at Facebook:

"I had nightmares a couple of times. I remember one, for example: people jumping from a building. I don't know why. And I remember people, instead of helping the people jumping, they were just taking photos and videos ... I woke up crying."

She accused Facebook of not providing enough support to content reviewers and said staff members regularly complained to management.

"It's the most important job in Facebook, and it's the worst, and no one cares about it," she said.

In a message directed at Facebook CEO Mark Zuckerberg, she added: "How are you allowing this to happen? That young people like us are having to see these things — we're treated like nothing."

Facebook did not immediately respond to Business Insider's request for comment.

Monika Bickert, Facebook's head of product policy, acknowledged to the BBC that Facebook moderating was difficult work and said support systems were in place for employees.

"This work is hard, but I will say that the graphic content, that sort of content, is a small fraction of what reviewers might see," she said. "Increasingly, we've been able to use technology to review and remove some of the worst content."

Bickert added: "We're committed to giving them what they need to do this job well. If they're ever uncomfortable at work, there are counseling resources for them, and they can be shifted to work on a different type of content."

Facebook this week published the internal guidelines its moderators use. The document, which is 8,500 words long, goes into detail about what is and isn't allowed, including its policies about sexual or violent content and hate speech.

Facebook is increasingly relying on artificial intelligence to identify offending items on its site. But Zuckerberg acknowledged the technology's limits on Wednesday, saying it is "much easier to build an AI system that can detect a nipple than it is to determine what is linguistically hate speech."

https://www.businessinsider.com/facebook-content-reviewer-describes-horrors-took-down-2018-4