Facebook To Hire Additional Content Moderators As Number Of Objectionable Videos Rises

The chief executive officer of Facebook, Mark Zuckerberg, has promised that the social media giant will increase its number of content reviewers by 3,000, helping the platform remove objectionable content faster than it does today. Rather than reviewing content before it becomes publicly accessible, Facebook relies on community reporting tools and a relatively small team of employees who review posts flagged as objectionable. This approach has so far proved inadequate.

“Over the next year, we’ll be adding 3,000 people to our community operations team around the world – on top of the 4,500 we have today – to review the millions of reports we get every week,” said Zuckerberg.

Livestreaming murders

In the last couple of months, there has been an upsurge in the number of videos on Facebook containing footage of assaults, rapes, murders and shootings. These include a video in which a Thai man livestreamed himself killing his 11-month-old daughter. As a result, Facebook has been criticized for sometimes taking too long to pull down objectionable content. In the case of the Thai murder, the video remained publicly available for almost 24 hours before being taken down, by which point it had been viewed 370,000 times.

In another case, the rape of a teenage girl in Chicago was livestreamed to about 40 Facebook users before it could be taken down, and it took about three hours for a video showing the murder of an elderly man in Cleveland, Ohio, to be removed. Live videos are especially hard to police because viewers have no way of knowing what will happen before they start watching.

Post-Traumatic Stress Disorder

While announcing its intention to hire more content moderators, Facebook did not say whether they would be permanent employees or independent contractors. Nor did the social media giant offer details on what measures it would put in place to safeguard the moderators' wellbeing, given that they are exposed to extremely disturbing content that can harm their mental and emotional health.

According to Sarah Roberts, an expert in commercial content moderation and an assistant professor at the University of California, Los Angeles, turnover rates in the relatively new field are high. For those who remain, the work can lead to burnout, desensitization or post-traumatic stress disorder (PTSD). Microsoft is currently facing a lawsuit from former employees who developed debilitating PTSD.

 
