Facebook Moderators Turn to Sex and Drugs to Cope with Traumatic Content

Facebook moderators using drugs and having sex at work

People post on Facebook all the time – photos, videos, statuses and so on. But do we ever stop to think about the people who have to check the content that gets uploaded every day? Sure, your neighbour may have just uploaded an innocent photo of her dog on a walk, but unfortunately not all uploaded content is so harmless. In fact, many of Facebook’s moderators have to endure material that is disturbing, violent or nauseating.

While not all content reaches this level, a report claims that many of Facebook’s content moderators are being exposed to so much traumatic material that some employees are smoking cannabis at work to cope with it. Others have resorted to “trauma bonding”, which involves having sex on the job.

To some, all of this could sound like unbelievable excuses, but around 15,000 people worldwide are employed to check Facebook’s content. However, the conditions they work in are hidden behind third-party contracts and non-disclosure agreements, according to an investigation by the technology news site The Verge.

The site conducted a number of interviews with current and former employees of the third-party company Cognizant, reporting that Facebook’s content moderators were working in traumatic conditions and suffering panic attacks after watching videos of murders, terrorism and child abuse. But just how do these types of illegal videos make their way onto Facebook in the first place?

In the US, content moderators make just over 10% ($28,000/£21,000) of what the average Facebook employee earns ($240,000/£180,000) annually. However, the report claims they do not have access to suitable mental health support, nor is any available to staff once they leave the job.

Beyond the photos of family and friends lurks a world of disturbing content on Facebook. Credit: The Atlantic

It’s also been reported that employees can be fired for making “just a handful of errors a week” while those who remain in the job are afraid of former colleagues returning to seek revenge – with one current worker bringing a gun to work to protect himself.

However, Facebook has said there have been “misunderstandings and accusations” about its content review practices, and claims it is committed to working with partner companies to make sure employees get a high level of support.

But according to the investigation, some of those who have left Cognizant are experiencing symptoms similar to post-traumatic stress disorder. Also, those who remain at the company often tell dark jokes about killing themselves in order to cope with the material they have to view, as well as smoking cannabis during their breaks.

Others have reportedly been found engaging in sexual intercourse in stairwells and in a room reserved for breastfeeding mothers, in what’s been described as “trauma bonding”.

Some of these employees have told The Verge that they began sympathising with the viewpoints expressed in some of the videos they watched, including claims that the Earth is flat or that 9/11 was the product of a conspiracy rather than a terror attack.

The UK government is currently preparing to make social networks liable for the content on their platforms, so reviewing content is likely to become more common. MPs recently called for a code of ethics to ensure social media sites remove harmful content, describing Facebook as “digital gangsters” in a parliamentary report.

“Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites,” the committee wrote.

There doesn’t seem to be any sex and drugs going on in this office, at least. Credit: Facebook Newsroom

In response to the investigation, Facebook’s former head of security Alex Stamos tweeted that the “human cost” of reviewing content was a direct result of demands that the company take responsibility for moderating the platform’s content.

Facebook’s founder Mark Zuckerberg has frequently responded to criticism of content on the platform by stating he will hire extra staff to moderate content.

“We know there are a lot of questions, misunderstandings and accusations around Facebook’s content review practices – including how we as a company care for and compensate the people behind this important work,” reads a blog post on Facebook’s Newsroom.

“We are committed to working with our partners to demand a high level of support for their employees; that’s our responsibility and we take it seriously.”

The blog also added: “Given the size at which we operate and how quickly we’ve grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis.”

Facebook also stated mechanisms were in place at the companies it works with to make sure any concerns reported are “never the norm.”

These include contracts that guarantee “good facilities, wellness breaks for employees, and resiliency support”, as well as site visits where staff hold “1:1 conversations and focus groups with content reviewers”.

But one is still left wondering how this type of disturbing content can even be uploaded to the social media platform in the first place, only to then need regulating.

Story by Emily Clark

Featured Photo Credit: Starspost
