A BBC documentary has revealed the working conditions and duties of social media moderators, and the psychological trauma that can result from the role.
The report covered the professional experiences of Shawn Speagle, who was employed as a Facebook content moderator by Cognizant, a third-party company headquartered in Florida in the United States. Despite having signed a non-disclosure agreement, Shawn spoke out about the videos and images that workers have to review as part of Facebook's moderation policies and processes.
He stated: “One of my first videos that I remember looking at was two teenagers grabbing an iguana by the tail and they smashed it onto the pavement while a third person was recording it. And the iguana was screaming and the kids just would not stop until the iguana was just pasted on the ground. I’ve seen people put fireworks in a dog’s mouth and duct tape it shut. I’ve seen cannibalism videos, I’ve seen terrorism propaganda videos.”
Shawn informed the documentary producers that he had experienced great stress, weight gain and depression due to the content he had to view as part of his expected duties. He stated: “I felt like I was a zombie in my seat. It really gets to you because I don’t have that bystander syndrome where I’m OK just watching this suffering and not contributing any way to deter it.”
In Ireland, where the European Union headquarters of many social media platforms are based, a legal action is currently being formulated in relation to the working conditions of a number of moderators. Facebook has faced employment-related legal actions previously. In September 2018, Selena Scola, a former content moderator with the company, filed a legal action against Facebook over the mental effects of the work. She argued that viewing disturbing images and videos led to her developing Post Traumatic Stress Disorder (PTSD) while she was working at the Facebook headquarters in California. After she submitted her case, two more former Facebook content moderators issued similar claims and, as a result, Facebook may now face a class-action lawsuit over the issue.
Continual and repeated viewing of harmful content is an unfortunate part of a moderator's role. Over time, this exposure can lead to psychological injury and traumatic mental suffering for moderators.
The severity of such traumatic suffering can depend on the actual content seen, on whether employers provide proper support mechanisms to help staff deal with work-related trauma, and on work targets, meaning the level of work and output that must be completed each day. Employers have an obligation, a duty of care, to provide a safe place of work and a safe system of work, and to prevent harm to their staff.