Trauma Caused by Viewing Graphic Content leads to Facebook Worker Compensation Claim

50-year-old Dubliner Chris Gray has filed a legal action with the High Court in Dublin, seeking compensation for the trauma and stress he experienced during the 10 months he was employed by CPL Solutions to moderate content for the social media giant.

This action is potentially the first in a number of compensation cases to be taken against these companies. Mr Gray's solicitor, Diane Treanor of Coleman Legal Partners in Dublin, revealed that the firm has been contacted by content moderators based in Barcelona and Berlin who are interested in joining the legal action.

Mr Gray’s claim states that he was expected to constantly view graphic material and filter out inappropriate content with a 98% accuracy rating. He revealed that the content included footage of events such as “various scenes of people dying in different accidents … set to a musical soundtrack. [Gray] had a long argument with the quality point of contact [a senior role] about whether the music meant that the person posting it was ‘celebrating’ or whether it just counted as disturbing content.”

He added that he became deeply traumatised and stressed as a result of the content he viewed and his required work targets. Over time he began to have difficulty sleeping and would often wake in the night due to nightmares or worry that he had filed something incorrectly. He commented: “It took me a year after I left to realise how much I’d been affected by the job. I don’t sleep well, I get in stupid arguments, have trouble focusing.”

Facebook, commenting on the legal action, released a statement which said: “We are committed to providing support for those that review content for Facebook as we recognise that reviewing certain types of content can sometimes be difficult. Everyone who reviews content for Facebook goes through an in-depth, multi-week training program on our Community Standards and has access to extensive psychological support to ensure their wellbeing. This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right.”

Cori Crider is a director of Foxglove, a UK-based not-for-profit group, which is supporting the legal action. She said: “The reason we’ve got involved is that we think that social media factory floors are unsafe and need to be cleared up. In a decade we’re going to look back on this as we did at meat packing plants at the turn of the century. Facebook’s only going to pay attention to things when they know that they’ve got a typhoon bearing down on them. What I’d like to see is the moderators realising how much power they have if they just organise. Because let’s face it, social media as we know it could not exist without the labour people like Chris provide.”