The traumatic lives of Facebook moderators revealed

Content moderators working on Facebook's behalf face significant psychological trauma from constant exposure to disturbing online content. The work often leads to severe stress and mental health problems, and has prompted some to take sick leave for post-traumatic stress.

Telus, a subcontractor working for Meta (formerly Facebook), has seen approximately 20% of its content moderation workforce on sick leave. Daily exposure to disturbing content inflicts cumulative psychological damage on the moderators.

Moderators have criticized the company's support systems as inadequate for such a demanding role. Psychologists are available around the clock, but their assistance consists mainly of talking through the distressing content, after which moderators are expected to return to work. Strict working hours, unrelenting oversight, and the absence of any outlet for relief heighten the pressure and make it increasingly difficult to cope.

Moderators describe a disturbing progression: an initial phase of shock, followed by desensitization to the distressing content, and then a cycle in which the trauma resurfaces. Sleep disturbances, anxiety attacks, and even suicidal thoughts have afflicted moderators. The constant exposure to explicit content has profoundly affected their sexual and emotional lives, leading to troubling shifts in thought and behavior and blurring the boundary between reality and the online world.

The toll extends to their personal relationships and overall mental health. Daily exposure to distressing content has left many with a bleak view of humanity.

The rigid working conditions, coupled with the pressure to meet quotas, have left moderators feeling like mere numbers to the company, with little consideration for their well-being. Moderators must also navigate Facebook's often intricate content policies, which require nuanced distinctions between different types of content, a task made harder still for people already struggling with the material itself.

The severe psychological toll and relentless nature of content moderation make it imperative to prioritize mental health support, comprehensive training, and improved working conditions. The industry needs to take these challenges seriously and address them promptly.

Content moderators play a pivotal role on the internet. Companies like Meta must provide stronger support and safeguards, and the industry as a whole must prioritize the mental well-being of these essential workers.