Meta Faces Allegations of Inflicting Lifelong Trauma on Kenyan Content Moderators

Campaigners allege that Meta Platforms has caused "lifelong trauma" to content moderators in Kenya, with more than 140 diagnosed with PTSD. The claims rest on psychological assessments that found severe mental health conditions linked to prolonged exposure to graphic content. Legal action names both Meta and its former contractor, Samasource, raising questions about moderator welfare and the responsibilities of tech giants.

Campaigners have accused Meta Platforms, Inc., the parent company of Facebook, of causing what they describe as "potentially lifelong trauma" to content moderators in Kenya. Recent diagnoses show that more than 140 of these moderators suffer from post-traumatic stress disorder (PTSD) and other mental health conditions. The diagnoses were made by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi, and were filed with the employment and labour relations court in December 2023 as part of ongoing litigation against Meta and its outsourcing partner, Samasource Kenya.

Content moderators are tasked with filtering disturbing content from social media platforms, but the psychological toll of the work has drawn considerable criticism, particularly because many moderators are employed by third-party firms in lower-income countries. Critics argue that the work can cause severe mental health harm. The lawyers bringing the claims contend that Meta has a duty to ensure that its contractors provide moderators with adequate support and mental health care.

Meta declined to comment on the specific medical findings because of the pending lawsuit, but said it takes the welfare of its moderators seriously and that its agreements with contractors set out expectations on counselling, training, and pay. The company added that moderators can adjust their content-review settings to reduce exposure to distressing material. Samasource did not respond to requests for comment.

Dr. Kanyanya said the moderators were exposed to extremely graphic material daily, including depictions of violence, self-harm, and abuse. Of the 144 moderators who volunteered for psychological evaluations, 81% were diagnosed with severe PTSD.

The current lawsuit grew out of an earlier action brought by a former moderator who claimed they were unjustly dismissed by Samasource after advocating for better working conditions. In 2022, all 260 moderators at the Nairobi hub were dismissed, allegedly in retaliation for raising concerns over pay and working conditions. The moderators in the current case were employed by Samasource from 2019 through 2023. Some have described harrowing personal experiences, including recurring nightmares and lasting psychological distress tied to the graphic content they reviewed.

Martha Dark, co-executive director of Foxglove, characterized the moderating role as “dangerous, even deadly,” and stated, “Facebook is responsible for the potentially lifelong trauma of hundreds of people, usually young people who have only just finished their education.” Dark further asserted that similar cases in other industries would result in accountability for those responsible for such abuses.

Content moderation has emerged as a contentious issue across multiple social media platforms, with previous cases also highlighting the mental health implications of performing such tasks. A former TikTok content moderator engaged in legal action against the platform in 2021, citing emotional trauma stemming from her job, followed by additional lawsuits filed in subsequent years.

The mental health impact on content moderators has drawn increasing attention as social media companies rely heavily on third-party firms for moderation work. These moderators are routinely exposed to graphic and disturbing material, which can lead to chronic psychological stress and trauma. The Kenyan case reflects broader concerns about working conditions and mental health support in the field, and earlier legal actions in other jurisdictions underscore the need for industry standards and accountability in the care of content moderation workers.

The allegations against Meta Platforms over the mental health of content moderators in Kenya expose a critical issue within the social media industry. With more than 140 workers diagnosed with PTSD and legal action challenging their treatment by third-party contractors, substantial improvements in support and working conditions are clearly needed. As advocacy groups continue to press for justice and accountability, this case may become a catalyst for change across the sector, ensuring that content moderators receive the respect, care, and support they deserve.

Original Source: www.cnn.com

About Maya Chowdhury

Maya Chowdhury is an established journalist and author renowned for her feature stories that highlight human interest topics. A graduate of New York University, she has worked with numerous publications, from lifestyle magazines to serious news organizations. Maya's empathetic approach to journalism has allowed her to connect deeply with her subjects, portraying their experiences with authenticity and depth, which resonates with a wide audience.

