Facebook Moderators Suffer Lifelong Trauma, PTSD Diagnosed in 140+

Meta Faces Lawsuit: Kenyan Moderators Allege Severe PTSD from Graphic Content

An important legal battle is brewing against Meta, the parent company of Facebook. Over 140 Kenyan content moderators are suing the tech giant, alleging severe psychological trauma stemming from their work reviewing graphic content. The lawsuit, filed in Nairobi, claims that prolonged exposure to violent and disturbing material led to widespread diagnoses of Post-Traumatic Stress Disorder (PTSD), anxiety, and depression.

The moderators, employed by Samasource Kenya (now Sama), a third-party contractor for Meta, were tasked with filtering harmful content from the Facebook platform. Their work, according to medical reports filed with the court, involved daily exposure to "extremely graphic content," as described by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital. Dr. Kanyanya's assessment found that 81% of the 144 moderators who volunteered for psychological evaluations suffered from "severe" PTSD.

"Extremely graphic content on a daily basis which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few," Dr. Kanyanya stated in his report. The sheer volume and intensity of this material, the lawsuit argues, caused significant and lasting mental health damage.

Lawyer Mercy Mutemi, from Nzili and Sumbi Associates, alongside fellow counsel during a pre-trial consultation with Meta's legal counsel and a judge on April 12, 2023. – Tony Karumba/AFP/Getty Images

The lawsuit, filed on December 4th, 2023, is a class action representing 185 moderators. It highlights concerns about the ethical implications of outsourcing such emotionally taxing work to developing countries, often with less stringent worker protections. While Meta declined to comment directly on the medical reports due to the ongoing litigation, the company stated that it takes the well-being of its moderators seriously and that its contracts with third-party firms include provisions for counseling, training, and fair compensation. Meta also noted that moderators have tools to customize their content review experience, such as blurring or desaturating graphic images.

This case raises critical questions about corporate responsibility and the mental health impacts of moderating online content. The high prevalence of severe PTSD among these moderators underscores the need for greater protections and support for individuals performing this essential, yet emotionally demanding, work. The outcome of this lawsuit could have significant implications for the tech industry and its approach to content moderation globally.

Former Facebook Moderators Sue, Claiming Trauma and Unjust Dismissal

A new lawsuit filed by former Facebook content moderators in Kenya alleges severe psychological trauma and unlawful dismissal after they protested unsafe working conditions. The case, supported by the UK-based non-profit Foxglove, highlights the devastating impact of content moderation on mental health and raises serious questions about corporate responsibility.

The lawsuit, launched in 2022, centers on the experiences of moderators employed by Samasource Kenya between 2019 and 2023. According to Foxglove, all 260 moderators at Samasource Kenya's Nairobi hub were made redundant last year, a move the group describes as "punishment" for raising concerns about pay and working conditions.

Court documents detail the harrowing experiences of these moderators. One medical record, obtained by CNN, describes a moderator waking up in "cold sweats from frequent nightmares" directly related to the graphic content they reviewed. This led to "frequent breakdowns, vivid flashbacks, and paranoia."

Another former moderator recounted developing trypophobia – a fear of clusters of small holes – after viewing an image of maggots on a decomposing hand. These accounts paint a stark picture of the psychological toll exacted by the job.

"Moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it," said Martha Dark, co-executive director of Foxglove. "In Kenya, it traumatized 100% of hundreds of former moderators tested for PTSD… Facebook is responsible for the potentially lifelong trauma of hundreds of people, usually young people who have only just finished their education."

Dark further argued that if these diagnoses occurred in any other industry, those responsible would face "legal consequences for mass violations of people's rights." This lawsuit is not an isolated incident; similar legal actions have been filed against other social media giants.

In 2021, a TikTok content moderator sued the platform for psychological trauma, followed by another lawsuit in 2022 involving multiple former moderators. These cases underscore a growing concern about the mental health consequences faced by individuals tasked with policing online content.

This lawsuit serves as a critical reminder of the human cost behind the curated online experience. It highlights the urgent need for social media companies to prioritize the well-being of their content moderators and implement robust support systems to mitigate the risks associated with this demanding work.
