Kenya Facebook lawsuit cites mass PTSD diagnoses

Publish date: 06 January 2025
Issue Number: 1107
Diary: IBA Legalbrief Africa
Category: Litigation

More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder (PTSD) caused by exposure to graphic social media content, including murders, suicides, child sexual abuse and terrorism. The Guardian reports that the moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm, and were found to have PTSD, generalised anxiety disorder and major depressive disorder by Dr Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Nairobi.

The mass diagnoses were made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa. The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege. The case is shedding light on the human cost of the boom in social media use in recent years, which has required ever more moderation, often in some of the poorest parts of the world, to protect users from the worst material that some people post.

The moderators, from Kenya and other African countries, were tasked from 2019 to 2023 with checking posts emanating from Africa and in their own languages, but were paid eight times less than their counterparts in the US, according to the claim documents. Almost 190 moderators are bringing the multi-pronged claim, which includes allegations of intentional infliction of mental harm, unfair employment practices, human trafficking, modern slavery and unlawful redundancy. Meta said it took the support of content reviewers seriously, and that its contracts with third-party moderators of content on Facebook and Instagram detailed expectations about counselling, training, round-the-clock onsite support and access to private healthcare.

Full report in The Guardian
