The content moderation system exploits vulnerable workers in developing countries: it subjects them to severe psychological trauma, pays them a fraction of what US moderators earn, denies them adequate mental health support, and evades responsibility through outsourcing. All Big Tech companies should wake up and address the human rights violations taking place along their value chains.
Meta takes the support of content reviewers seriously. Through its third-party contracts, it provides counseling, training, and round-the-clock support; it requires that reviewers be paid above local industry standards; and it implements technical measures, such as blurring and muting, to limit exposure to graphic content. While it recognizes that this is challenging work for data annotators, Meta maintains that it has always followed Kenyan law and upheld its ethical and wellness standards.