
'You can't unsee it': the content moderators taking on Facebook - Financial Times

By his own estimate, Trevin Brownie has seen more than 1,000 people being beheaded.

In his job, he had to watch a new Facebook video roughly every 55 seconds, he says, removing and categorising the most harmful and graphic content. On his first day, he recalls vomiting in revulsion after watching a video of a man killing himself in front of his three-year-old child.

After that things got worse. “You get child pornography, you get bestiality, necrophilia, harm against humans, harm against animals, rapings,” he says, his voice shaking. “You don’t see that on Facebook as a user. It is my job as a moderator to make sure you don’t see it.”

After a while, he says, the ceaseless horrors begin to affect the moderator in unexpected ways. “You get to a point, after you’ve seen 100 beheadings, when you actually start hoping that the next one becomes more gruesome. It’s a type of addiction.”

Brownie is one of several hundred young people, most in their 20s, who were recruited by Sama, a San Francisco-based outsourcing company, to work in its Nairobi hub moderating Facebook content.

A South African, he is now part of a group of 184 petitioners in a lawsuit against both Sama and Facebook owner Meta for alleged human rights violations and wrongful termination of contracts.

The case is one of the largest of its kind anywhere in the world, yet only one of three being pursued against Meta in Kenya. Together, they have potentially global implications for the employment conditions of a hidden...



Read Full Story: https://news.google.com/rss/articles/CBMiP2h0dHBzOi8vd3d3LmZ0LmNvbS9jb250ZW50...