NEW YORK: TikTok’s 10,000 content moderators are exposed to a steady regimen of child pornography, rape, beheading and animal mutilation, according to a lawsuit against the video-sharing platform and its parent company, ByteDance Inc.
It gets worse. Content moderator Candie Frazier said in her proposed class action that she screened videos involving cannibalism, crushed heads, school shootings, suicides and even a fatal fall from a building, complete with audio.
And there’s no escaping it, Frazier says. TikTok forces moderators to work at a breakneck pace, watching hundreds of videos per 12-hour shift with just one hour off for lunch and two 15-minute breaks, according to the lawsuit filed Thursday in federal court in Los Angeles.
“Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video and simultaneously view three to ten videos at the same time,” her lawyers said in the complaint.
TikTok was part of a group of social media companies, including Facebook and YouTube, that developed guidelines to help moderators cope with the images of child abuse their jobs require them to view, according to the complaint. But TikTok has failed to implement those guidelines, which include providing psychological support and limiting work shifts to four hours, the lawsuit says.
Frazier, who lives in Las Vegas, said she suffers from post-traumatic stress disorder as a result of the disturbing videos she has had to watch.
“The plaintiff has trouble sleeping, and when she does sleep she has horrific nightmares,” according to the complaint.
Frazier, who seeks to represent other TikTok content moderators, is asking for compensation for psychological injuries and a court order requiring the company to set up a medical fund for moderators. TikTok did not immediately respond to a request for comment sent after normal business hours.