
TikTok Moderators Sue After Being "Traumatized" By Content

Back in July, a band of former Facebook content moderators rebelled against Zuck & Co., proclaiming that they would seek to invalidate the NDAs Facebook forces all its content moderators to sign so they don't squeal to the press about the freakshow of mayhem and debauchery they're subjected to every day. The flagged content they review can include depictions of sexual abuse, violence, murder and torture (remember the Christchurch video?) and - of course - politically incorrect content and news stories, often with a conservative slant.

"No NDA can lawfully prevent us from speaking out about our working conditions," the FB workers said at the time.

While TikTok has become most closely associated with teenage wannabe prostitutes shaking their assets for views, there are other indications that the Chinese-designed app might be intentionally working to corrupt the youth of America.

As we reported, the app has already been slammed for feeding children as young as 13 depictions of drug use, sex, porn, kinks and other topics that might unsettle parents. All the while, Beijing has limited use of the Chinese version of the app to just 40 minutes a day for the youth of China.

Now, fresh off TikTok being named the most dominant social media platform of the year, it appears its content moderators have learned from their comrades at Facebook - comrades who, let's remember, technically worked for the third-party contractors FB hires to handle content moderation - that they might be able to make a quick buck by suing the social media giants for the psychological damage accrued while performing content moderation duties, often as contractors with little job security and few benefits.

To wit, the Verge reported that a TikTok content moderator named Candie Frazier has filed a class-action lawsuit in the California Central District Court alleging that TikTok-owner ByteDance and its contractors "failed to meet industry standards intended to mitigate the harms of content moderation. These include offering moderators more frequent breaks, psychological support, and technical safeguards like blurring or reducing the resolution of videos." TikTok and its contractors closely monitor the time moderators spend moderating videos, effectively forcing workers to keep their eyes on an overwhelming orgy of debauchery for long hours with few breaks.

This has led to workers being "traumatized" by the content they're supposed to be moderating, according to the lawsuit.

In a proposed class-action lawsuit filed in the California Central District Court, Candie Frazier says she spent 12 hours a day moderating videos uploaded to TikTok for a third-party contracting firm named Telus International. In that time, Frazier says she witnessed “thousands of acts of extreme and graphic violence,” including mass shootings, child rape, animal mutilation, cannibalism, gang murder, and genocide.

Frazier says that in order to deal with the huge volume of content uploaded to TikTok daily, she and her fellow moderators had to watch between three and ten videos simultaneously, with new videos loaded in at least every 25 seconds. Moderators are only allowed to take one 15 minute break in the first four hours of their shift, and then additional 15 minute breaks every two hours afterwards. The lawsuit says ByteDance monitors performance closely and “heavily punishes any time taken away from watching graphic videos.”

[…]

As a result of her work, Frazier says she has suffered “severe psychological trauma including depression and symptoms associated with anxiety and PTSD.” The lawsuit says Frazier has “trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind. She has severe and debilitating panic attacks.”

Frazier claims in her suit that she has screened videos involving freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, complete with audio.

Content moderators are critical to keeping some of the world's most profitable companies in business.

Frazier's lawsuit was filed by the Cali-based Joseph Saveri Law Firm, which previously filed a similar suit against Facebook back in 2018 on behalf of moderators. That case resulted in a $52 million settlement paid by the social media giant, so it looks like Frazier has chosen her counsel well.

Tyler Durden Sat, 12/25/2021 - 20:00
