Ex-content moderator sues YouTube, claims job led to PTSD symptoms and depression

Authored by cnet.com and submitted by habichuelacondulce

A former content moderator is suing Google-owned YouTube after she allegedly developed depression and symptoms associated with post-traumatic stress disorder from repeatedly watching videos of beheadings, child abuse and other disturbing content.

"She has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind," says the lawsuit, which was filed in a California superior court on Monday. The former moderator also can't be in crowded places because she's afraid of mass shootings, suffers from panic attacks and has lost friends because of her anxiety. She also has trouble being around kids and is now frightened to have children, according to the lawsuit.

The proposed class-action lawsuit accuses YouTube of violating California law by failing to provide a safe workplace for content moderators and not doing enough to safeguard their mental health. Moderators spend more than four hours a day reviewing graphic video content because YouTube is "chronically understaffed," the suit says. These long hours run afoul of YouTube's best practices, according to the lawsuit. Workers are required to review "between 100 and 300 pieces of content per day with an error rate of two to five percent," creating stress and increasing the risk that content moderators develop psychological trauma from the job, according to the lawsuit.

The former moderator, who isn't named, is seeking medical treatment, compensation for the trauma she suffered and the creation of a YouTube-funded medical monitoring program that would screen, diagnose and treat content moderators.

She worked at YouTube through the staffing agency Collabera in an office in Austin, Texas, from January 2018 to August 2019. Collabera and YouTube didn't immediately respond to requests for comment.

During her time on the job, the worker saw thousands of disturbing videos that showed graphic images such as people eating from a smashed-open skull, school shootings with dead children, a fox being skinned alive and a person's head getting run over by a tank, the lawsuit said. She suffered psychological trauma from the job and paid out of pocket to get treatment, according to the lawsuit.

YouTube, like other tech companies such as Facebook and Twitter, relies on both technology and humans to review posts and videos that could violate its rules against violence, hate speech and other offensive content. More contract workers are speaking out about the toll this job takes on their mental health because they're constantly exposed to graphic content.

At the same time, tech companies are under more pressure to combat hate speech and misinformation ahead of the US presidential election in November.

Joseph Saveri Law Firm, which also filed a 2018 lawsuit on behalf of moderators who reviewed Facebook content, is representing the former YouTube content moderator. In May, Facebook agreed to pay $52 million to content moderators as part of a settlement.

The lawsuit against YouTube alleges the company failed to adequately inform potential content moderators about what the job involved and the negative impact it could have on their mental health. Prospective moderators are told they might be required to review graphic videos, but they aren't given more detail about the job or its potential toll.

During training, workers aren't told how to assess their reactions to graphic videos, and YouTube doesn't ease moderators into the job "through controlled exposure with a seasoned team member followed by counseling sessions," according to the lawsuit.

Content moderators are told they can step out of the room when YouTube shows them graphic videos during training, but workers are afraid they will lose their jobs if they do. That's because they must pass a test in which they determine whether certain content violates YouTube's rules.

YouTube also didn't do enough to provide support for these employees after they started their job, according to the lawsuit. The company allows workers to speak with wellness coaches, but the coaches don't have medical expertise and aren't available to moderators who work at night.

The ex-moderator who is suing YouTube sought the advice of a wellness coach in 2018 after she felt traumatized by a video she reviewed. The coach recommended the worker take illegal drugs and didn't provide any resilience training or ways to cope with her symptoms, according to the lawsuit. Another coach told a content moderator to just "trust in God." The Human Resources department also didn't provide content moderators with any help, and YouTube requires workers to sign non-disclosure agreements, making it harder for them to talk about their problems.

Tech companies can also blur graphic images, mute audio or shrink videos on screen to limit the negative effects that viewing offensive content can have on moderators, but YouTube failed to provide these technological safeguards, according to the lawsuit.
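To make that kind of safeguard concrete, here is a minimal sketch of what blurring and shrinking a flagged frame before a moderator sees it might look like, assuming Python with the Pillow imaging library. The function name, parameters and file names are hypothetical illustrations, not YouTube's or any vendor's actual tooling.

```python
# Illustrative sketch only: blur and downscale an image so a moderator
# can make a policy decision with less exposure to graphic detail.
# Assumes the Pillow library is installed (pip install Pillow).
from PIL import Image, ImageFilter

def soften_frame(path: str, blur_radius: int = 8, scale: float = 0.5) -> Image.Image:
    """Return a blurred, downscaled copy of the image at `path`."""
    frame = Image.open(path).convert("RGB")
    # Gaussian blur obscures graphic detail while keeping enough
    # context to judge whether the content violates policy.
    softened = frame.filter(ImageFilter.GaussianBlur(radius=blur_radius))
    # Shrinking the display size further limits the visual impact.
    new_size = (max(1, int(frame.width * scale)),
                max(1, int(frame.height * scale)))
    return softened.resize(new_size)

# Example usage (hypothetical file names):
# soften_frame("flagged_frame.jpg").save("review_copy.jpg")
```

A real moderation pipeline would apply this per video frame and let the reviewer progressively sharpen the image only if needed; the sketch shows the core idea of controlled, reduced exposure.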

The lawsuit alleges that YouTube is strictly liable for the harms caused to content moderators because the work is "abnormally dangerous." The lawsuit also accuses YouTube of negligent behavior and of providing "unsafe equipment," making the company responsible for the damages even though content moderators are contract workers.

If content moderators choose to leave their jobs, they'll lose their pay and health benefits.

"Content Moderators were left with a Hobbesian's choice -- quit and lose access to an income and medical insurance or continue to suffer in silence to keep their job," the lawsuit states.


_melodyy_ on September 22nd, 2020 at 05:26 UTC »

When cops have to review things like child porn or really graphic violence, there are rules for how much they're allowed to watch (I believe max 2 hours a day), and there are trained mental health counsellors constantly on standby. These cops still have a hard time, which is understandable, but they have a safety net and are as well taken care of as they can be.

However, a lot of the companies whose employees review website content don't have or don't want to invest in these kinds of resources. I've read an interview with a Facebook moderator who said he worked normal office hours with very little mental health support. He said the turnover rate was immense as people kept having breakdowns, substance abuse was pretty much the norm, and everyone he worked with was showing textbook PTSD symptoms. And honestly, you can't fucking blame him, as just ONE of the videos he described was of an 11 year-old girl being anally raped.

These companies aren't like the police, because they only want to make money. Which, in and of itself, isn't a bad thing, because that's what companies do. But that means they're gonna cut costs wherever they are willing and able to, and oftentimes those cost-cutting measures come at the expense of their employees.

DickyBrucks on September 22nd, 2020 at 05:09 UTC »

I did this job for two years. I've seen things you wouldn't believe. I've watched people quit after two days from the nightmares. I've reported child porn to NCMEC. It got to the point where you breathe a sigh of relief when it's not the "bad kind" (read: brutal rape). Eastern European children setting puppies on fire and laughing. Bahrain security forces murdering protestors. CG babies eating naked women then getting indigestion and puking them out. It made me stronger, mentally, but I'd be lying if I said it didn't affect me.

BrainKatana on September 22nd, 2020 at 03:35 UTC »

I used to work on a team like this for a website in the early days of broadband internet, when higher quality gifs and video streaming were in their infancy.

We worked in shifts, 4 days on, 4 days off, 10 hours a day. There were 3 shifts per day so you were always overlapping with everyone else on the rotation for part of your shift.

The moment your shift ended you spoke to a staff mental health advisor. While you were off, you got two calls a day, usually around lunch time and after dinner time, from a mental health advisor. These advisors were on call 24-7, and they also had their own separate set of advisors.

I don’t know how YouTube’s setup worked, but I feel like it wasn’t better than the setup I had back in those days.

Our turnover rate month over month was about 80%. It’s been about 20 years and I still have nightmares.