In her complaint, plaintiff Jane Doe alleged that YouTube requires its content moderators to engage in an abnormally dangerous activity; that the company violates California law by failing to implement workplace safety standards; and that it exacerbates the harm it causes content moderators by imposing nondisclosure agreements. Because of high turnover, content moderators are required to work long hours reviewing content, despite YouTube's 2018 claim that moderators would review no more than four hours of graphic content daily.
Instead, YouTube requires moderators to review "hundreds of thousands if not millions" of potentially rule-breaking posts per week, and between June and December of 2017, moderators reviewed nearly 2 million videos for violent extremist content alone, reported Law360, which did not disclose how many moderators review that content. Moderators are required to review "between 100 and 300 pieces of content per day with an error rate of two to five percent," creating stress and increasing the risk that they develop psychological trauma from the job, according to the lawsuit. It went on: "[Plaintiff Jane Doe] has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind…she can't be in crowded places because she's afraid of mass shootings, suffers from panic attacks and has lost friends because of her anxiety. She also has trouble being around kids and is now frightened to have children."
Some of those videos showed graphic images such as people eating from a smashed-open skull, school shootings with dead children, a fox being skinned alive and a person's head being run over by a tank, according to the lawsuit (more below). She suffered psychological trauma from the job and paid out of pocket for treatment.
Prospective moderators are informed that they may be required to review graphic videos, but nothing is said about the risk to their mental health.
During training, workers aren't asked to assess their reactions to graphic videos, and YouTube doesn't ease moderators into the job "through controlled exposure with a seasoned team member followed by counseling sessions," according to the lawsuit. Content moderators can step out of the room during graphic videos, but they fear losing their jobs if they do so. And although YouTube offers sessions with wellness coaches, the coaches have no medical training and don't work nights.
According to the lawsuit, YouTube users upload 500 hours of video per minute, and the site relies on users to report inappropriate content, which is then reviewed by content moderators.
In addition to the $4.3 million, YouTube will provide content moderators with onsite and virtual counseling services by licensed clinicians, telephonic counseling and monthly peer support groups. The company has also agreed to provide onboarding training and transparent job descriptions for all moderator applicants and to establish an easily accessible anonymous whistleblower hotline for moderators. The settlement also bars YouTube from enforcing nondisclosure agreements against workers who share details of their jobs with other class members, according to the August 22 motion for preliminary approval.
Jane Doe v. YouTube Inc. Timeline
September 2020: An anonymous worker (Jane Doe) filed a lawsuit claiming Google-owned YouTube was negligent when it asked her to perform an abnormally dangerous activity, provided an unsafe online content moderation platform and was careless toward moderators' health and safety. The proposed class action states: Plaintiff Jane Doe… asserts various tort and statutory claims stemming from her work as a moderator of content posted on YouTube's platform, as a result of which she alleges she suffered psychological harm. The complaint seeks compensatory damages and injunctive relief requiring Defendant to implement prospective safety guidelines and create a medical monitoring fund, [which would] pay for "specialized screening, assessment, and treatment not generally given to the public at large" that will "facilitate the ongoing screening, diagnosis, and adequate treatment of Plaintiff and the class for psychological trauma." It would do so for an indeterminate amount of time.
The former content moderator says she suffers from anxiety, depression and symptoms associated with PTSD as a result of "unmitigated exposure to highly toxic and extremely disturbing images." Such content includes child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder. According to the complaint, "From genocide in Myanmar to mass shootings in Las Vegas and Christchurch to videos of children being raped and animals being mutilated, content moderators spend hours a day making sure that disturbing content like this never appears to YouTube's users."
CNET reported that YouTube relies on both technology and humans to review posts and videos that could violate its rules against violence, hate speech and other offensive content. More contract workers are speaking out about the toll this job takes on their mental health because they're constantly exposed to graphic content.
December 2020: YouTube asked the court to throw out the lawsuit, arguing that it was not responsible for the psychological trauma the worker experienced on the job because she was employed not by YouTube but by a third-party vendor, Collabera Inc., according to Law360. The plaintiff worked as a content moderator from an office in Austin, Texas, from January 2018 until August 2019.
The video giant told U.S. District Judge Yvonne Gonzalez Rogers that "Jane Doe" was wrong to file suit in California instead of in Texas, where she lived and worked, and against YouTube instead of the vendor, which would have entitled her to no-fault workers' compensation. YouTube further argued that it had no control over the safety and security of the moderator's work, which allegedly involved exposure to severe violence, such as genocide and sexual assault. (California law recognizes an exception for "abnormally dangerous activities," which the plaintiff alleged includes viewing graphic content for hours on end; Texas law does not.)
November 2022: Attorneys representing the content moderators requested $1.4 million of the agreed $4.3 million settlement. The motion argued the attorney fees are reasonable given that the two law firms representing the YouTube workers were the first to develop the legal claims of social media content moderators alleging unsafe work environments. (The same law firm filed a 2018 lawsuit on behalf of Facebook moderators; Facebook agreed to pay $52 million to content moderators as part of a settlement.)
This case is Jane Doe v. YouTube Inc., case number 4:20-cv-07493, in the U.S. District Court for the Northern District of California.