According to the 233-page lawsuit filed October 24 in federal court in Oakland, California, Meta designed and deployed harmful and psychologically manipulative product features to induce young users' compulsive and extended platform use, while falsely assuring the public that those features were safe and suitable for young users. Further, the suit alleges, Meta has developed and refined a set of these features to maximize young users' time spent on its social media platforms. The complaint says that "Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens. Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its social media platforms…It has concealed the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children."
Meta even went so far as to release a guide for parents in September 2018, urging them to allow their children to join Instagram lest the children risk "social marginalization," and it publicly acknowledged that focusing on Instagram users under age 13 was part of its business strategy. Then, in 2021, Meta backtracked: a spokesperson testified to Congress that Meta's platforms are not designed for children 12 and under.
Federal regulation stipulates that social media companies bar users under 13 from signing up to their platforms. But kids know how to skirt these bans and create social media accounts without their parents' consent. TikTok, for instance, has a default 60-minute time limit for users under 18, but it takes just a passcode to keep watching. (TikTok and other social media platforms are also facing hundreds of lawsuits filed on behalf of children and school districts over social media's addictiveness.) The states' lawsuit argues that Meta collected data on children without informing their parents or obtaining their permission, and by doing so violated the Children's Online Privacy Protection Act (COPPA).
Reuters reported that younger Meta consumers may help the company secure more advertisers, who hope children will keep buying their products as they grow up. But the states said research has associated children's use of Meta's social media platforms with "depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes." Meta said it was "disappointed" in the lawsuit. "Instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path," the company said.
Mark Zuckerberg posted in October 2021 on his Facebook page, "At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That's just not true."
The lawsuit followed an investigation led by a bipartisan coalition of attorneys general from California and several other states, an investigation fueled by newspaper reporting that began with the Wall Street Journal's late-2021 series, aptly titled "The Facebook Files." The WSJ reviewed internal Facebook documents, including research reports, online employee discussions and drafts of presentations to senior management, and wrote that "Facebook's platforms are riddled with flaws that cause harm, often in ways only the company fully understands."
In February 2023, lawyers filed a master complaint on behalf of more than 100 families accusing social media firms including Meta, Snapchat, Google and TikTok's parent company, ByteDance, of harming young people with their products. Bloomberg reports that the 300-page complaint was filed in a multi-district litigation case that consolidated dozens of product liability lawsuits against Meta Platforms Inc., Google LLC, ByteDance Ltd., and Snap Inc. It alleges that the companies created social media products that caused a youth mental health crisis, encouraged dangerous "challenges" and facilitated sex trafficking.