Landmark Verdict in Social Media Addiction Litigation Signals Shift in Tech Liability
In a bellwether case, a Los Angeles jury found that Meta Platforms, Inc. (Facebook, Instagram) and Google (YouTube) were negligent in the design or operation of their social media platforms. The jury awarded $3 million in compensatory damages, with Meta liable for 70% and Google liable for 30%. The jury imposed an additional $3 million in punitive damages ($2.1 million against Meta and $900,000 against Google). This was the first social media addiction trial to reach a verdict.
Essentially, social media addiction is the constant need to engage with social platforms despite consequences such as self-confidence issues, depression, anxiety, and other potentially long-term mental health effects. The constant use of social media, and the validation of a user’s interaction with the platform (such as likes, etc.), creates a release of dopamine, in many ways mimicking the use of painkillers.[1]
In a Master Complaint consisting of 300 pages of allegations, over 1,600 Plaintiffs alleged thirteen causes of action against multiple Defendants who designed and operated Facebook, Instagram, Snapchat, TikTok, and YouTube.[2] The Product Liability claims were dismissed because (i) the Defendants’ platforms are neither tangible products nor analogous to tangible products; (ii) the Defendants’ platforms were not suitable for analyzing liability under California’s Product Liability standard;[3] and (iii) the Defendants’ liability should be determined by their conduct. Plaintiffs’ negligence causes of action survived, as did their claim for fraudulent concealment against Meta only.
Section 230 of the Communications Decency Act of 1996 has long provided near-absolute immunity to social media platforms by shielding them from liability for third-party user content. Plaintiffs are thus foreclosed from suing social media companies for harms stemming from content posted on their platforms, and historically this has created a daunting barrier to recovering against these companies. Here, though, Plaintiffs set forth a new and different legal theory that did not implicate the content on the platforms: rather, they alleged that the design of the platforms themselves, and the Defendants’ inclusion of various features, was negligent. For example:
- The “continuous scrolling” feature “makes it hard for users to disengage from the app”;
- The IVR algorithms deprived users of sleep by sending push notifications at night, prompting children to reengage with the app rather than sleep;
- Appearance-altering tools (“filters”) provided by Defendants promote unhealthy “body image issues”;
- “Rewards” implemented by Defendants keep users checking the social media platform in ways that contribute to feelings of social pressure and anxiety.
[1] https://www.businessinsider.com/facebook-has-been-deliberately-designed-to-mimic-addictive-painkillers-2018-12
[2] Snapchat settled with Plaintiffs a week before trial commenced, and TikTok settled on the day jury selection commenced. The terms of both settlements were confidential.
[3] For instance, the “consumer expectation test” would not be applicable because a “grandmother” might expect the platform to function differently than a business would, or serve different functions than a “young man” or a “child” would. P. 34 of Order on Defendants’ Demurrer to Master Complaint and Three Short Form Complaints, gov.uscourts.cand.414822.55.1.pdf
Because Defendants allegedly crafted and implemented these and many other similar features negligently, Plaintiffs were not treating the Defendants as a “publisher or speaker of any information provided by another information content provider,” and thus their claims were not barred by Section 230 (47 U.S.C. § 230(c)(1)). In other words, Plaintiffs sufficiently alleged that Defendants were liable for their own actions, not for the content of third-party postings.
The jury found that both Meta and Google were negligent in designing or operating their platforms, that their negligence was a “substantial factor” in harming the Plaintiff, and that both failed to adequately warn users about the dangers of using their respective platforms. In total, the jury awarded $6 million to a single Plaintiff.
The implications of this verdict are likely to be far-reaching, and the significance of this case cannot be overstated. Thousands of similar social media addiction cases are currently pending in courts throughout the country. This case was the first direct test of whether a jury would hold a major platform liable for design-based claims related to child addiction.
Social media litigation is at a pivotal inflection point. The rise of social media has created a new frontier for litigation, forcing courts to adapt traditional legal doctrines to digital platforms. While one may reasonably disagree with the court’s determination that these platforms are not “products” under a product liability standard, the jury’s conclusion that these companies can be held to account for knowingly putting children at risk of harm for the sake of their own profit is a watershed moment.
With billions of users worldwide, legal disputes involving social media platforms are on the precipice of reshaping fundamental questions about the role of technology in public life and the duties technology companies owe to their users. This case, and those that follow, will reshape the trajectory of tech litigation for the foreseeable future.