Madeline Batt is the Legal Fellow for the Tech Justice Law Project.

This Tech Litigation Roundup gathers and briefly analyzes notable lawsuits and court decisions across a variety of tech-and-law issues.

Jury selection began on January 27 for the first “bellwether” trial in a blockbuster proceeding involving thousands of cases dating back to 2021. The suits allege that social media platforms’ negligent design causes depression, anxiety, and other mental health harms in young people. The trial marks the first time that social media companies will have to defend themselves before a jury against claims that they harmed children.

In the same week, a major hearing in a separate but related high-profile multi-district litigation suggested a federal judge may allow lawsuits filed by school districts alleging injuries from social media addiction to proceed to trial later this year.

The case now proceeding to trial is part of what is called a “JCCP,” or judicial council coordinated proceeding, in LA County Superior Court. A JCCP is a mechanism used in California state courts to litigate many cases that involve similar issues together. Here, the JCCP brought together many cases against major social media companies, including Meta, Snap, TikTok, and Google.

The case now going to trial was filed by a now 19-year-old plaintiff identified as “K.G.M.” The court chose this case to go first, knowing the result is likely to influence settlement talks for thousands of other plaintiffs in the JCCP and beyond.

The multi-district litigation (MDL) involves similar claims related to social media addiction but is proceeding in federal court in Northern California. The first trials are scheduled to begin in June, but social media companies are still fighting to get the cases dismissed before trial. The companies argued their position before the judge in a key hearing earlier this month.

Together, the proceedings involve a wide range of plaintiff groups alleging that social media caused harm to young people. Plaintiffs include youth like K.G.M., their families, school districts, state attorneys general, municipalities, and Native American nations.

In both proceedings, social media companies have focused their arguments on issues of causation and on potential content-publisher defenses under the First Amendment and Section 230 of the Communications Decency Act. The companies contend that plaintiffs cannot prove that youth mental health harms were caused by the social media platforms, and that any alleged platform-related harm instead resulted from user-generated content on the apps––content that is generally shielded from liability by the First Amendment and Section 230. Plaintiffs have countered that the platforms’ own content-agnostic design features lead to compulsive social media use, which they argue is the primary cause of the alleged mental health harms.

The legal protections the companies are invoking have historically been interpreted broadly, largely insulating social media platforms from liability. Section 230, which shields online platforms that publish third-party content, has played a central role, enabling social media companies to avoid lawsuits. But with plaintiffs in both proceedings focusing on platform design rather than platform content, social media companies have struggled to apply their usual defenses to allegations concerning addictive, psychologically harmful product features. In the JCCP, Judge Carolyn B. Kuhl denied defendants’ motions for summary judgment against one of the plaintiffs; those motions relied heavily on Section 230 defenses. A ruling on similar motions in the MDL remains pending.

The first bellwether case of the JCCP was brought by a 19-year-old California woman, identified only by her initials K.G.M., who alleges that she has been addicted to social media for more than a decade. She claims that her compulsive use of the platforms has resulted in anxiety, depression, body dysmorphia, and suicidal thoughts. Because the outcome will be seen as indicative of how future social media addiction cases may fare, the case carries significant financial implications for all parties weighing settlement, with billions of dollars potentially at stake. It could also result in changes to social media platforms’ policies and product design, shaping the future of child safety online.

Judge Kuhl rejected the defendants’ arguments that the First Amendment and Section 230 precluded K.G.M.’s case as a matter of law. But, as she wrote in her decision allowing the case to go to trial, the social media companies “certainly may argue to the jury that K.G.M.’s injuries were caused by content she viewed [and Section 230 therefore bars liability].” The defendants are therefore expected to reiterate their causation arguments at trial, seeking to persuade the jury that the facts do not show that their platforms’ features (as opposed to third-party content) caused K.G.M.’s psychological injuries.


On the plaintiff’s side, K.G.M. will have the chance to demonstrate to a jury how the platforms’ negligently designed features, and their failure to warn users of the associated risks, caused her compulsive, harmful relationship with social media. The features at issue include engagement-maximizing tools such as infinite scroll and personalized recommendation algorithms. In their dismissal arguments, the social media companies contended that these features should be excluded from consideration because of their role in presenting content to users, but Judge Kuhl was unconvinced, writing, “[T]he fact that a design feature like ‘infinite scroll’ impelled a user to continue to consume content that proved harmful does not mean that there can be no liability for harm arising from the design feature itself.”

Witness testimony will feature prominently in the trial. The plaintiffs will rely heavily on expert witnesses, whose testimony the defendants unsuccessfully sought to exclude, to support the claim that features such as personalized algorithms and infinite scroll cause social media addiction and mental health harms. The companies were also unsuccessful in preventing top tech executives from being called to testify. Meta CEO Mark Zuckerberg is expected to testify on February 9 and to be cross-examined regarding allegations that his platforms have caused harm to young people.

As the trial date approached, Snap and TikTok announced last-minute settlements. Both companies remain defendants in all of the JCCP cases other than K.G.M.’s.

A January 26 hearing in the MDL indicated that social media giants may soon face a jury in the federal proceeding, too.

The nearly six-hour hearing focused on motions for summary judgment in lawsuits brought by school districts that allege that social media addiction has profoundly disrupted their schools. The districts are seeking reimbursement for the resources spent responding to social media addiction and injunctive relief to abate the harmful effects of students’ social media use. During the hearing, social media companies argued that Section 230 shielded them from liability and that the districts could not establish causation. But Judge Yvonne Gonzalez Rogers indicated that she thought there was enough evidence for the case to go to trial. She noted that, despite the defendants’ insistence that plaintiffs lack admissible evidence of causation, the record includes internal corporate documents showing that social media platforms cause harm. “This is the classic material dispute of fact,” the judge told defense counsel, suggesting that a jury, not a judge, should be responsible for deciding the dispute.

In contrast to the JCCP, where school districts’ lawsuits were dismissed in part because they alleged indirect economic harm, Judge Gonzalez Rogers also appeared skeptical of the social media companies’ attempts to distinguish platforms’ student users from the school districts that educate them. The school districts have alleged that the social media companies failed to warn them about the dangers of their platforms; the companies responded at the hearing that they had no duty to warn the schools. Even if the evidence showed that the companies specifically targeted children to encourage social media use during school hours, the companies argued, any duty to warn applied only to the student users. To Judge Gonzalez Rogers, this was a “distinction without a difference.” She told defense counsel, “There is no school without a student. That’s called a building.”

While Judge Gonzalez Rogers has yet to issue a decision on the motions for summary judgment, her skepticism at the hearing put social media companies on the back foot as they seek to avoid federal trials. If the school districts’ lawsuits do move forward, the first bellwether trial will begin on June 15. Like K.G.M.’s trial this month, the MDL bellwether trial would be historic: it would be the first federal case on social media harms heard before a jury and the first opportunity for a school district to show jurors how youth education has been upended by social media.

Grok NCII Fallout: The backlash against xAI’s chatbot, Grok, for generating sexualized images of real women and posting them online includes a lawsuit by Ashley St. Clair, the mother of one of Elon Musk’s children. Regulators around the world have also promised investigation and potential legal action.

Character AI: Character.AI agreed to settle lawsuits over the chatbot app’s harm to minors. The settlement included Garcia v. Character Technologies, the first-ever lawsuit to allege that a chatbot caused a user’s suicide (TJLP is co-counsel on the Garcia case). The Kentucky Attorney General also sued the chatbot company last month, accusing it of “preying on children.”

AI in hiring: A proposed class action targets AI company Eightfold for allegedly circumventing the Fair Credit Reporting Act (FCRA) and other consumer protection laws by scraping data on job seekers to score their “suitability” for a position. The complaint argues that compiling masses of potentially unreliable information to rate applicants is precisely the conduct the FCRA is designed to regulate, and that Eightfold has violated the law by not meeting its FCRA obligations.

More antitrust allegations for Google adtech: Several publishers, including those behind the Atlantic, Rolling Stone, the Verge, and Billboard, filed antitrust lawsuits arguing that Google’s adtech monopoly deprived them of earnings. The lawsuits seek to capitalize on an early 2025 ruling that Google violated antitrust law via its adtech business.

SCOTUS to weigh in on Cisco human rights case: The Supreme Court agreed to hear a case against tech company Cisco for allegedly assisting the Chinese government’s surveillance and persecution of a religious minority in violation of international law. The Ninth Circuit had permitted the lawsuit to move forward. The Supreme Court’s decision will have significant implications for US tech companies that profit by providing digital infrastructure used in connection with alleged international human rights abuses.