Emerging Litigation Podcast
Produced by HB Litigation
Interviews with leading attorneys and other subject matter experts on new twists in the law and how the law is responding to new twists in the world.

Interested in contributing an article? Email us at Editor@LitigationConferences.com.

 Subway Surfing Suit Against Meta and TikTok: Setting the Stage for Social Media Liability 

The Legal Battle Over Algorithmic Targeting and Dangerous Challenges 

By Tom Hagy*

I can confirm it, though I won't be the first to do so. Without much prompting, teenage boys will do really stupid, really high-risk things. Or, in the minds of teenage boys, really exciting, really entertaining things.

I have stories I can only share now that my parents are no longer alive. I regularly surfed on top of a moving station wagon that a friend was driving speedily up a pedestrian pathway. I rode my bike at top speed down a hill into a ditch, hoping it would swallow and lock my front wheel, just to see how far over the handlebars I would fly. On my first day ever on skis I jetted straight down an icy hill meant for experts, going as fast as I could – a trip cut short by a collision with a half-ton snow gun that snapped my ski in half. In each case, I suffered not a scratch. At least none that I can remember. And I wasn't one to journal.

Blame it on an undeveloped prefrontal cortex and/or the sweet rush of adrenaline. Either way, the outcome was the same. But with prompting, I would do even more. Now I think, what if I could have shared my feats with thousands or potentially millions of people around the world? Many of them girls! “Has anyone strapped themselves to a rocket recently?” I would have wondered.  

Thanks to social media, that is what teenagers have at their disposal today. And that is what inspired a New York teenager to attempt what would end up being his last shot at social media attention and his last act. The lawsuit brought by his mother recently withstood a motion to dismiss, but with some paring. Below are details of that case, the arguments made by the parties, and some notes on other similar cases.  

The Subway Surfing Case  

On June 27, 2025, the New York County Supreme Court issued a decision in Nazario v. ByteDance Ltd., allowing a wrongful death lawsuit to proceed against social media giants Meta (Instagram) and TikTok’s parent company ByteDance. The case centers on the tragic death of 15-year-old Zackery Nazario, who died while “subway surfing” in Brooklyn—a risky act allegedly fueled by dangerous viral content promoted through algorithmic targeting on these platforms. 

Justice Paul A. Goetz’s decision rejected the defendants’ attempt to dismiss the case on several grounds, signaling a potential shift in the landscape of platform liability. Norma Nazario, Zackery’s mother, contends that Meta and TikTok intentionally designed their products to addict young users and promoted hazardous “subway surfing” challenges to Zackery, encouraging participation in deadly trends. The court found these claims plausible, especially given allegations that the companies’ algorithms actively pushed such content to minors who had not even solicited it, potentially exceeding the protections afforded by Section 230 of the Communications Decency Act. 

The ruling held that Section 230 immunity, often a powerful shield for social media companies, does not necessarily cover claims where platforms are alleged to have played an active role in targeting and promoting dangerous content. The court cited recent precedents, including Anderson v. TikTok, Inc. (3d Cir. 2024), which recognized that algorithmic amplification of harmful challenges could fall outside Section 230's scope.

Additionally, the court ruled that the plaintiff’s claims for strict product liability (design defect and failure to warn), negligence, wrongful death, and loss of services could proceed. Claims against the Metropolitan Transportation Authority and New York City Transit Authority were dismissed, with the judge finding that Zackery’s own conduct was the superseding cause and that transit agencies had no further duty to warn of the obvious dangers of subway surfing. 

Part of a Broader Legal Trend 

While the Nazario case has garnered attention for its focus on algorithmic promotion of dangerous challenges, it is not an isolated event. Across the United States and internationally, social media platforms face increasing litigation over their role in allegedly encouraging minors to participate in risky, and sometimes deadly, viral trends. 

Other high-profile cases include lawsuits surrounding the so-called “Blackout Challenge,” where children lost their lives or suffered severe injury after participating in asphyxiation games promoted or popularized on social media. In Anderson v. TikTok, Inc., the Third Circuit Court of Appeals allowed claims to proceed, drawing similar lines between passive hosting of content and active algorithmic promotion to vulnerable users.  

Families in multiple jurisdictions have brought wrongful death and product liability actions against platforms for challenges ranging from dangerous stunts to pharmaceutical misuse, with courts increasingly scrutinizing the underlying design and function of recommendation algorithms. 

Many cases, like Nazario’s, allege that companies not only failed to curb the spread of hazardous content but also engineered their platforms to maximize engagement among minors, amplifying the reach and appeal of dangerous trends. Plaintiffs have also pointed to internal documents and whistleblower revelations that, they claim, demonstrate the companies’ awareness of the risks to young users. 

Arguments in the Motion to Dismiss and Plaintiffs’ Response 

In their motion to dismiss, Meta and TikTok argued: 

  • Section 230 Immunity: The companies argued that the Communications Decency Act shields them from liability because the content at issue was user-generated, and platforms are protected from claims arising from third-party content.
  • No Duty Owed Under Product Liability: They asserted that social media platforms are not “products” under New York law and thus not subject to strict product liability theories.
  • First Amendment Protection: The defendants claimed the suit sought to regulate protected speech, which would violate their First Amendment rights.
  • Dismissal of All Claims: They requested dismissal of all tort and statutory claims, including those for personal injury, unjust enrichment, and emotional distress.

The plaintiffs countered:

  • Active Role and Algorithmic Promotion: Plaintiffs argued that the platforms went beyond mere hosting and played an active, targeted role by using algorithms to push dangerous content to minors. This, they contended, was not protected by Section 230.
  • Products Liability Applies: They asserted that, under New York law, digital platforms can constitute “products” when their design and function foreseeably lead to user harm. The court agreed, noting the plausibility of this legal theory.
  • First Amendment Not Absolute: Plaintiffs maintained that the lawsuit sought to regulate tortious conduct and negligent product design, not protected speech. The court concurred, holding that whether the companies’ actions were protected or actionable would require further factual development.
  • Survival of Key Claims: While some claims (including personal injury and certain statutory violations) were dismissed, the core allegations of strict product liability, negligence, wrongful death, and loss of services survived, ensuring the main case would proceed to discovery.

Zooming Out 

As I’ve said, there have been numerous lawsuits against social media companies alleging that their platforms encourage or amplify dangerous behavior, especially among children and teens. Here are some of the most notable examples and trends:

  1. Social Media Addiction and Harm MDL
  • Thousands of lawsuits have been filed against Meta (Facebook/Instagram), TikTok, Snapchat, YouTube, and Discord, alleging that these platforms are intentionally designed to be addictive to children and teens, causing mental health issues, self-harm, eating disorders, and even suicide.
  • These cases are consolidated in a multi-district litigation (MDL) in the Northern District of California, with bellwether trials set to begin in 2026.
  • Plaintiffs include families of affected children and school districts, arguing that the platforms "prioritize engagement and growth over user safety" and amplify dangerous content in the process.
  2. Viral Challenge Lawsuits
  • Subway Surfing Challenge (discussed in this article): The wrongful death lawsuit against Meta and TikTok for promoting subway surfing is part of a broader trend. Multiple lawsuits have been filed after children died or were injured participating in dangerous stunts seen on social media.
  • Blackout Challenge: TikTok faces lawsuits after children died participating in the "blackout challenge," which encourages users to choke themselves until they lose consciousness. A federal appeals court recently ruled that TikTok can be sued for allegedly promoting this challenge through its algorithm (the Anderson decision discussed above).
  • Other Challenges: Lawsuits have also been filed over the “hot water challenge,” “Benadryl challenge,” and other dangerous viral trends.
  3. Product Liability Theory
  • Plaintiffs are increasingly using product liability claims to bypass Section 230 immunity, arguing that social media platforms are “defective products” because their design and algorithms addict users and promote harmful behavior.
  • Courts have begun to allow some claims to proceed, especially those focused on platform design rather than specific user content.
  4. Government and School District Lawsuits
  • Attorneys general in over 40 states have sued Meta, TikTok, and other platforms, alleging they misled the public about the dangers of social media and failed to protect children.
  • School districts are suing social media companies for contributing to the youth mental health crisis and increased disciplinary problems.
  5. Notable Cases and Rulings
  • Snapchat Speed Filter Case: In Lemmon v. Snap, Inc. (9th Cir. 2021), a federal appeals court found that Snapchat could not invoke Section 230 to shield itself from a lawsuit claiming its speed filter encouraged speeding, resulting in a fatal crash.
  • Buffalo Shooting Case: The estates of victims of the 2022 Buffalo supermarket shooting sued social media platforms, arguing that their algorithms radicalized the shooter.

Conclusion 

The Subway Surfing case not only highlights the tragic consequences of dangerous viral challenges but also signals a shift in judicial willingness to scrutinize the design and operation of social media platforms. As courts allow more claims to proceed past the motion to dismiss stage, discovery will shed further light on algorithmic targeting and the responsibilities of tech companies to protect vulnerable users. The outcome of Nazario v. ByteDance Ltd., alongside similar cases nationwide, will help shape the future of online safety, platform liability, and regulatory frameworks for years to come.