Court Revives Legal Case Against TikTok

A U.S. appeals court has revived a lawsuit against TikTok over the death of a 10-year-old girl who attempted a dangerous viral social media challenge.

At a Glance

  • A U.S. appeals court revived a lawsuit filed by the mother of a 10-year-old Pennsylvania girl who died after attempting a viral TikTok challenge.
  • The challenge involved choking oneself until losing consciousness.
  • Federal law generally protects online publishers from liability for user-posted content, but the court suggested TikTok could be liable for promoting or algorithmically steering such content to children.
  • Judge Patty Shwartz stated that TikTok engages in its own first-party speech by recommending and promoting content.
  • TikTok’s parent company, ByteDance, did not immediately comment.

Revival of TikTok Lawsuit Over Child’s Death

A U.S. appeals court has breathed new life into a lawsuit against TikTok following the death of a 10-year-old girl in Pennsylvania. In one of the more alarming instances of a viral social media challenge going wrong, Nylah Anderson attempted a “blackout challenge,” which involves choking oneself until passing out. Her mother found her unresponsive; Nylah died five days later.

The court revived the lawsuit originally filed by Nylah’s mother, Tawainna Anderson, in 2022. Like other parents who have lost children to such challenges, Anderson is seeking to hold platforms accountable and put an end to these dangerous trends. Her suit alleges wrongful death and negligence, arguing that TikTok’s algorithm exposed her daughter to dangerous content by placing the “blackout challenge” on Nylah’s “For You” feed.

Section 230 of the Communications Decency Act

A lower court originally dismissed the lawsuit under Section 230 of the 1996 Communications Decency Act, which shields online platforms from liability for user-generated content. The appeals court has now revived it: Judge Patty Shwartz reasoned that TikTok engages in its own first-party speech when it recommends and promotes content, and the court partially reversed the dismissal, returning the case to the lower court for trial.

“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” Judge Shwartz wrote.

Judge Paul Matey went further, emphasizing that it was TikTok’s algorithm that placed the videos on Nylah’s feed. His reasoning raises broader questions about algorithm-driven content promotion, especially when potentially harmful challenges are surfaced to children.

Broader Implications and Future Debates

The case is part of broader scrutiny of social media platforms’ impact on children’s health. State attorneys general across the U.S. are investigating TikTok and other companies over their role in mental and physical health problems among young people, and the companies face multiple lawsuits alleging that their platforms addict users and cause harm.

Lawyers disagree over whether Congress, when it enacted Section 230, could have anticipated platforms like TikTok using algorithms to recommend content to users. The outcome of this case and others like it could redefine how Section 230 is applied, potentially forcing platforms to reconsider how their algorithms operate and what content is served to users, particularly minors.

Sources

  1. US appeals court revives a lawsuit against TikTok over 10-year-old’s ‘blackout challenge’ death
  2. Mother whose child died in TikTok challenge urges U.S. court to revive lawsuit