Appeals Court Reignites TikTok Lawsuit Over 10-Year-Old’s ‘Blackout Challenge’ Death

The mother of a 10-year-old Pennsylvania girl who tragically died after participating in a TikTok challenge has received the green light to proceed with her lawsuit against the social media giant.

In December 2021, young Nylah Anderson died after attempting the viral “Blackout Challenge,” which encourages participants to choke themselves until they pass out. According to the lawsuit filed against TikTok and its parent company ByteDance, Nylah’s mother, Tawainna Anderson, discovered her daughter hanging from a purse strap in her bedroom. Despite days in a pediatric intensive care unit, Nylah ultimately succumbed to her injuries.

Anderson’s legal action claims that the app and its algorithm are deliberately designed to maximize user engagement and addiction, particularly among children, through a feedback loop of watching and sharing viral challenges and other videos.

“TikTok is programming children for the sake of corporate profits and promoting addiction,” the lawsuit asserts.

Initially, a federal judge dismissed the case in October 2022, citing Section 230 of the Communications Decency Act of 1996, which generally shields online platforms from liability for content provided by third parties. Judge Paul S. Diamond reasoned that TikTok’s algorithm merely surfaced content of potential interest and therefore fell within Section 230’s protection.

However, a federal appeals court has now reversed that decision. The reversal rests on a recent Supreme Court ruling that social media networks engage in First Amendment-protected “expressive activity” when they curate content, a holding that could expose TikTok to liability for Nylah’s death.

U.S. Circuit Judge Patty Shwartz explained that Section 230 protects only information provided by another party. Since Anderson’s lawsuit is based on TikTok’s recommendations via its algorithm—which is considered TikTok’s own expressive activity—it is not shielded by Section 230.

The appeals court leaned heavily on a unanimous Supreme Court decision concerning state laws that sought to counteract perceived censorship of conservative viewpoints on social media. In that ruling, the justices held that a platform’s curation of third-party content is itself expressive activity protected by the First Amendment, and sent the challenges to the state statutes back to the lower courts for further review.

Applying this framework, the appeals court concluded that Anderson’s case should move forward.

Judge Shwartz noted that the Supreme Court had acknowledged that compiling third-party speech into a curated experience amounts to the platform’s own expressive activity. Therefore, TikTok’s algorithm does not merely distribute third-party content but engages in expressive activity.

The court pointed out a specific distinction: Nylah did not seek out the Blackout Challenge video; it was promoted to her through TikTok’s algorithm. This makes TikTok more than just a repository of content; it actively promotes specific content.

The judges clarified that any of Anderson’s claims that are not based on TikTok’s algorithm might still be barred by Section 230, a determination they left to the lower court.

U.S. Circuit Judge Paul Matey concurred in part and dissented in part. He criticized TikTok for its “casual indifference” to Nylah Anderson’s death and argued that the company should be held accountable for knowingly distributing harmful content.

Matey argued that Section 230 has been expansively interpreted to protect content distribution platforms, even when these platforms facilitate harmful content.

He condemned the broad interpretation of Section 230, suggesting it allows companies to evade responsibility for harmful activities conducted on their platforms.

For Matey, the time for greater accountability has come. He wrote that while TikTok may choose to prioritize profit over the common good, it cannot claim an immunity from liability that Congress never provided when its choices lead to devastating harm.

Source: Law & Crime