
Ruling in the case of the death of a 10-year-old during the “Blackout Challenge”

PHILADELPHIA, Aug. 28, 2024 (GLOBE NEWSWIRE) — A three-judge panel of the U.S. Court of Appeals for the Third Circuit has ruled that TikTok is not entitled to unlimited, sweeping immunity under the highly controversial Section 230 of the federal Communications Decency Act. The ruling and a scathing concurring opinion were issued yesterday in Estate of Nylah Anderson v. TikTok, Inc., et al., Case No. 22-3061 (on appeal from the U.S. District Court for the Eastern District of Pennsylvania, Case No. 2:22-cv-01849).

Ten-year-old Nylah Anderson of suburban Philadelphia died of suffocation at home on December 12, 2021, while attempting the “Blackout Challenge,” which she had seen on her TikTok “For You” page. Tawainna Anderson, Nylah’s mother, represented by Saltz Mongeluzzi Bendesky, filed suit in 2022 to hold TikTok accountable for the dangerous conduct that caused her daughter’s preventable death. TikTok defended against the claims by arguing that Section 230 of the Communications Decency Act made it completely immune from liability for the harm it caused. Yesterday, the appeals court definitively ruled that TikTok is not entitled to that protection.

Ms. Anderson's lawyers said the Third Circuit's ruling “affirms that the court's doors remain open to Tawainna Anderson in her tireless pursuit of justice on behalf of her late daughter.”

Nylah Anderson/Family photo

Jeffrey Goodman, a partner at Saltz Mongeluzzi Bendesky who argued on behalf of the Anderson family in the Third Circuit, said: “The big technology companies just lost their get-out-of-jail-free card.”

“This ruling ensures that the powerful social media companies must abide by the same rules as all other corporations,” he continued. “And if they unscrupulously harm children, they will be brought to justice.”

In his concurring opinion, Judge Paul Matey stated, in part: “Nylah, who was still in her first year of adolescence, probably had no idea what she was doing or that following the images on her screen would kill her. But TikTok knew Nylah would watch because the company's customized algorithm placed the videos on her 'For You' page.”

While the Anderson family continues to mourn Nylah's death, they welcomed the court's decision and said through their lawyer: “Nothing will bring our beautiful little girl back. But we take comfort in knowing that by holding TikTok accountable, we can help other families avoid future, unimaginable suffering. Social media companies must use their technology to prevent young children from consuming dangerous content; they must stop exploiting children in the name of profit.”

Samuel Dordick, a partner at Saltz Mongeluzzi Bendesky who worked with Mr. Goodman, added: “For decades, Big Tech companies like TikTok have used Section 230 to shield themselves from accountability for their egregious and predatory conduct. This sweeping ruling has made it crystal clear that Section 230 does not extend that far.”

NOTE: All interview requests should be directed to Saltz Mongeluzzi Bendesky. The court's opinion and court records can be found on the firm's website: www.smbb.com.

Contacts:
Jeffrey P. Goodman / 215-840-6450 / [email protected]
Steph Rosenfeld / 215-514-4101 / [email protected]

A photo accompanying this announcement is available at: