Anderson v. TikTok, Inc., No. 22-3061 (3d Cir. 2024)
A ten-year-old girl named Nylah Anderson died after attempting the "Blackout Challenge," a dangerous activity promoted in a video recommended to her by TikTok's algorithm. Her mother, Tawainna Anderson, sued TikTok and ByteDance, Inc., alleging that the companies were aware of the challenge, allowed such videos to be posted, and promoted them to minors, including Nylah, through their algorithm.
The United States District Court for the Eastern District of Pennsylvania dismissed the complaint, ruling that TikTok was immune under Section 230 of the Communications Decency Act (CDA), which protects interactive computer services from liability for content posted by third parties. The court found that TikTok's role in recommending the video fell under this immunity.
The United States Court of Appeals for the Third Circuit reversed the District Court's decision in part, vacated it in part, and remanded the case. The Third Circuit held that TikTok's algorithm, which curates and recommends videos, constitutes TikTok's own expressive activity, or first-party speech. Because Section 230 of the CDA provides immunity only for third-party content, it does not shield TikTok from liability for its own recommendations. The court therefore concluded that Anderson's claims were not barred by Section 230, allowing the lawsuit to proceed.