On Tuesday, the United States District Court for the Eastern District of Pennsylvania ruled in favor of TikTok, citing Section 230, which shields the social media platform from liability, in the case of a young girl who died after attempting the “Blackout Challenge.”
In December 2021, ten-year-old Nylah Anderson attempted the viral Blackout Challenge, which had been spreading across the platform.
The Blackout Challenge appeared on the young girl’s “For You” page, a landing page on the app driven by an algorithm that “learns her age, location, and her previous application use” to surface the videos a user is most likely to be interested in.
After attempting the challenge, Nylah was found by her mother in a bedroom closet, hanging from a purse strap.
“Three deep ligature marks on Nylah’s neck confirmed that she had suffered while struggling to free herself,” the court document stated.
The girl spent several days in intensive care and later died of injuries connected with the challenge attempt.
Her mother, Tawainna Anderson, brought a lawsuit against the company, alleging that TikTok is liable for her daughter’s death.
“Although the circumstances here are tragic, I am compelled to rule that because Plaintiff seeks to hold Defendants liable as ‘publishers’ of third-party content, they are immune under the Communications Decency Act,” Judge Paul S. Diamond wrote.
Anderson claimed TikTok has a “defective” design, and that the app failed to warn users of the dangers associated with the challenge.
“Anderson urges that she seeks to hold Defendants directly liable for their own acts and omissions as designers, manufacturers, and sellers of a defective product, not for their conduct as publishers,” Diamond wrote.
Diamond wrote that Anderson based her allegations on “Defendants’ presentation of ‘dangerous and deadly videos' created by third parties and uploaded by TikTok users.”
Anderson alleged that TikTok and its algorithm “‘recommend inappropriate, dangerous, and deadly videos to users’; are designed ‘to addict users and manipulate them into participating in dangerous and deadly challenges’; are ‘not equipped, programmed with, or developed with the necessary safeguards required to prevent the circulation of dangerous and deadly videos’; and ‘[f]ail to warn users of the risks associated with dangerous and deadly videos and challenges.’”
Diamond concluded, however, that Anderson’s allegations ultimately rest on the claim that TikTok “published a third party’s dangerous content.”
Section 230 states in part, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” and “No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”
Because Diamond found that Anderson was, in effect, seeking to hold TikTok liable as the publisher of these videos, the social media platform is protected under Section 230, and its motion to dismiss was granted.
“Nylah Anderson’s death was caused by her attempt to take up the ‘Blackout Challenge.' Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it,” Diamond wrote.
“In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts,” he added.