Court rules that TikTok may not be exempt from liability in lawsuit filed by family of 'Blackout Challenge' victim



In a lawsuit filed by a mother against TikTok over the death of her 10-year-old daughter, who participated in the 'Blackout Challenge' (also called the fainting challenge), in which people film themselves passing out and share the video, an appellate court has overturned the first-instance ruling that dismissed the plaintiff's claim under the liability shield of Section 230 of the Communications Decency Act. The appeals court stated that 'recommendations may have been made by an algorithm, and in such cases the app cannot avoid liability.'

'Blackout Challenge' lawsuit against TikTok may proceed

https://lawandcrime.com/lawsuit/tiktok-lawsuit-over-10-year-old-girl-who-died-after-blackout-challenge-reignited-after-appeals-court-ruling/

TikTok told it must face lawsuit over deadly viral challenge, despite Section 230 protections - SiliconANGLE
https://siliconangle.com/2024/08/28/tiktok-told-must-face-lawsuit-deadly-viral-challenge-despite-section-230-protections/

In December 2021, the daughter of Tawainna Anderson of Pennsylvania, USA, lost consciousness while attempting the blackout challenge and died shortly afterward despite treatment. The blackout challenge spread as one of TikTok's 'trends' and was attempted mainly by teenagers despite its risks, resulting in the deaths of more than 15 children over the year and a half from 2020 to 2022.

More than 15 children under the age of 12 have died in the past year and a half due to TikTok's blackout challenge. Why can't TikTok exclude users under the age of 13? - GIGAZINE



Anderson argued that TikTok is designed to maximize user attention and encourages children to attempt trending challenges themselves and share the results. She filed a lawsuit against TikTok, but the suit was dismissed by a federal judge.

The judge cited Section 230 of the Communications Decency Act, under which 'providers such as TikTok are, in principle, not liable for content transmitted by third parties,' and dismissed Anderson's claim on the grounds that her daughter's death stemmed from third-party content rather than from TikTok's own conduct.



Anderson appealed, and the federal appeals court overturned the first-instance ruling, stating that TikTok may bear some responsibility. The appeals court pointed out that if TikTok displayed the blackout challenge to users as a 'recommendation,' then TikTok itself engaged in expressive activity by 'recommending content,' and that such first-party activity falls outside the liability exemption of Section 230 of the Communications Decency Act.

The backdrop to this reasoning is a Supreme Court case, decided about a month earlier, on whether state laws in Texas and Florida that limit social media platforms' ability to control whether and how third-party content is shown to other users violate the First Amendment to the Constitution, which guarantees freedom of speech and of the press.

The Supreme Court held that algorithmic curation, such as recommendation feeds, is 'expressive activity' in which a platform aggregates third-party content for its own purposes, and that state laws restricting such activity therefore raise First Amendment problems.

Because that ruling treated a platform's curation as a 'first-party act,' the federal appeals court hearing Anderson's case reasoned that TikTok's algorithmic video recommendations are first-party expression and therefore may fall outside Section 230 of the Communications Decency Act, which grants immunity only to distributors of third-party content.

However, this reasoning applies only to content that TikTok recommends. If a user actively searches out content, TikTok may still be protected by the Section 230 exemption.

in Note, Web Service, Posted by log1p_kr