The TikTok 'fainting challenge' has killed at least 15 children aged 12 or younger in the past 18 months. Why can't users under 13 be kept off the platform?

TikTok, which is extremely popular among young people, hosts a huge variety of user-posted videos, but extremely dangerous challenge videos, such as the 'blackout challenge' (also known as the 'fainting challenge'), have become a serious problem.
'Blackout Challenge' on TikTok Is Luring Young Kids to Death - Bloomberg
https://www.bloomberg.com/news/features/2022-11-30/is-tiktok-responsible-if-kids-die-doing-dangerous-viral-challenges
A wide variety of 'challenges' regularly pop up on TikTok, and many users post videos of themselves trying them out. Some are bizarre but harmless, like the 'pee your pants challenge,' in which users wet themselves on camera, or feeding a cat a raw egg to see what happens, while others, like the 'skull-shattering challenge,' are outrageously dangerous.
These challenges trace back to Musical.ly, a lip-syncing app that TikTok's owner ByteDance acquired and merged into the platform in 2018. For several years, Musical.ly accepted children under the age of 13 who had been kicked off competing platforms, and as of 2016 many of its top users were minors.
At a public conference in 2016, Musical.ly co-founder Alex Zhu said that what set Musical.ly apart from other entertainment apps was the variety of 'challenges' it promoted and encouraged users to take part in. The trend carried over to TikTok after the merger and resonated with many young people during the 2020 COVID-19 pandemic, making TikTok challenges a global phenomenon.
Eventually, TikTok challenges grew more dangerous: climbing stacks of milk crates, chugging the allergy drug Benadryl, destroying school equipment, and even deadly stunts like the skull-shattering challenge and the fainting challenge. The fainting challenge involves deliberately cutting off oxygen to the brain, for example by hanging, to produce a brief feeling of euphoria. A Bloomberg investigation found that at least 15 children aged 12 or younger have died attempting the fainting challenge in the past 18 months.

'Choking games' similar to the fainting challenge have long circulated among young people, predating TikTok.
TikTok's moderation team does crack down on dangerous challenges, but users evade detection by avoiding direct terms like 'blackout challenge' and 'choking game' in favor of harder-to-flag words such as 'flatliner' and 'space monkey,' and by deliberately misspelling words.
Dangerous challenges like the fainting challenge pose a particular risk to children, who cannot fully grasp the risks or the seriousness of the consequences. TikTok prohibits use by anyone under the age of 13, but in practice many younger children use the app by lying about their age, and moderation has not kept up. A survey conducted in the UK found that roughly half of children aged 8 to 11 watch TikTok every day.
In January 2021, 10-year-old Antonella Sicomero died by hanging from the belt of her bathrobe in Sicily, Italy, and in February, 10-year-old Arriani Arroyo died the same way in Milwaukee, Wisconsin. Both girls were heavy TikTok users, and based on testimony from siblings and friends, their families believe the deaths were accidents that occurred while the girls were attempting a popular 'game' on TikTok. Since then, more young children around the world have died by hanging, and their families suspect TikTok challenges may be involved.
Most police departments investigating these cases have not publicly linked TikTok to the deaths. However, police in Clarksville, Tennessee, who investigated the July 2021 death of 8-year-old Lalani Walton, found through phone analysis that Lalani had spent several hours watching 'fainting challenge' videos on TikTok the day before she died.
In the United States, several lawsuits have been filed against TikTok by the families of the deceased, but TikTok has denied that the content it recommended caused the deaths. The company has also filed motions to dismiss the lawsuits, arguing that it is protected by Section 230 of the Communications Decency Act, which provides that platform services are not liable for information posted by third parties.

In recent years, there has been a growing movement to require platforms like TikTok not only to moderate content but also to implement protective measures that keep users under the age of 13 off the service in the first place. The California Age-Appropriate Design Code Act (AB-2273), passed in August 2022, requires strict child protection measures for apps and websites likely to be accessed by users under the age of 18.
This is expected to lead to the introduction of age verification systems using facial photos, making it a difficult challenge for platforms to balance privacy and child protection.
A bill requiring child protection on a large number of apps and websites has been passed, accelerating the risk of 'net user face scanning' - GIGAZINE

TikTok reportedly met in 2021 with companies such as Yoti and Hive, which offer software that estimates age from facial images, but ultimately decided not to adopt it. An anonymous source told Bloomberg that TikTok, already under scrutiny from politicians over its alleged ties to the Chinese government, feared that introducing a facial-photo age verification system would expose it to even more.
When Bloomberg sent Hive a video of Arriani, the girl who died in Milwaukee, and asked the company to estimate her age with its software, it correctly determined that she was 10 years old in just three seconds. Hive CEO Kevin Guo said, 'The technology to estimate children's ages is definitely here,' adding, 'Platforms that are refusing to adopt it probably don't want to know the extent of the problem.'
in Education, Web Service, Posted by log1h_ik
