TikTok, the popular video-sharing app, is being sued by parents who allege it showed videos that encouraged seven children to choke themselves to death.
The so-called “blackout challenge” displayed videos of other people trying techniques that “encourage users to choke themselves with belts, purse strings, or anything similar until passing out.” All of the children who reportedly died after trying them are said to have been aged between eight and 14.
The deaths by strangulation mostly occurred last year in Italy, Australia, and the US, with the two most recent lawsuits being filed on behalf of Lalani Walton, eight, and Arriani Arroyo, nine.
The mother of one of the earlier alleged victims, Nylah Anderson of Pennsylvania, claims that TikTok “pushed exceedingly and unacceptably dangerous challenges.”
In an attempt to shore up its reputation, TikTok has blocked blackout challenge videos, instead directing those who search for them to a warning screen that declares “some online challenges can be dangerous, disturbing, or even fabricated.”
This is unlikely to deter the lawsuit filed on behalf of Arroyo, which states that the alleged victim was not actively searching for the video but instead had it presented on TikTok’s “For You” page.
The lawsuit claims that TikTok’s algorithm “specifically curated and determined that these Blackout Challenge videos – featuring users who purposefully strangulate themselves until losing consciousness – are appropriate and fitting for small children.”
Such challenges are said to be an integral part of TikTok’s platform, ranging from more benign dance dares to what appears to have been a far more dangerous and tragic undertaking.
This is not the first time the Chinese-owned platform has gotten into trouble over its underage users. In 2019, TikTok agreed to pay $5.7 million for allowing children to sign up without parental consent. Following that settlement, it did introduce a “family pairing mode” that allows parents to control how much time their children spend on the app.
The Arroyo lawsuit asserts that TikTok has a “duty to monitor the videos and challenges shared, posted, or circulated on its app and platform to ensure that dangerous and deadly videos and challenges were not posted, shared, circulated, recommended, or encouraged.”