YouTube will not allow content that “realistically simulates” deceased children or victims of violent crimes.
The video platform has updated its harassment and cyberbullying policy to ban content that “realistically simulates deceased minors or victims of deadly or well-documented major violent events describing their death or violence experienced.”
The updated policy will take effect on January 16th, said Google, which owns YouTube.
It is part of the company's wider “responsibility efforts,” which have previously included banning content about Covid-19 vaccines that contradicts consensus from health authorities and using an algorithm to apply age restrictions to relevant videos.
The policy change follows the proliferation of true crime videos on platforms like TikTok and YouTube that recreate the likenesses of dead or missing children and depict them narrating the stories of what happened to them.
Some of the videos used AI-generated depictions of James Bulger, a British two-year-old abducted and murdered in 1993, and Madeleine McCann, a three-year-old British girl who disappeared in Portugal in 2007.
Bulger’s mother described the AI clips of her son as “disgusting” and “beyond sick,” according to The Mirror.
TikTok subsequently began removing such videos, stating that “our Community Guidelines are clear that we do not allow synthetic media that contains the likeness of a young person,” but many remained on YouTube.
Last year, YouTube began requiring creators to label “synthetic” content, warning that failure to do so could result in suspension or other penalties. It also said creators and artists would be able to request the removal of content that simulates their likeness without consent.