Roblox introduced a slew of new safety measures and parental controls on Monday to help prevent kids under 13 from being targeted by online predators – a known issue spotlighted by child safety advocates this past July.
The new built-in measures address how young users communicate with others on the platform, significantly improve parental controls, and include a revamped content labeling system, Roblox stated in a blog post announcing the changes.
Starting Monday, any Roblox user under the age of 13 will no longer be able to direct message (DM) other users on the online gaming platform without their parent’s consent.
Parents have also been given the ability to view their child’s friend lists, set spending controls, and manage screen time.
As part of the new measures, parents will now be able to monitor their child’s settings and permissions remotely from their own device, whereas previously, they could only do so on their child’s device.
The new restrictions will also prevent users under 13 from searching for, discovering, or playing unlabeled experiences, the company said.
“These changes were developed and implemented after multiple rounds of internal research, including interviews, usability studies, and international surveys with parents and kids, and consultation with experts from child safety and media literacy organizations,” Roblox said.
“New parental controls are live on Roblox. You can set screen time and spending limits, manage your child’s communication settings, and control the type of content your child can access. Learn more: https://t.co/kY5QcZqyii,” Roblox (@Roblox) posted on November 18, 2024.
Another communication safety feature will limit users under age 13 to public broadcast messages only within a game or experience.
The platform’s original age-based content labels will be replaced with descriptors ranging from "Minimal" to "Restricted," indicating the type of content users can expect.
By default, users under nine will only have access to games labeled "Minimal" or "Mild."
Restricted content will remain blocked until a user is at least 17 years old and has verified their age, the company noted.
Preventing sexual predators from targeting kids
With close to 90 million reported users in the last quarter, the immersive gaming platform, where users can program and play their own creations from scratch as well as play and interact with other online gaming enthusiasts, is meant to foster connections between users.
According to Statista.com, children 12 and under make up the largest demographic of Roblox’s users.
Meanwhile, reports of child abuse and of sexual predators targeting and grooming younger users have plagued the free gaming platform since its inception in 2006.
Calling Roblox out for its “grossly insufficient” child safety standards in a scathing article released over the summer, the National Center on Sexual Exploitation (NCOSE) in Washington, DC, said the most popular Roblox experiences (games played with others) are categorized as “role-playing,” and that the platform often “acts as a first point of contact between abusers and their victims.”
The child advocacy organization said that among Roblox's "avatars, blocks, and buildings, kids are exposed to predators, rape-themed games, and age-inappropriate content like sex parties."
Improving safety controls on Roblox and similar platforms “is key in preventing sexual exploitation for the millions of young children who use them,” the NCOSE said.
The NCOSE exposé further highlights four separate cases in 2023 in which child sexual predators physically assaulted – in one case outright kidnapped – underage victims they had met and begun interacting with on the platform.
A 2022 lawsuit filed in San Francisco accused Roblox of facilitating the sexual and financial exploitation of a California girl by adult men who allegedly encouraged her to drink, abuse prescription drugs, and share sexually explicit photos, Reuters reported.
And in August, the Turkish government announced it was banning Roblox from operating in the country due to monitoring challenges and inappropriate sexual content found on the platform.
Roblox was accused of hosting virtual gatherings that promoted pedophilia, local media in Turkey had reported at the time.
Additionally, it was alleged that the platform’s bots were encouraging children to participate in inappropriate activities through its “Robux” currency promotions.
This spring, the NCOSE sent Roblox a letter listing half a dozen safety features it recommended the company implement to protect children. With this latest update, it appears Roblox has met all six requests.