The Canadian government is the latest group to express concerns about child safety in Roblox.
As reported by The Logic, the feds were presented with a brief by the Canada Centre for Community Engagement and Prevention of Violence about the risks of bad actors using Roblox to spread hateful content and groom children. This is a particular problem considering close to 100 million people play Roblox every day, with nearly half of them being under 13.
The Canada Centre report explains that because of its mass popularity and focus on user-generated experiences, Roblox "may impact youth radicalization in unexpected ways." It adds that this makes moderation difficult, especially in situations where adults lure children off of Roblox and onto separate platforms like Discord.
Of course, developer Roblox Corp. pushed back against these concerns, telling The Logic that “safety is at the core of everything we do” while touting protective measures like “advanced AI-powered detection, monitoring teams, 24/7 moderation, and robust user reporting tools.”
But regardless of what the company might say, there has long been concern that Roblox Corp. isn't doing enough. For one, the gaming giant has historically been slow to implement tools like parental controls, but on a deeper level, the scale of Roblox, which allows for the creation of tens of millions of user-generated experiences, makes moderation extremely difficult in general.
For instance, we’ve seen how people use Roblox to recreate infamous American mass shootings like Columbine, Uvalde and Parkland and terrorist attacks carried out by white supremacists, and this has even radicalized people to commit real acts of violence. On top of that, it’s long been documented how sexual predators have used Roblox to target children, like how a New Jersey man was arrested for grooming a minor online and having her transported to him across state lines. (I highly recommend checking out a TVO documentary from last year, Dangerous Games: Roblox and the Metaverse Exposed, which explores these issues in great detail.)
The timing of Canada Centre’s report coincides with the federal government’s ongoing discussions about potentially banning children under the age of 16 from social media. While that’s currently only intended to target platforms like Instagram and TikTok, the government acknowledged in the wake of Canada Centre’s brief that video games also must be considered, albeit in a separate manner.
Speaking to The Logic, Culture Minister Marc Miller said video games are "a bit of a different animal," but that the more they become "social media-ish, the more they expose themselves to responsibility and potentially regulation." He's expected to table legislation related to online safety for children later this year, so it remains to be seen how video games, especially Roblox, might factor into that conversation.
Image credit: Roblox
Source: The Logic Via: Game Developer
