Roblox to ban young children from messaging others



Roblox has announced it will block under-13s from messaging others on the online gaming platform as part of new efforts to safeguard children.

Child users will not be able to send direct messages within games by default unless a verified parent or guardian gives them permission.

Parents will also be able to view and manage their child’s account, including seeing their list of online friends, and setting daily limits on their play time.

Roblox is the most popular gaming platform for eight to 12 year olds in the UK, according to Ofcom research, but it has been urged to make its experiences safer for children.

The company said it would begin rolling out the changes from Monday, and they will be fully implemented by the end of March 2025.

It means young children will still be able to access public conversations seen by everyone in games – so they can still talk to their friends – but cannot have private conversations without parental consent.

Matt Kaufman, Roblox’s chief safety officer, said the game is played by 88 million people each day, and over 10% of its total employees – equating to thousands of people – work on the platform’s safety features.

“As our platform has grown in scale, we have always recognised that our approach to safety must evolve with it,” he said.

Besides blocking under-13s from sending direct messages (DMs) across the platform, Roblox will give parents more ways to easily see and manage their child's activity.


The platform says parents will be able to more easily manage controls such as what content their child sees and when they can send direct messages

Parents and guardians must verify their identity and age with a form of government-issued ID or a credit card in order to access parental permissions for their child, via their own linked account.

But Mr Kaufman acknowledged identity verification is a challenge being faced by a lot of tech companies, and called on parents to make sure a child has the correct age on their account.

“Our goal is to keep all users safe, no matter what age they are,” he said.

“We encourage parents to be working with their kids to create accounts and hopefully ensure that their kids are using their accurate age when they sign up.”

Maturity guidelines

Roblox also announced it planned to simplify descriptions for content on the platform.

It is replacing age recommendations for certain games and experiences with "content labels" that simply outline the nature of the game.

It said this meant parents could make decisions based on the maturity of their child, rather than their age.

These range from “minimal”, potentially including occasional mild violence or fear, to “restricted” – potentially containing more mature content such as strong violence, language or lots of realistic blood.

By default, Roblox users under the age of nine will only be able to access “minimal” or “mild” experiences – but parents can allow them to play “moderate” games by giving consent.

But users cannot access "restricted" games until they are at least 17 years old and have used the platform's tools to verify their age.

It follows an announcement in November that Roblox would be barring under-13s from “social hangouts”, where players can communicate with each other using text or voice messages, from Monday.

It also told developers that from 3 December, Roblox game creators would need to specify whether their games are suitable for children and block games for under-13s that do not provide this information.

The changes come as platforms accessed and used by children in the UK prepare to meet new rules around illegal and harmful material on their platforms under the Online Safety Act.

Ofcom, the UK watchdog enforcing the law, has warned that companies will face punishments if they fail to keep children safe on their platforms.

It will publish its codes of practice for companies to abide by in December.
