
Roblox Introduces Child-Safe Accounts Following Grooming Allegations

Roblox, a widely used social gaming platform, is set to roll out new account types designed specifically for children and teenagers, aimed at enhancing safety features and expanding parental oversight. The initiative follows concerns raised by the Australian government over incidents of child grooming on the platform.

The new account options will be available globally starting in June. The platform will introduce two distinct accounts: “Roblox Kids” for users aged five to eight, and “Roblox Select” for those aged nine to 15. To ensure accurate age categorization, Roblox will implement an age verification system that may include facial recognition technology.

Users of Roblox Kids will have access solely to games rated for “minimal” or “mild” content, with chat functions disabled by default. This account type will feature a distinctive electric blue background to differentiate it from other accounts. In contrast, Roblox Select users can engage with games rated as “moderate” and will have limited chat capabilities introduced gradually, allowing communication with family, friends, or peers of similar ages.

Matt Kaufman, Roblox’s chief safety officer, stated, “While no system is flawless, these age-specific accounts are intended to alleviate uncertainties for parents and ensure that the user experience aligns with their age.” All games will undergo a thorough review before the implementation of these changes, including real-time assessments of how older users (aged 16 and above) interact with the platform.

Additionally, the platform is enhancing its parental control features, empowering parents to block particular games and manage chat functions until their child turns 16. Roblox serves as an online gaming platform that enables users to create and play games made by themselves or others, incorporating messaging capabilities that allow players to communicate through text and voice.

Since 2022, Roblox has experienced significant growth, boasting approximately 111 million daily users, with around 40% of them being under the age of 13. Australia ranks as the second-largest market for Roblox, following the United States, and it is the leading gaming application among Australian children aged four to 18.

In a further move to aid families in navigating age-appropriate content, Roblox plans to collaborate with the Australian Classification Board to assign game ratings starting in 2026. Communications Minister Anika Wells met with Roblox representatives in February to address concerns regarding explicit and harmful content affecting children, including issues related to grooming by online predators.

In her correspondence to Roblox prior to their meeting, Wells expressed her alarm over the ongoing reports of children being targeted by individuals seeking to exploit them. Following significant pressure, Roblox has committed to implementing substantial changes to enhance safety for its younger audience.

Wells remarked that the government will closely monitor the implementation of these changes, emphasizing the need for safety measures on the platform, not only in Australia but globally. “Children should enjoy their favorite games without facing exposure to harmful material,” she added.

While Roblox is not formally covered by the Australian government’s social media restrictions for users under 16, which began in December 2025, it remains subject to potential fines of up to $49.5 million for non-compliance. New regulations covering age-restricted content, such as pornography and self-harm material, took effect on March 9 and also apply to Roblox. These regulations require gaming platforms to take appropriate action against the non-consensual sharing of intimate images, grooming, and sexual extortion.

As part of these broader safety measures, platforms like TikTok and Instagram, which are also popular among young Australians, are adapting to comply with the government’s restrictions. TikTok, owned by ByteDance, has acknowledged its compliance, while experts have warned that such bans could inadvertently push younger users toward less secure areas of the internet.

Instagram, owned by Meta, automatically applies “teen accounts” to users aged 13 to 17, incorporating limitations on interactions and content filtering. Despite these precautions, Instagram is also included in the under-16 social media ban. Similarly, Snapchat remains a favored app among youth, with a considerable portion of its user base under the age of 17.
