Roblox Rolls Out Proactive Age-Based Blocking To Combat Grooming Risks

Wednesday, 19 November 2025

Author: Afnan Syabil
Roblox introduces automated filters that restrict communication between adult and under-13 accounts, a proactive step to disrupt grooming attempts and enhance platform safety. (Roblox)

United States - In a major escalation of its child protection efforts, Roblox Corporation has unveiled a new automated safety framework designed to preemptively separate adult and minor user bases. The system's primary function is to detect and block direct private interactions initiated by users identified as adults toward accounts belonging to children under the age of 13. This initiative marks a strategic shift from reactive content moderation to proactive environmental design, seeking to eliminate the initial point of contact that predators often exploit.

The technical mechanism relies on a layered approach to age assurance. While a user's self-reported birthday sets an initial flag, the platform will increasingly rely on its optional age verification service, which uses document scanning and facial recognition technology, to confirm users who are 13 and older. Accounts that are either verified as adult or exhibit behavioral patterns consistent with an adult user will be algorithmically restricted from sending private messages, friend requests, or joining the same private servers as accounts tagged as under-13.
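The layered decision described above can be sketched in a few lines of code. This is a purely illustrative model of how such a rule could be expressed; the field names, thresholds, and logic are assumptions for the sake of the example, not Roblox's actual implementation.

```python
from dataclasses import dataclass

UNDER_13 = "under_13"
THIRTEEN_PLUS = "13_plus"

@dataclass
class Account:
    user_id: int
    self_reported_age: int            # initial flag from the stated birthday
    id_verified_adult: bool = False   # optional document/selfie verification
    behavior_flagged_adult: bool = False  # behavioral signals suggest an adult

def age_band(account: Account) -> str:
    """Assign an account to an age band, letting stronger signals
    (verification, behavioral patterns) override the self-report."""
    if account.id_verified_adult or account.behavior_flagged_adult:
        return THIRTEEN_PLUS
    return UNDER_13 if account.self_reported_age < 13 else THIRTEEN_PLUS

def may_contact_privately(sender: Account, recipient: Account) -> bool:
    """Block private messages, friend requests, and shared private
    servers across the adult / under-13 boundary."""
    return age_band(sender) == age_band(recipient)
```

Under this sketch, a verified adult account is denied private contact with an under-13 account even if the adult typed a child's birthday at sign-up, because the verification signal overrides the self-report.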

This policy is designed to target the specific tactic of "grooming," where an adult builds a deceptive relationship with a child to exploit them. By removing the private, one-on-one communication channel that is essential for grooming to begin, Roblox aims to disrupt the predator's playbook at the first stage. The platform's vast array of public experiences will remain accessible, but private, unmoderated spaces where risky interactions can flourish will be heavily restricted for younger users.

The decision follows intensified scrutiny of the platform's role in child safety. Past reports have documented cases where predators, posing as friendly peers, used Roblox's chat and private message systems to build trust with children before moving conversations to other, less-monitored platforms. Facing criticism from parent groups and policymakers, Roblox is now attempting to close this loophole with a technical solution that does not rely solely on human moderators or user reports.

David Baszucki, Co-founder and CEO of Roblox, emphasized the company's commitment to innovation in safety. "We believe we have a responsibility to build not just engaging experiences, but inherently safer ones," Baszucki said. "This new system uses our most advanced AI to understand and enforce these boundaries in real-time, making it harder for anyone with ill intent to operate on our platform." The company plans to continue refining its algorithms based on new data and threats.

Inevitably, the rollout presents challenges. Strict age segregation could complicate positive cross-age interactions, such as a teenager mentoring a younger sibling or a child participating in a family-friendly event hosted by adult creators. Roblox acknowledges this and suggests its "Parental Controls" and "Supervised Friend Requests" will provide avenues for vetted, exception-based interactions, placing the control and oversight firmly in the hands of parents or guardians.

Reaction from the safety community has been largely positive but measured. "Automatically limiting contact between adults and children in digital spaces is a strong, preventive measure that more platforms should consider," noted a child psychologist specializing in online behavior. The expert added that while technical barriers are crucial, they must be part of a holistic strategy that includes parental oversight and media literacy education for children themselves.

By instituting this automated age-based blocking, Roblox is making a clear statement about the future of online safety for minors. It demonstrates a move toward "safety by design," where protective features are built into the platform's architecture. This approach could influence regulatory discussions and set a benchmark that pressures other social gaming and communication platforms to adopt similarly aggressive, preventive measures to protect their youngest users.



