Starting in December 2025, Roblox will block children from chatting with adult strangers or much older teens, using facial age estimation to ensure kids can only talk with others in a similar age group.
According to The Guardian, the move follows lawsuits claiming the online gaming platform — now reaching 150 million daily players — has been exploited by predators to groom children as young as 7, alleging its design made them “easy prey for paedophiles.”
Roblox’s new system will sort users into age bands: under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. Children will be allowed to chat only with users in their own or adjacent bands — for instance, a 12-year-old will only be able to communicate with those under 16. Roblox says it will not store the images or videos used for age checks.
Roblox will first introduce the change in Australia, New Zealand, and the Netherlands, with a global rollout scheduled for early January 2026.
“We see it as a way for our users to have more trust in who the other people they are talking with are in these games,” Roblox Chief Safety Officer Matt Kaufman told The Guardian. “And so we see it as a real opportunity to build confidence in the platform and build confidence amongst our users.”
Predator Allegations Target Roblox Chat Features
One recent lawsuit filed in the U.S. District Court of Nevada alleges that a “dangerous child predator” targeted a 13-year-old girl by building an emotional connection and manipulating her into sharing her phone number, per The Guardian. He then allegedly sent graphic messages and coerced the girl into sending explicit images and videos of herself.
The suit claims that “had [Roblox] taken any steps to screen users before allowing them on the apps, [the girl] would not have been exposed to the large number of predators trolling the platform.” It also says the predator wouldn’t have been able to harm the girl if age and identity verification had been used.
A Roblox spokesperson said the company is “deeply troubled by any incident that endangers any user” and prioritizes community safety.
“This is why our policies are purposely stricter than those found on many other platforms,” the spokesperson said, The Guardian reports. “We limit chat for younger users, don’t allow user-to-user image sharing, and have filters designed to block the sharing of personal information.
“We also understand that no system is perfect and that is why we are constantly working to further improve our safety tools and platform restrictions to ensure parents can trust us to help keep their children safe online, launching 145 new initiatives this year alone,” the person added.
Roblox says it will be the first online gaming or communication platform to mandate age verification for chatting.
“It’s not enough just for one platform to hold a high standard for safety,” Kaufman said, per The Guardian. “We really hope the rest of the industry follows suit with some of the things that we’re doing, to really raise the protections for kids and teens online everywhere.”