LONDON: Online gaming giant Roblox has announced sweeping new safety measures aimed at protecting its youngest users, with stricter age controls set to roll out before the end of this year.
The company’s Chief Safety Officer, Matt Kaufman, revealed in a recent blog post that Roblox will extend age estimation and verification to all users who access on-platform communication tools. The move is designed to limit interaction between adults and children unless a verified real-world connection exists.
Roblox, which attracts nearly 100 million daily active players worldwide—with more than 40 percent under the age of 13—has faced mounting criticism over child safety lapses. Parents, regulators, and watchdog groups have repeatedly raised alarms about the platform being misused for inappropriate contact and exploitation.
To address these concerns, Roblox will employ a three-tier verification system: facial age estimation, official ID checks, and parental consent requirements. This marks one of the platform’s most ambitious steps yet to bolster safety and accountability.
The announcement comes amid growing international scrutiny. Lawmakers in the UK and EU have passed tougher online safety laws, compelling platforms to enforce stricter user verification. In the United States, Roblox is also battling lawsuits, including one filed in Louisiana accusing the platform of failing to prevent child exploitation.
Earlier this year, short-seller Hindenburg Research accused Roblox of falling short on transparency and user safety. The company’s new measures appear to be a direct response to these pressures, as well as an effort to restore trust among parents and policymakers.
With its enormous young user base, Roblox’s changes could reshape how online gaming platforms approach child safety in the future, placing added responsibility on tech companies to safeguard minors in digital spaces.
This story has been reported by PakTribune. All rights reserved.