Roblox Uses Selfie AI to Estimate Your Age
Roblox is introducing a suite of enhanced safety features designed specifically for teenagers aged 13 to 17. Among them is a new AI-powered age estimation system, which analyzes a user-submitted video selfie to estimate the user's age group.
The updates aim to strengthen safety protocols for teens and younger children on the Roblox platform. A central component grants users aged 13 to 17 more autonomy than younger players while still keeping stronger safeguards in place than those applied to adults. Notably, teens will now be able to establish "trusted connections" with other users, and conversations with those trusted contacts will be exempt from the standard chat filters. According to Roblox, the feature is intended to foster safer, monitored communication within the platform itself, discouraging users from moving conversations to unmoderated third-party apps where risks are higher.
Trusted connections are meant for individuals who know each other in real life. If a teenager wishes to designate someone over 18 as a trusted contact, the process requires additional verification through a QR code scan or a contact importer tool.
Previously, Roblox relied primarily on government ID submission to verify that a user was 13+ or 18+ before unlocking certain chat features. The platform is now offering an alternative: users can opt to submit a video selfie, which an AI system analyzes against what Roblox describes as a broad and diverse dataset to estimate whether the person is over 13. Companies such as Google and Meta have tested similar age-check features in recent years.
Alongside these features, Roblox is rolling out additional user controls, including options to manage online status, a "Do Not Disturb" mode, and expanded parental controls for accounts linked to teens.
Roblox has faced significant scrutiny over child safety on its platform for years. In 2018, reports emerged of a seven-year-old's avatar being sexually assaulted in-game and a six-year-old being lured into a virtual "sex room." A 2021 investigation by People Make Games alleged that the platform's business model exploits child labor. The company was sued in San Francisco in 2022 over accusations that it enabled the financial and sexual exploitation of a 10-year-old girl. Further lawsuits in 2023 alleged that the platform facilitated an illegal gambling ecosystem and that inadequate safety measures exposed children to adult content. A 2023 Bloomberg investigation also highlighted a troubling prevalence of child predators on Roblox. That same year, the platform reported more than 13,000 cases of suspected child exploitation to authorities, leading to 24 arrests.
"Safety is the foundation of everything we do at Roblox," stated Chief Safety Officer Matt Kaufman in an announcement about the new features. "We aim to set the global standard for safety and civility in online gaming. Our commitment is to deliver deeply engaging and empowering experiences for all ages, while continually innovating how our community connects and interacts."