
Roblox Will Roll Out Age-Based Accounts Amid Child Safety Push

Evolving global laws intended to limit children’s access to harmful content, along with incidents and lawsuits related to youth online safety, are pushing Roblox to change how it manages children’s accounts and verifies users’ ages.

Starting in early June, the popular online game platform plans to split its young users into two new account types: “Roblox Kids” for those aged 5 to 8, and “Roblox Select” for those aged 9 to 15. Everyone 16 and older will fall into an age group simply called “Roblox.”

The company said that Kids and Select accounts will have distinct background treatments across Roblox apps to make it clear which account type is in use. Accounts will be assigned to age groups using the platform’s global age verification technology or verified parental consent.

CEO and founder of Roblox, David Baszucki, detailed the changes in a blog post on Monday, including limits for each type of account and parental control options that will soon be available on Roblox.

“When it comes to safety, we do the right thing, including active filtering, age checks, parental controls, and providing clear content ratings, because the well-being of our community is our top priority,” Baszucki said in the post.

Chat functionality has been a particular point of criticism against Roblox, which has faced scrutiny and lawsuits related to online grooming, where adult abusers contact children through unsupervised chats. In a recent incident in the UK, a 19-year-old contacted a 14-year-old through a Roblox chat, then encouraged her to move to other messaging platforms, where she continued to engage in “highly sexual” conversations and share intimate photos and videos.

Under the new age groups, Kids accounts will have chat turned off by default, and access will be limited to games with lower content maturity ratings. For Select accounts, chat connections will be “gradually introduced with protections,” and access will be limited to games up to a medium content maturity rating.

Pressure to verify age

As more scrutiny is placed on social media and gaming platforms that attract younger audiences, various countries and states have introduced laws requiring platforms to verify the age of users, often requiring government-issued IDs or parental consent to create accounts.

Companies like Discord, OpenAI and Google-owned YouTube have taken different approaches to introducing age verification technology. Some are using AI to estimate users’ ages in order to keep young people away from inappropriate content or contact.

Discord, for example, introduced age verification on its platform, a process it says is largely automated, but the move was met with significant backlash. The company ultimately delayed its age verification requirements.

Part of the challenge in deploying age verification technology is avoiding disruption to how people use these platforms. Companies must also build systems that are difficult to cheat or bypass while complying with regulations that vary across states. Age verification laws face opposition from privacy and free speech advocates as well, who argue such laws could violate First Amendment protections and create privacy risks.
