How we protect our Roblox community
Roblox’s vision is to connect billions of people through shared experiences in the Metaverse. Safety and civility are foundational to this vision and to creating a platform where everyone feels welcome. That’s why we’ve always made it a key priority to ensure our community members can connect, create, and come together in a space that is welcoming, safe, inclusive, and respectful. We’ve spent over a decade building a stringent safety system and policies that we are proud of and that we continuously evolve as our community grows.
Our Community Standards set clear expectations for how to behave on Roblox. They require that everyone treat each other with civility and respect, and they help keep everyone safe by prohibiting content and behavior that may be inappropriate or harmful. Our Trust and Safety team uses these Standards to moderate content and respond to abuse reports from users. Users found to be violating these Standards may be suspended or removed from the platform. In some cases, Roblox also works proactively with authorities to report violent threats, child endangerment, or other serious real-world harm. We update our Community Standards periodically as the needs of our community and the product change.
Our Teams & Tools
Proactive safety review of uploaded assets: We conduct a safety review of every uploaded image, audio, and video file, using a combination of machine detection and review by a large team of human moderators, before these assets become available on our platform for our community to use in their creations.
Parental Controls: We offer parental controls and features such as “Account Restrictions,” so parents and caregivers have the option to limit chat to a curated list of contacts for their kids’ accounts, or turn it off altogether. They can also choose to lock their kid’s “Contact Settings” so they cannot be changed. Some developers also offer private servers (for free or for a small fee, depending on the developer) so that our community members can enjoy experiences with only their friends and people they choose to be included in the server. We have our opt-in Age Verification for anyone 13 years of age or older with a government-issued ID—this service helps enable age-appropriate experiences and capabilities for verified users, and lets them express themselves in a safe and respectful way. Looking forward, we are also developing experience guidelines to match users with content that’s appropriate for their age, giving parents even more control over how their children interact with Roblox.
As we continue to invest in building more tools and controls to protect our community, we know that parents of our younger users also want more visibility into, and control over, how their kids purchase Robux. Our parental controls are now the one-stop shop for parents to set monthly spending restrictions on their under-13 kids’ accounts and to elect to receive email notifications from Roblox about all spending activity. To receive email notifications, parents need to add their email address to the user account under “Account Info” in Settings. The parent PIN is required for any changes to these settings, providing an additional layer of security. For additional payment security, Roblox also includes measures like payment verification charges, where parents may be asked to verify micro-transactions with our payment provider, and the platform does not store full billing information for any account.
Reporting: We actively encourage our users to report any activity or content they feel uncomfortable or concerned about. Users can easily mute or block players they encounter in games and report inappropriate content or behavior using our Report Abuse system. When a user reports a rule violation, our safety team assesses the complaint and, where it is actionable, takes appropriate action as quickly as possible. That action can include temporary suspension, account or content removal, or proactive reporting to authorities.
Chat filtering: We filter all text chat on the platform to block inappropriate content (such as discriminatory speech, bullying, extremism, violence, and sexual content), as well as personal information and instructions on how to move off the platform. We update our filters continuously, multiple times a day. For users under 13, our filters are even stricter and also block potentially identifiable personal information, slang, and similar content. To maintain a safe platform and prevent inappropriate content that violates our policies, users are also unable to exchange images or videos via chat or one-to-one user interactions on Roblox.
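To give a rough sense of how tiered text filtering of this kind can work, here is a toy sketch. It is purely illustrative and is not Roblox’s actual filter: the term lists, the patterns, and the `filter_chat` function are all invented for this example, and real systems use far more sophisticated detection.

```python
import re

# Hypothetical example lists -- a real filter would use much larger,
# continuously updated rule sets and machine-learned classifiers.
BLOCKED_TERMS = {"badword"}            # terms blocked for all users
STRICT_TERMS = {"examplestreet"}       # extra terms blocked for under-13 chat

# Patterns that look like personal information (phone numbers, emails).
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),        # phone-number shape
    re.compile(r"\b[\w.+-]+@[\w-]+\.[a-z]{2,}\b", re.I),     # email-address shape
]

def filter_chat(message: str, under_13: bool) -> str:
    """Replace disallowed spans with '#' characters, a common chat-filter style."""
    out = message
    # Mask anything that looks like personal information.
    for pat in PII_PATTERNS:
        out = pat.sub(lambda m: "#" * len(m.group()), out)
    # Mask blocked terms; under-13 users get the stricter list on top.
    terms = BLOCKED_TERMS | (STRICT_TERMS if under_13 else set())
    for term in terms:
        out = re.sub(re.escape(term), "#" * len(term), out, flags=re.I)
    return out
```

For example, `filter_chat("call me at 555-123-4567", False)` masks the phone number, and the stricter under-13 list applies only when the second argument is `True`.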
Collaboration with safety organizations and other platforms: We have partnerships with over 20 leading global organizations that focus on child safety and internet safety, including the WePROTECT Global Alliance, the Internet Watch Foundation (IWF), the UK Safer Internet Centre, Fair Play Alliance, Family Online Safety Institute (FOSI), Digital Wellness Lab, Connect Safely, and kidSAFE, among others.
- We are also a member of various industry organizations, such as UKIE and the Technology Coalition, with a goal of cross-industry collaboration in the areas of user safety and child safety. As a member of the Technology Coalition, we are committed to the Voluntary Principles, including transparency on our efforts to combat online child sexual exploitation and abuse.
- Additionally, we work with individual companies where we see opportunities to share learnings and development efforts. For example, we worked with Microsoft on a cross-industry project to develop an AI-based technique that scans text-based chats and provides better tools for grooming detection.
- And we work closely with other chat, social media, and UGC (User Generated Content) platforms to report bad actors and content, so they can also take appropriate action on their platforms.
Collaboration with authorities: We work closely and transparently with regulators, authorities, and safety groups in every country where we operate, and we promptly report any suspected child exploitation, abuse material, or online grooming to the relevant authorities, such as the National Crime Agency and the Child Exploitation and Online Protection Command in the UK, as well as the National Center for Missing & Exploited Children (NCMEC) in the United States.
Community Education & Resources
Digital Civility Initiative: We believe community education is incredibly important. In 2019, we created our Digital Civility Initiative to provide actionable resources for parents, caregivers, and educators to promote learning about safety and digital civility. This effort also seeks to teach kids and teens how to interact online in a positive way and to equip them with the tools to recognize questionable or bad behavior.
- As part of our commitment to digital civility, Roblox has formed a Trust & Safety Advisory Board composed of world-renowned digital safety authorities. We regularly share updates on our progress with the overall Digital Civility Initiative, including results from our related research and community listening.
We are constantly looking for ways to make the Roblox platform safer and will continually update this page with our latest policies, tools and resources.