How we protect our Roblox community
Roblox’s vision is to connect billions of people through shared experiences in the Metaverse. Safety and civility are foundational to this vision and to creating a platform where everyone feels welcome. That’s why we’ve always made it a key priority to ensure our community members can connect, create, and come together in a space that is welcoming, safe, inclusive, and respectful. We’ve spent over a decade building stringent safety systems and policies that we are proud of and that we continue to evolve as our community grows.
Our Policies
Our Community Standards set clear expectations for how to behave on Roblox. They require that everyone treat each other with civility and respect, and they help keep everyone safe by prohibiting content and behavior that may be inappropriate or harmful. Our Trust & Safety team uses these Standards to moderate content and respond to abuse reports from users. Users found to be violating these Standards may be suspended or removed from the platform. In some cases, Roblox also works proactively with authorities to report violent threats, child endangerment, or other serious real-world harm. We update our Community Standards periodically as the needs of our community and the product change.
Our Teams & Tools
Trust & Safety team: We have a large, expertly trained team with thousands of members dedicated to protecting our users and monitoring 24/7 for inappropriate content. Our Trust & Safety team takes swift action (typically within minutes) to address any content or behavior that violates our Terms of Use or Community Standards.
Proactive safety review of uploaded assets: We conduct a safety review of every uploaded image, audio, and video file, using a combination of machine detection and review by a large team of human moderators, before these assets become available on our platform for our community to use in their creations.
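To make that flow concrete, here is a minimal sketch of how a combined machine-plus-human review pipeline could be structured. Everything in it (the `machine_risk_score` stub, the thresholds, the decision labels) is an illustrative assumption, not a description of Roblox’s actual system:

```python
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    APPROVED = "approved"          # confidently safe: publish
    REJECTED = "rejected"          # confident violation: block before publication
    HUMAN_REVIEW = "human_review"  # ambiguous: route to a moderator queue

# Hypothetical thresholds; a real pipeline would tune these per asset type.
REJECT_AT = 0.95
APPROVE_AT = 0.05

@dataclass
class UploadedAsset:
    asset_id: str
    kind: str       # "image", "audio", or "video"
    payload: bytes

def machine_risk_score(asset: UploadedAsset) -> float:
    """Stand-in for ML models that estimate the probability (0..1) that an
    asset violates policy; returns a fixed value here for illustration."""
    return 0.5

def triage(asset: UploadedAsset) -> Decision:
    """Auto-decide the confident cases; send everything else to humans."""
    score = machine_risk_score(asset)
    if score >= REJECT_AT:
        return Decision.REJECTED
    if score <= APPROVE_AT:
        return Decision.APPROVED
    return Decision.HUMAN_REVIEW

print(triage(UploadedAsset("a1", "image", b"...")))  # Decision.HUMAN_REVIEW
```

The key design point this sketch captures is that machine detection handles the clear-cut volume while human moderators decide the borderline cases before anything goes live.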
Parental Controls: We offer parental controls that let you choose how your child engages and interacts with others across Roblox. You can make informed decisions about what is right for your family and manage your child’s account features such as screen time, content maturity, spend limits, and privacy settings.
Content maturity: We’ve introduced content maturity labels to help users and their parents understand what types of content to expect in Roblox experiences. These labels are grounded in child development research and informed by industry standards. Parents can use the content maturity setting in parental controls to restrict their child’s access to content based on what makes sense for their family.
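As a rough illustration, a maturity-based access check might look like the sketch below. The level names, the settings fields, and the `can_join` helper are hypothetical; they are not Roblox’s actual tiers or API:

```python
from dataclasses import dataclass

# Hypothetical ordered maturity labels, least to most mature.
MATURITY_LEVELS = ["minimal", "mild", "moderate", "restricted"]

@dataclass
class ParentalControls:
    max_maturity: str          # highest label the child may access
    daily_screen_minutes: int  # screen-time limit
    monthly_spend_limit: int   # spend cap, in local currency units

def can_join(experience_maturity: str, controls: ParentalControls) -> bool:
    """Allow access only if the experience's maturity label does not
    exceed the cap the parent has set."""
    return (MATURITY_LEVELS.index(experience_maturity)
            <= MATURITY_LEVELS.index(controls.max_maturity))

controls = ParentalControls(max_maturity="mild",
                            daily_screen_minutes=60,
                            monthly_spend_limit=10)
print(can_join("minimal", controls))   # True
print(can_join("moderate", controls))  # False
```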
Reporting: We actively encourage our users to report any activity or content they feel uncomfortable or concerned about. Users can easily mute or block players that they come across in games and report inappropriate content/behavior using our Report Abuse system. When a user submits a report about any rule violations, our safety team assesses complaints deemed actionable to ensure appropriate action is taken as quickly as possible. That action can include temporary suspension, account or content removal, or proactive reporting to authorities.
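For illustration only, the escalation path described above could be sketched like this; the severity scale, the `real_world_harm` flag, and the mapping to outcomes are assumptions, not Roblox’s actual triage rules:

```python
from enum import Enum

class Action(Enum):
    NO_ACTION = "no_action"
    WARNING = "warning"
    TEMP_SUSPENSION = "temporary_suspension"
    REMOVAL = "removal"                          # account or content removal
    REPORT_AUTHORITIES = "report_to_authorities"

def resolve_report(severity: int, real_world_harm: bool) -> Action:
    """Map an assessed report to an outcome. Severity is a made-up 0-3
    scale; real triage weighs context, history, and policy nuance."""
    if real_world_harm:
        return Action.REPORT_AUTHORITIES  # escalate beyond the platform
    if severity >= 3:
        return Action.REMOVAL
    if severity == 2:
        return Action.TEMP_SUSPENSION
    if severity == 1:
        return Action.WARNING
    return Action.NO_ACTION

print(resolve_report(severity=2, real_world_harm=False))  # Action.TEMP_SUSPENSION
```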
Chat filtering: We filter all text chat on the platform to block inappropriate content such as discriminatory speech, bullying, extremism, violence, and sexual content, as well as personal information and instructions on how to move off the platform. We update our filters continuously, multiple times a day. For users under 13, our filters are even stricter and also block potentially identifiable personal information, slang, and similar content. To help us maintain a safe platform and prevent inappropriate content that violates our policies, users are also unable to exchange images or videos via chat or one-to-one interactions on Roblox.
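A toy version of an age-aware text filter is sketched below. The patterns, placeholder terms, and `filter_chat` function are purely illustrative assumptions; production filters rely on ML classifiers and much richer, frequently updated rule sets:

```python
import re

# Placeholder patterns standing in for blocklists of abusive language,
# personal information, and off-platform lures. Purely illustrative.
BASE_PATTERNS = [
    r"\bbadword\d*\b",                     # stand-in for abusive terms
    r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b",  # phone-number-like strings
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",        # email addresses
]
UNDER_13_PATTERNS = [
    r"\bmy school is\b",                   # potentially identifying details
    r"\badd me on\b",                      # common off-platform phrasing
]

def filter_chat(message: str, user_age: int) -> str:
    """Replace matches with '####'; under-13 users get stricter rules."""
    patterns = list(BASE_PATTERNS)
    if user_age < 13:
        patterns += UNDER_13_PATTERNS
    for pattern in patterns:
        message = re.sub(pattern, "####", message, flags=re.IGNORECASE)
    return message

print(filter_chat("call me at 555-123-4567", user_age=15))  # number masked
print(filter_chat("add me on other apps!", user_age=12))    # phrase masked too
```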
Privacy: We do not share any personal data with third parties for users under 13 beyond what is permitted by COPPA, and overall we limit data collection to what is necessary to run a safe and efficient platform, as set out in our strict privacy policy. We are COPPA certified by kidSAFE (we are a member of their kidSAFE Seal Program), and we regularly review our privacy policy to ensure we process your information in ways that comply with it.
Our Partners
Collaboration with safety organizations and other platforms: We have partnerships with over 20 leading global organizations focused on child safety and internet safety, including the WePROTECT Global Alliance, the Internet Watch Foundation (IWF), the UK Safer Internet Centre, the Fair Play Alliance, the Family Online Safety Institute (FOSI), the Digital Wellness Lab, ConnectSafely, and kidSAFE.
- We are also a member of industry organizations such as UKIE and the Technology Coalition, with the goal of cross-industry collaboration on user safety and child safety. As a member of the Technology Coalition, we are committed to the Voluntary Principles, including transparency about our efforts to combat online child sexual exploitation and abuse.
- Additionally, we work with individual companies where we see opportunities to share learnings and development efforts. For example, we worked with Microsoft on a cross-industry project to develop an AI-based technique that scans text-based chat to better detect grooming.
- And we work closely with other chat, social media, and UGC (user-generated content) platforms to report bad actors and content, so they can also take appropriate action on their own platforms.
In consultation with expert organizations like the Anti-Defamation League, Tech Against Terrorism (TAT), and the Simon Wiesenthal Center, as well as academics and safety partners from across the globe, we constantly evaluate our moderation policies. We frequently audit our platform so that we keep strengthening our processes and algorithms to prevent, detect, and block new content or behavior that violates our Terms of Use. Our aim is to protect our community against emergent discriminatory or harmful content, terms, memes, and symbols.
Collaboration with authorities: We work closely and transparently with regulators, authorities, and safety groups in every country where we operate, and we promptly report any suspected child exploitation, abuse material, or online grooming to the relevant authorities, such as the UK’s National Crime Agency and its Child Exploitation and Online Protection (CEOP) Command, and the National Center for Missing & Exploited Children (NCMEC) in the United States.
Community Education & Resources
Digital Civility Initiative: We believe community education is incredibly important. In 2019 we created our Digital Civility Initiative to provide actionable resources for parents, caregivers, and educators to promote learning about safety and digital civility. This effort also seeks to teach kids and teens how to interact online in a positive way and equip them with the tools to recognize questionable or bad behavior.
- As part of our commitment to digital civility, Roblox has formed a Trust & Safety Advisory Board made up of world-renowned digital safety authorities. We regularly share updates on our progress with the overall Digital Civility Initiative, including results from our related research and community listening.
We are constantly looking for ways to make the Roblox platform safer and will continually update this page with our latest policies, tools, and resources.