Policies & Procedures Governing Content Moderation
Safety on Roblox is at the core of everything we do. This starts with our robust set of Community Standards, which outline how we expect our users to behave and make clear up front what is and isn't allowed on Roblox. We use these Standards to moderate content and respond to abuse reports from users. If users are found to be violating these Standards, we may take a range of actions, as described further below.
General Processes & Systems
Monitoring and Review Processes
We have stringent safety and monitoring systems integrated into Roblox to promote civility and ensure the safety of our users. These systems are designed to enforce our policies, protect users' personal information, and abide by local laws. We leverage both human moderators and automated systems to proactively identify behavior that may violate our policies.
Content submitted by creators automatically goes through a multi-step review process before appearing on the platform. Content includes images, meshes, audio files, and video files that creators upload to Roblox to include in their experiences. Images are evaluated for Child Sexual Abuse Material (CSAM) and other inappropriate content, while audio files are scanned for intellectual property (IP) infringement. If an image is identified as CSAM, it is automatically reported to the National Center for Missing and Exploited Children (NCMEC). Finally, content that has not been flagged for removal remains subject to human review, for example if it is reported to us as violating our policies. Roblox also has systems in place to prevent the re-upload of previously removed content, including technology that detects content that is the same as, or similar to, content we have already taken down.
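To illustrate the shape of such a pipeline, the sketch below shows a simplified multi-step review flow in Python. It is illustrative only: the asset types and ordering follow the description above, but every helper (`fingerprint`, `image_violates_policy`, `audio_infringes_ip`) is a hypothetical stand-in, and Roblox has not published its actual implementation.

```python
import hashlib
from dataclasses import dataclass
from enum import Enum, auto

class Verdict(Enum):
    APPROVED = auto()
    REMOVED = auto()
    HUMAN_REVIEW = auto()

# Hypothetical registry of fingerprints for previously removed
# content, used to block re-uploads.
REMOVED_FINGERPRINTS = set()

@dataclass
class Upload:
    asset_type: str          # "image", "mesh", "audio", or "video"
    data: bytes
    reported: bool = False   # set when users later report the content

def fingerprint(data: bytes) -> str:
    # Stand-in for a similarity hash; a real system would also match
    # near-duplicates, not just byte-identical files.
    return hashlib.sha256(data).hexdigest()

def image_violates_policy(data: bytes) -> bool:
    return False  # hypothetical image classifier stub

def audio_infringes_ip(data: bytes) -> bool:
    return False  # hypothetical IP-matching stub

def review(upload: Upload) -> Verdict:
    # Step 1: block re-uploads of content already taken down.
    if fingerprint(upload.data) in REMOVED_FINGERPRINTS:
        return Verdict.REMOVED
    # Step 2: type-specific automated checks. A real pipeline would
    # also file an NCMEC report for any CSAM detection.
    if upload.asset_type == "image" and image_violates_policy(upload.data):
        REMOVED_FINGERPRINTS.add(fingerprint(upload.data))
        return Verdict.REMOVED
    if upload.asset_type == "audio" and audio_infringes_ip(upload.data):
        return Verdict.REMOVED
    # Step 3: unflagged content can still reach human review,
    # e.g. when it is reported by users after publication.
    return Verdict.HUMAN_REVIEW if upload.reported else Verdict.APPROVED
```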
When experiences are published or updated on Roblox, they are evaluated by a suite of tools that identify problematic language, attempts to bypass our filters, and content that falls outside our policies. A human review team operates continuously to evaluate flagged experiences. Our moderation processes also include a suite of anti-intruder technology that leverages machine learning, throttles, and circuit breakers to block automated bot attacks and mitigate the impact of humans who attempt to spam users and disrupt the service. We also leverage automated penetration testing, a bug bounty program, code threat assessments, and vulnerability management tools to protect our users and the platform. Our Safety team strives to take swift action, responding within minutes to content or behavior that violates our Terms of Use or Community Standards.
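Roblox does not publish the internals of these protections, but throttles and circuit breakers are well-known patterns. The sketch below shows generic versions of both in Python: a token bucket that caps per-client request rates, and a breaker that sheds traffic from a source after repeated rejections. All names and thresholds are illustrative assumptions.

```python
import time

class TokenBucket:
    """Throttle: allow up to `rate` requests/second, bursting to `capacity`."""
    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

class CircuitBreaker:
    """Trip open after `threshold` consecutive rejections and drop all
    traffic from the source for `cooldown` seconds."""
    def __init__(self, threshold: int, cooldown: float):
        self.threshold, self.cooldown = threshold, cooldown
        self.failures, self.opened_at = 0, 0.0

    def allow(self) -> bool:
        if self.failures >= self.threshold:
            if time.monotonic() - self.opened_at < self.cooldown:
                return False      # circuit open: shed load outright
            self.failures = 0     # cooldown elapsed: close and retry
        return True

    def record(self, rejected: bool) -> None:
        if rejected:
            self.failures += 1
            if self.failures == self.threshold:
                self.opened_at = time.monotonic()
        else:
            self.failures = 0

# Example: a bot hammering an endpoint is first throttled by the
# bucket, then cut off entirely once the breaker trips.
bucket = TokenBucket(rate=5, capacity=10)
breaker = CircuitBreaker(threshold=20, cooldown=60)

def handle_request() -> bool:
    if not breaker.allow():
        return False
    ok = bucket.allow()
    breaker.record(rejected=not ok)
    return ok
```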
Reporting Processes
User Reports
On Roblox, users can easily report inappropriate content or behavior using our Report Abuse system, which is located prominently throughout Roblox, including within experiences. We encourage our users to report any activity that concerns them directly to Roblox. Users also have a direct channel to our Customer Support team to report concerns or other issues.
EU users may also report content that they believe violates the law of an EU member state using our EU illegal content reporting form. Content reported via this form is first assessed against our Community Standards: if the content is incompatible with our platform-level policies, it is actioned accordingly. Any reported items that do not violate our Community Standards are then escalated to legal experts for assessment against the relevant local laws.
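The ordering described above (platform policy first, local law second) can be summarized in a few lines. The sketch below is a hypothetical Python illustration of that triage flow, not Roblox's actual tooling; `violates_community_standards` stands in for the full moderation review.

```python
from enum import Enum, auto

class Outcome(Enum):
    ACTIONED_UNDER_COMMUNITY_STANDARDS = auto()
    ESCALATED_FOR_LEGAL_REVIEW = auto()

def triage_eu_report(content, violates_community_standards) -> Outcome:
    # Stage 1: assess the report against platform-level policy first.
    if violates_community_standards(content):
        return Outcome.ACTIONED_UNDER_COMMUNITY_STANDARDS
    # Stage 2: otherwise, escalate to legal experts for assessment
    # against the relevant EU member-state law.
    return Outcome.ESCALATED_FOR_LEGAL_REVIEW
```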
Trusted Flaggers
Roblox has a trusted flagger program through which close and trusted partners can report terrorist content to us. We have committed to reviewing their reports and, if necessary, actioning the content within a short timeframe. Additionally, EU Trusted Flaggers accredited in accordance with the Digital Services Act may submit notices of illegal content via our EU illegal content reporting form. Such reports are prioritized by our team of moderators.
Consequences
Whenever we make a decision to restrict access to content on the platform, we notify users of our decision and provide them with an opportunity to appeal. Depending on the severity and impact of a content-based Community Standards violation, we take actions ranging from warnings and content removal to Roblox account-level restrictions. In circumstances that present an imminent risk of harm, users may be reported to the relevant authorities.
How quickly we reach a decision and how we take action on content violations vary: we have processes in place that are strictly manual, fully automated, or a combination of both. In addition to considering the specific violation, we also take into account a user's historical behavior on the platform and whether they have repeatedly violated our policies. Repeated violations may increase the severity of the consequences (e.g., a warning, followed by a timeout, followed by a suspension).
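As a rough illustration of such an escalation ladder, the sketch below maps a user's violation history and the severity of the current violation onto a consequence. The ladder steps and weighting are assumptions made for illustration; they are not Roblox's actual enforcement rules.

```python
# Hypothetical ladder of increasingly severe consequences.
LADDER = ["warning", "timeout", "suspension", "account_ban"]

def consequence(prior_violations: int, severity: int = 0) -> str:
    """Escalate with repeat violations; a severe violation (higher
    `severity`) can skip straight to a harsher step."""
    step = min(prior_violations + severity, len(LADDER) - 1)
    return LADDER[step]

# A first offense earns a warning; a third earns a suspension.
assert consequence(0) == "warning"
assert consequence(2) == "suspension"
```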
Appealing a Content Moderation Decision
Users can contact Roblox to request a review of a moderation decision. Our team will give the decision a second look and make any necessary adjustments to the account's moderation status. We take all appeal requests seriously. When reviewing an appeal, we holistically consider the severity of the violation, the user's reason for appealing, and the user's behavior on the platform. Please note that in some circumstances an appeal may not be applicable, for example for a time-lapsed consequence such as a 20-minute timeout that has already been lifted. In addition to our appeals process, all users may seek to resolve disputes in accordance with our Terms of Use.
In accordance with the Digital Services Act, those who report illegal content on our platform via our EU illegal content reporting form also have the right to appeal Roblox's decision on their report. EU users can appeal a moderation decision for up to six months after our initial decision. For disputes related to content moderation, EU users can also select a certified out-of-court dispute settlement body; the EU Commission will maintain a list of accredited out-of-court dispute settlement bodies for this purpose.
Terminating Use of Roblox
Users can discontinue use of the service at any time by requesting an account deletion in-app or via our Support Form, using the Right to Be Forgotten request feature. For more information on how to initiate this process, please refer to our relevant Support article.