Our Commitment to Enhance User Trust and Safety
It’s been an incredible journey for VRChat to grow into the vibrant, diverse, and unique universe it is today. From spontaneous new friendships and deep philosophical discussions, to artistic exhibitions and immersive roleplaying, you – our community! – are the heart and soul of everything we do. Your creativity and passion continue to shape VRChat every single day, and the stories you tell of your experiences keep us going.
Our communities are central to VRChat. As such, the safety and well-being of people within VRChat remain our absolute highest priority.
For VRChat to flourish, it needs to be a place where everyone feels secure, respected, and empowered to express themselves freely. Our focus is on fostering an environment where positive interactions can thrive and where support is readily available when needed.
Trust and Safety is a critical part of this effort. It encompasses our Terms of Service, Community & Creator Guidelines, report and moderation systems, self-moderation tools, our dedicated Trust and Safety team, and more. To keep you, the communities you’re a part of, and VRChat itself safe, we must constantly learn and evolve how our Trust and Safety ecosystem operates.
Today, we want to share how we’re evolving our Trust and Safety programs to reinforce our commitment to keeping our community safe.
Growing Our Trust & Safety Team
To further strengthen our commitment, we’re evolving the structure of our Trust & Safety team. This includes bringing on more support to improve our moderation coverage and expanding our team with several Trust & Safety professionals in high-impact roles.
Our goal is to raise the bar on our capabilities and stay ahead in defining what safety means in immersive social spaces. We’ve recently filled a few key positions and will continue to seek individuals with deep expertise and fresh perspectives as future needs arise.
Updating Our Community Guidelines
We are updating our Community Guidelines, effective September 24, 2025. While the core rules are not changing, we’re introducing a new internal policy framework to enhance our approach to safety and integrity on VRChat.
This framework is categorized into seven key verticals:
Adult Sexual Activities and Nudity
Hate, Violence, and Endangerment
Minor Safety
Suicide and Self-Harm
Platform Integrity
Regulated Goods
Copyright
These categories help us clearly identify and understand the behaviors and content that pose safety risks. That, in turn, enables us to better anticipate and address those risks through product and moderation improvements. By structuring our policies this way, we can be more data-driven in prioritizing safety efforts.
As a result, we’ve updated our Community Guidelines to align with this framework. You’ll notice clarified language, cleaned-up sections, and a structure that better reflects how we think about safety. Going forward, we’ll continue evolving these guidelines to help prevent harm while preserving the creative freedom and expression that defines VRChat.
Enhanced Reporting Tools & User Controls
Clear and accessible reporting tools are critical to keeping VRChat safe and welcoming for everyone. Unfortunately, research shows many users don’t report harmful behavior, often because the process isn’t intuitive or doesn’t clearly explain what’s expected (S.B. et al., 2024).
We’re working to change that. VRChat is redesigning the reporting experience to make it simpler to navigate, more transparent about what’s against the rules, and more empowering for users to take action and protect themselves.
Reporting categories will better reflect our updated Community Guidelines.
Plain-language guidance will make it easier for all users to understand when and how to report.
Links will be integrated throughout the reporting flow to provide more relevant information, such as links to crisis resources or instructions for safety features.
Users will also be able to block reported individuals or hide inappropriate content during the reporting process.
Proactive System Enhancement
In addition to refining our reactive moderation efforts, we're investing in proactive systems that help us detect harmful content, behavior, and activity before they escalate. We are designing and deploying these systems with careful attention to the balance between respecting user privacy and maintaining safety on our platform.
As VRChat continues to grow and evolve, so will our trust and safety processes, expectations, priorities, and areas of focus. In every community, there will always be new challenges. Our commitment to you is that we will always support the positive evolution of VRChat, protect its spirit, and ensure it remains a special, meaningful place for real human connection, creativity, and self-expression.
Thanks for being part of this incredible community. We’re excited to keep building a safer, more welcoming VRChat along with you.