Our platform and the broader gaming community are constantly evolving. As the gaming industry continues to grow, safety systems require even greater depth, speed, and agility to protect players from toxicity. Today, we are releasing our third Xbox Transparency Report, which provides data and insights into the work we are doing to create a safer and more inclusive environment where our players can enjoy great gaming experiences.
At Xbox, we continue to adapt our technologies and approaches to keep pace with industry changes, including advancing our exploration and use of artificial intelligence (AI). AI is playing an increasingly important role in accelerating content moderation around the world. Our team is actively innovating with responsible applications of AI to pursue safer player experiences, approaches that can be adopted across the gaming industry. We aim to combine the importance of human oversight with the evolving capabilities of AI, building on the foundation of our safety work to date.
Currently, we use a range of existing AI models to detect toxic content. These include Community Sift, an AI-powered, human insight-driven content moderation platform that classifies and filters billions of human interactions per year and powers many of our safety systems on Xbox, and Turing Bletchley v3, a multilingual model that scans user-generated imagery to ensure only appropriate content is shown. We’re actively developing ways for AI and Community Sift to further enhance our systems: better understanding the context of interactions, achieving greater scale for our players, elevating and augmenting the capabilities of our human moderators, and reducing exposure to sensitive content.
Many of our systems are designed to help players feel safe, included, and respected. Our proactive technology and approach allow us to block harmful content before it is shared on our platform and reaches players, while proactive enforcements curb unwanted content or conduct. As noted in the Transparency Report, 87% (17.09M) of total enforcements this period came through our proactive moderation efforts.
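To make the idea of proactive moderation concrete, here is a minimal sketch of what a pre-publication content gate can look like. It is purely illustrative, not Xbox's actual implementation: the classifier, the thresholds, and names such as `classify_toxicity` and `moderate` are hypothetical stand-ins for the AI models and human-review flows described above.

```python
# Illustrative sketch of a proactive moderation gate (not Xbox's actual code).
# A message is scored *before* it is published: high-confidence toxic content
# is blocked outright, borderline content goes to human moderators, and
# everything else is published immediately.

from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    PUBLISH = "publish"
    BLOCK = "block"
    HUMAN_REVIEW = "human_review"


@dataclass
class ModerationResult:
    decision: Decision
    toxicity_score: float  # 0.0 (benign) to 1.0 (toxic)


def classify_toxicity(message: str) -> float:
    """Hypothetical stand-in for an AI classifier; a real system would
    call a trained model here rather than match toy keywords."""
    flagged_terms = {"slur", "threat"}  # toy list for the sketch only
    hits = sum(term in message.lower() for term in flagged_terms)
    return min(1.0, 0.5 * hits)


def moderate(message: str,
             block_threshold: float = 0.9,
             review_threshold: float = 0.5) -> ModerationResult:
    """Gate a message before it reaches other players."""
    score = classify_toxicity(message)
    if score >= block_threshold:
        return ModerationResult(Decision.BLOCK, score)         # never shown to players
    if score >= review_threshold:
        return ModerationResult(Decision.HUMAN_REVIEW, score)  # moderator queue
    return ModerationResult(Decision.PUBLISH, score)


print(moderate("good game, everyone!").decision)  # Decision.PUBLISH
```

The property that matters for the report's metrics is that blocking and review happen before delivery, which is what lets prevented content be counted separately from reactive enforcements.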
Among the key takeaways in the report:
New insights into blocked content volumes – Preventing toxicity before it reaches our players is a crucial component of our proactive moderation efforts toward providing a welcoming and inclusive experience for all. Our team combines responsible AI with human supervision to review harmful content and prevent it from being published on our platform. To better measure our success, we’re now including a new dataset covering our work in this space, called ‘Toxicity Prevented’. In this last period, over 4.7M pieces of content were blocked before reaching players, including 135k more pieces of imagery (+39% from the last period) thanks to our investment in the new Turing Bletchley v3 foundation model.
Increased emphasis on addressing harassment – We are committed to creating a safe and inclusive environment for all players, and we actively work to identify and address abusive behavior, including hate speech, bullying, and harassment. With that goal in mind, we improved our internal processes to increase proactive enforcement this past period, issuing 84k proactive enforcements for harassment/bullying (+95% from the last period). We also launched our new voice reporting feature to capture and report in-game voice harassment. Our safety team continues to take proactive steps to make sure all players understand that abusive behavior of any kind is unacceptable on our platform and that we take this behavior seriously.
Understanding player behavior after enforcement – We are always looking for ways to help players better understand our Community Standards. To that end, we’ve been analyzing how players behave after receiving an enforcement. Early insights indicate that the majority of players do not violate the Community Standards again after receiving an enforcement and go on to engage positively with the community. To further support players in understanding what is and is not acceptable behavior, we recently launched our Enforcement Strike System, which is designed to help players understand enforcement severity, the cumulative effect of multiple enforcements, and the total impact on their record.
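As a rough illustration of how a strike-based system can express severity and cumulative effect, here is a hypothetical sketch. The per-enforcement strike weights, the 180-day expiry window, and the 8-strike suspension threshold below are assumptions chosen for the example, not the published parameters of the Xbox Enforcement Strike System.

```python
# Hypothetical sketch of a cumulative strike ledger (illustrative only;
# the weights, window, and threshold are NOT the real Xbox parameters).

from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class Enforcement:
    issued: datetime
    strikes: int  # assumption: more severe violations carry more strikes


@dataclass
class PlayerRecord:
    history: list[Enforcement] = field(default_factory=list)

    def add_enforcement(self, when: datetime, strikes: int) -> None:
        self.history.append(Enforcement(when, strikes))

    def active_strikes(self, now: datetime, expiry_days: int = 180) -> int:
        """Sum strikes that have not aged out (180-day window is an assumption)."""
        cutoff = now - timedelta(days=expiry_days)
        return sum(e.strikes for e in self.history if e.issued >= cutoff)

    def is_suspended(self, now: datetime, threshold: int = 8) -> bool:
        """Cumulative effect: enough active strikes triggers a suspension
        (the 8-strike threshold here is illustrative)."""
        return self.active_strikes(now) >= threshold


record = PlayerRecord()
record.add_enforcement(datetime(2023, 1, 10), strikes=2)  # e.g. harassment
record.add_enforcement(datetime(2023, 6, 1), strikes=1)   # e.g. profanity
print(record.active_strikes(datetime(2023, 7, 1)))        # 3: both still active
print(record.is_suspended(datetime(2023, 7, 1)))          # False: below threshold
```

Keeping the record as an explicit ledger is what lets a player see how each enforcement contributes to their standing and when its effect ages out.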
Beyond the Transparency Report, our team continues to work closely with partners around the world to drive innovation in safety across the gaming industry:
Minecraft and GamerSafer partner to promote servers committed to safety. Minecraft developer Mojang Studios has partnered with GamerSafer, along with members of the Minecraft community, to curate the Official Minecraft Server List, so players can easily find third-party servers committed to safe and secure practices. Mojang Studios and GamerSafer work together regularly to update the policies and safety standards required for servers to be listed on the site. Featured servers comply with the Minecraft Usage Guidelines and meet certain requirements, including stating the purpose of the server, its intended audience, and the foundational community management practices that set each server's tone, values, and principles. In addition, servers can earn different badges to show commitment to safety and security best practices. Server community managers can sign up to have their server listed, and players can report issues or contact a server directly to ask questions. The site not only empowers server community managers to craft fun games and experiences with safety in mind, but also offers an easy resource for players or parents to find and explore positive server experiences.
Launch of Xbox Gaming Safety Toolkits for Japan and Singapore. These local resources empower parents and caregivers to better understand online safety in gaming and manage their children’s experience on Xbox. The toolkits cover common safety risks, age-specific guidance for kids, and the features available on Xbox to help make the player experience safer for everyone in the family. Previous releases include the Xbox Gaming Safety Toolkit for Australia and New Zealand.
Together, we continue to build a community where everyone can have fun, free from fear and intimidation, within the boundaries they set. This means putting more tools and systems in place to empower players to interact respectfully with each other. Recently introduced safety measures include our voice reporting feature, which gives players the option to capture and report inappropriate voice activity in any multiplayer game with in-game voice chat, and the Enforcement Strike System, which gives players more information about how their behavior affects their overall experience on the platform.
Every player has a role in creating a positive and inviting environment for all, and we look forward to continuing to bring everyone along on our safety journey.
Some additional resources:
Share feedback via the Xbox Insiders program or on the Xbox Support website
Read our Xbox Community Standards
Learn about the Xbox Family Settings app and download the app when you’re ready
Keep up to speed on Privacy and Online Safety
Remain informed on How to Report a Player and How to Submit a Case Review
Minecraft Education
Need help? Request a Call, Chat Online, and More