Xbox Releases Second Transparency Report

Video game development can feel like a disconnected industry at times. Its leading companies can seem to sit up in their ivory towers pumping out content, while gamers offer feedback… that isn’t always of a positive persuasion.

Bridging that gap is what Xbox hopes to highlight in its second Transparency Report, which was released this week.

The report reinforces the Washington-based company’s investment in and prioritization of player safety, and it highlights the multifaceted approach behind that work, a job that involves countless people across different departments.

The data, collected from July to December 2022, show how Xbox works to curb toxicity and stop inappropriate content before players are even exposed to it. In the preemptive stage, the AI- and human-driven Community Sift platform helps identify offensive content within milliseconds across text, images, and video. Community Sift assessed over 20 billion human interactions on Xbox in the last year alone.
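Xbox hasn’t published Community Sift’s internals, but the proactive pattern described above (score content before delivery, block the worst automatically, route borderline cases to a human moderator) can be sketched in a few lines of Python. Everything below is hypothetical: toxicity_score is a keyword stand-in for a real ML classifier, and the thresholds are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Verdict(Enum):
    ALLOW = auto()          # content reaches other players
    BLOCK = auto()          # filtered before anyone ever sees it
    HUMAN_REVIEW = auto()   # ambiguous: queued for a moderator

@dataclass
class Message:
    author: str
    text: str

def toxicity_score(message: Message) -> float:
    """Stand-in for a real ML classifier (hypothetical).

    Community Sift's actual models are proprietary; this keyword
    heuristic exists only to make the pipeline runnable.
    """
    flagged = {"cheat", "scam", "phishing"}
    words = message.text.lower().split()
    hits = sum(w.strip(".,!?") in flagged for w in words)
    return min(1.0, hits / max(len(words), 1) * 5)

def moderate(message: Message,
             block_at: float = 0.8,
             review_at: float = 0.4) -> Verdict:
    """Proactive gate: runs before the message is delivered."""
    score = toxicity_score(message)
    if score >= block_at:
        return Verdict.BLOCK
    if score >= review_at:
        return Verdict.HUMAN_REVIEW
    return Verdict.ALLOW

if __name__ == "__main__":
    for text in ("good game, well played!", "buy cheap coins, total scam"):
        msg = Message(author="player1", text=text)
        print(f"{text!r} -> {moderate(msg).name}")
```

In a real system the classifier would be a trained model running across text, images, and video, and the review queue would feed Xbox’s human moderation teams; the point of the sketch is only the ordering, with filtering happening before any other player sees the content.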

During this report’s collection period, 80% of Xbox’s total enforcements were the result of proactive moderation efforts. Even more impressively, proactive measures detected and enforced against 100% of account tampering, piracy, phishing, and cheating/inauthentic accounts.

The report also reaffirms the company’s Community Standards, which have been around since the early days of the Gaming For Everyone initiative. And while gaming is undoubtedly for more people than it was in 2015 — when that strategy began — there’s still further to go. So Xbox is using this report to extend its definition of vulgar content to include offensive gestures, sexualized content, and crude humour. This type of content is generally viewed as distasteful and inappropriate, detracting from the core gaming experience for many of Xbox’s players. This policy change — in conjunction with improvements to its image classifiers — has resulted in a 450% increase in enforcements against vulgar content, 90.2% of which were proactively moderated. These enforcements often result simply in the removal of the inappropriate content, as reflected in the 390% increase in “content-only” enforcements over this period.

Lastly, Xbox’s proactive moderation of inauthentic accounts is up 16.5x over the same period in last year’s report. Inauthentic accounts are typically automated or bot-created accounts that create an unlevel playing field and can detract from positive player experiences. The Xbox Safety team issued more than 7.51M proactive enforcements against fake accounts, representing 74% of total enforcements in the reporting period, up from 57% in the last reporting period.

Xbox continues to invest in and improve its technology so players can have safe, positive, and inviting experiences. That also depends on an informed player community that knows how to help, so here are some great resources to use!