Microsoft has published its first ever Xbox Transparency Report, detailing the actions it takes to protect players from inappropriate behaviour, misconduct and more on its platform.
For many years, the company has included its moderation efforts on Xbox within its twice-yearly Digital Safety Content Report, but it is now producing a games-specific report so it can go into more detail about what it’s doing in this space.
Dave McCarthy, general manager for Xbox Product Services, tells GamesIndustry.biz this stems from Microsoft’s desire to expand on learnings from previous reports.
“In Microsoft’s last Digital Civility Index, it was noted that nine out of ten respondents identified a need for increased education on how to make the digital world safer – and as we all know, gaming is one of the largest digital environments.
“At Microsoft, we take our responsibility to society and the gaming community very seriously. Microsoft has a long-standing commitment to online safety, and at Xbox, we strive to create a place where everyone can play within the boundaries they set, free from fear and intimidation. The Xbox Transparency Report is a commitment to our community and the industry to share more about what we do to protect and safeguard our community.”
Here are the highlights from the report for January through June 2022:
Overall
Number of enforcement actions: 7.31 million
Player account suspensions: 4.50 million (63% of all enforcements)
Incidents of content removal: 196,000 (3%)
Incidents of both content removal and account suspension: 2.43 million (34%)
There’s a marked increase in total enforcements compared to the previous three six-month periods: almost triple the 2.7 million recorded in the second half of 2021, and 82% higher than the four million recorded in the same period last year.
A significant factor in this increase is Microsoft’s investment in proactive enforcement, boosted by the company’s acquisition of Two Hat Software in October 2021.
Proactive enforcements involve Xbox’s use of technology to identify policy violations before players report them. The platform holder has been using content moderation technologies for years to help identify things like inappropriate text, images and videos shared by players on Xbox. However, advances in this space – plus the expertise of Two Hat – mean some of this process is now automated.
This has also resulted in a major uptick in the number of inauthentic accounts removed, hence the higher proportion of account suspensions (which range from three-day, seven-day and 14-day suspensions up to permanent ones).
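Microsoft doesn’t detail how its moderation technology works under the hood, but the general shape of proactive enforcement (scanning content against policy rules as it’s created, rather than waiting for a player report) can be illustrated with a minimal sketch. The patterns and the flag_message helper below are hypothetical stand-ins for the trained classifiers a real system would use:

```python
import re

# Hypothetical policy patterns, for illustration only. Production systems
# like Two Hat's use trained classifiers across text, images and video,
# not simple keyword lists.
POLICY_PATTERNS = {
    "profanity": re.compile(r"\b(darn|heck)\b", re.IGNORECASE),
    "spam/inauthentic": re.compile(r"free coins|buy cheap|visit my site", re.IGNORECASE),
}

def flag_message(text: str) -> list[str]:
    """Return the policy areas a message appears to violate.

    'Proactive' means this runs when content is created, before any
    other player files a report against it.
    """
    return [policy for policy, pattern in POLICY_PATTERNS.items()
            if pattern.search(text)]

print(flag_message("Visit my site for FREE COINS"))  # ['spam/inauthentic']
```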
When looking at enforcements by policy area, these were the most common reasons:
Cheating and/or inauthentic accounts: 4.33 million
Profanity: 1.05 million
Adult sexual content: 814,000
Harassment or bullying: 759,000
Hate speech: 211,000
The creation of inauthentic accounts – defined as “throwaway accounts commonly used for purposes such as spam, fraud or cheating” – represented 57% of all enforcements for the six-month period.
“We’ve had an inordinate number of [these] accounts hitting our services,” said McCarthy. “The report looks a little less typical than the past couple of years in that we’ve experienced a lot of that activity lately.”
The next largest was profanity, although McCarthy says this can be a tricky category to handle due to the “fine line between healthy and unhealthy competition.” He added: “There’s a lot of communication between users that we need to strike the right balance on.”
Proactive
As a result of automation and the introduction of Two Hat, January to June 2022 saw nine times more proactive moderation than in the same period last year. Xbox shared further comparisons:
Number of proactive enforcements (vs reactive)
January through June 2022: 4.78 million (compared to 2.53 million reactive enforcements)
July through December 2021: 461,000 (2.24 million)
January through June 2021: 533,000 (3.49 million)
July through December 2020: 398,000 (3.94 million)
While the biggest area of proactive enforcement was suspending inauthentic accounts, Microsoft said it also had an impact on the following areas during H1 2022:
Adult sexual content: 199,000 proactive enforcements
Fraud: 87,000
Harassment or bullying: 54,000
Profanity: 46,000
Reactive
Number of reports from players (H1 2022): 33 million
This is down 36% year-on-year, and 21% when compared to H2 2021. Xbox also says player reports have been declining since H2 2020, when there were 59.65 million – and that’s in the face of a growing player base.
“There is no single reason for why player reporting is decreasing,” McCarthy says. “Reporting is a critical component in our safety approach. We want to ensure that players understand how to report properly and feel comfortable doing so, knowing we are working to review and take appropriate action.
“We also acknowledge there could be instances where players aren’t reporting or don’t understand how reporting works, and we hope that in releasing this report, players will understand more about what we do so they can better understand this process.”
The player reports made during the first six months of 2022 can be split into three categories:
Communications (e.g. platform messages, comments on activity feed posts): 15.23 million (46% of all player reports)
Conduct (e.g. cheating, unsporting behaviour): 14.16 million (43%)
User-generated content (e.g. gamertags, club logos, uploaded screenshots): 3.68 million (11%)
On the subject of enforcement, suspended users are able to appeal against Xbox’s decisions. During the first half of 2022, Xbox dealt with over 151,000 appeals; only 6% resulted in the account being reinstated.
“We don’t always get it right or have the full context,” McCarthy admits. “[But] in the 94% of the ones we don’t reverse, it is typically because a player did something that went against our Community Standards.”
McCarthy says that Xbox believes account suspension is an effective tool in discouraging toxic behaviour among users, adding: “Previous activity and suspension does play a role in how we assign the severity and size of any present and future enforcement actions. Players who consistently violate our Community Standards are not welcome on Xbox. But we do recognize that, when players are educated on community expectations when they play, and are aware of the consequences for actions that go against our Community Standards, it is a corrective moment for the majority of players that helps determine how they show up in future community interactions.”
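McCarthy doesn’t spell out how prior activity translates into the length of a suspension, but the tiers named above (three-day, seven-day, 14-day, permanent) suggest an escalation ladder along these lines. The thresholds in this sketch are assumptions for illustration, not Xbox’s actual rules:

```python
# Hypothetical escalation ladder matching the suspension tiers named in
# the report; Xbox's real thresholds and severity weighting are not public.
SUSPENSION_DAYS = [3, 7, 14]

def next_suspension(prior_violations: int) -> str:
    """Map a count of prior violations to the next suspension length."""
    if prior_violations < len(SUSPENSION_DAYS):
        return f"{SUSPENSION_DAYS[prior_violations]}-day suspension"
    return "permanent suspension"

for prior in range(4):
    print(prior, "prior violation(s) ->", next_suspension(prior))
```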
Xbox will be releasing a new Transparency Report every six months, and will review the Xbox Community Standards around the same time. While other companies prefer annual reports, McCarthy does not believe the decision to release them more frequently increases the pressure to show progress.
“Being open and transparent can be difficult, but it’s important that players know more about what we do to keep them safe,” he says. “We see incredible value in being introspective, assessing what our processes are today, the data and insights across our technology, and listening to feedback. All of this is with the intention of learning and improving.”