Activision Blizzard to roll out new Call of Duty voice chat moderation

Activision Blizzard has announced that it is rolling out new in-game voice chat moderation, which will be fully implemented in Call of Duty: Modern Warfare 3.

The games firm has partnered with machine learning and AI software developer Modulate in the effort.

The system is built on Modulate’s ToxMod, which detects harmful speech. While detection happens in real time, responses to code of conduct violations may take some time to enact.
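The split between instant detection and slower enforcement is a common moderation pattern. The sketch below is a hypothetical Python illustration of that pattern only; the class and function names are invented for the example and do not reflect Modulate’s ToxMod API or Activision’s actual implementation.

```python
# Hypothetical sketch: speech is scored as it arrives, but flagged clips
# are queued for later review rather than penalized instantly.
# None of this reflects Modulate's or Activision's real APIs.

import queue
from dataclasses import dataclass


@dataclass
class VoiceClip:
    player_id: str
    transcript: str   # assumes speech-to-text has already run
    timestamp: float  # seconds into the match


def score_toxicity(clip: VoiceClip) -> float:
    """Placeholder classifier; a real system would use an ML model."""
    flagged_terms = {"slur", "threat"}
    hits = sum(word in flagged_terms for word in clip.transcript.lower().split())
    return min(1.0, hits / 3)


def realtime_detection(stream, review_queue: queue.Queue, threshold: float = 0.7):
    """Runs as clips arrive: detection is immediate, enforcement is deferred."""
    for clip in stream:
        if score_toxicity(clip) >= threshold:
            review_queue.put(clip)  # queued for later code-of-conduct review


if __name__ == "__main__":
    pending_review: queue.Queue = queue.Queue()
    clips = [
        VoiceClip("player_1", "nice shot", 10.5),
        VoiceClip("player_2", "slur slur threat", 11.0),
    ]
    realtime_detection(iter(clips), pending_review)

    # Separate, slower pass that decides on any penalties.
    while not pending_review.empty():
        flagged = pending_review.get()
        print(f"Queued for review: {flagged.player_id} at t={flagged.timestamp}")
```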

The new voice chat moderation will launch in beta in North America today for Call of Duty: Modern Warfare 2 and its Warzone counterpart. A global rollout, excluding Asia, is set for November 10, coinciding with the release of Activision Blizzard’s newest FPS.

“There’s no place for disruptive behavior or harassment in games ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming,” said Activision chief technology officer Michael Vance.

“With this collaboration, we are now bringing Modulate’s state-of-the-art machine learning technology that can scale in real-time for a global level of enforcement.”

The announcement comes as the Call of Duty maker steps up its efforts against harmful behavior online.

In September last year, it banned 500,000 accounts to curb toxicity in the Call of Duty player base.

Then, in December 2022, the firm announced that it was collaborating with researchers to develop AI to combat in-game abuse.
