Modern Warfare III to moderate voice chat

Source
Activision
Date
8/30/2023
Summary
Activision will team with Modulate to provide voice chat moderation in Modern Warfare III.


From Activision:
Modern Warfare III to moderate voice chat

Call of Duty is taking the next leap forward in its commitment to combat toxic and disruptive behavior with in-game voice chat moderation, beginning with the launch of Call of Duty: Modern Warfare III this November 10th. Activision will team with Modulate to deliver global real-time voice chat moderation at scale, starting with this fall’s upcoming Call of Duty blockbuster.

Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-powered voice chat moderation technology from Modulate, to identify toxic speech in real time and enforce against it—including hate speech, discriminatory language, harassment and more. This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which include text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.

“There’s no place for disruptive behavior or harassment in games ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming. With this collaboration, we are now bringing Modulate’s state of the art machine learning technology that can scale in real-time for a global level of enforcement,” said Michael Vance, Chief Technology Officer, Activision. “This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players.”

An initial beta rollout of the voice chat moderation technology will begin in North America on August 30 inside the existing games Call of Duty: Modern Warfare II and Call of Duty: Warzone, to be followed by a full worldwide release (excluding Asia) timed to Modern Warfare III on November 10th. Support will begin in English, with additional languages to follow at a later date.

“We’re enormously excited to team with Activision to push forward the cutting edge of trust and safety,” said Mike Pappas, CEO at Modulate. “This is a big step forward in supporting a player community the size and scale of Call of Duty, and further reinforces Activision’s ongoing commitment to lead in this effort.”

Since the launch of Modern Warfare II, Call of Duty’s existing anti-toxicity moderation has restricted voice and/or text chat to over 1 million accounts detected to have violated the Call of Duty Code of Conduct. Consistently updated text and username filtering technology has established better real-time rejection of harmful language.
