Call of Duty is getting a whole new moderation system

With toxicity in gaming lobbies becoming an ever-bigger problem, first-person shooters like Call of Duty are now taking action with artificial intelligence (AI). Ahead of Modern Warfare 3's confirmed release date of November 10, 2023, Activision is trying to crack down on toxic voice chat.

Activision has partnered with Modulate, the company behind the “in-game voice chat moderation” AI ToxMod, to bring a new moderation system that identifies behavior such as hate speech, discrimination, and harassment in real time and flags offending players for enforcement. The devs say the system will focus on “detecting harm within voice chat versus specific keywords” and that “Activision determines how it will enforce voice chat moderation violations.”

Gamers who do not want the AI monitoring them will have the option to disable in-game voice chat.

ToxMod went live in Modern Warfare 2 and Warzone in North America on August 30 and will roll out globally on November 10 alongside the release of Modern Warfare 3. ToxMod is currently active only for English, with other languages to follow at a later date, according to the report.

On November 10, Modern Warfare 3 will be available for PlayStation 4, PlayStation 5, Xbox One, Xbox Series X|S, and PC.
