To eradicate toxicity in the in-game voice chat of Call Of Duty: Modern Warfare 3, Activision has announced that it is using an AI moderation tool.
Working with Modulate, a start-up that developed the AI technology ToxMod, the publisher explained that the tool will “identify in real-time” offenders in the game. ToxMod flags “hate speech, discriminatory language, harassment and more” to the moderation systems that the Call Of Duty anti-toxicity team already has active.
First, the AI moderation tool will arrive in Call Of Duty: Modern Warfare 2 and Call Of Duty: Warzone 2.0 as an “initial beta rollout”. A worldwide launch of the technology (bar Asia) will then align with the release of Call Of Duty: Modern Warfare 3 on November 10 for PC, PlayStation 4, PlayStation 5, Xbox One, Xbox Series X and Xbox Series S.
According to Activision, a fifth of players do not reoffend after receiving their first warning for offensive language in the voice or text chat of Modern Warfare 2. Those identified a second time are sanctioned with “account penalties, which include but are not limited to feature restrictions (such as voice and text chat bans) and temporary account restrictions”.
The anti-toxicity team says it also accounts for false reports of vitriolic behaviour, and the publisher thanked players for their cooperation.
“This type of commitment to the game and the Community from our players is incredibly important and we are grateful to our community for their efforts in combating disruptive behaviour,” said Activision.
“We ask our Call of Duty players to continue to report any disruptive behaviour they encounter as we work to reduce and limit the impact of disruptive behaviour in Call of Duty,” it concluded.
In other Modern Warfare 3 news, Open Combat Missions offer players the opportunity to customise their strategy when completing their objectives and challenge themselves at the same time.