Call of Duty is trying a new approach to tackling toxicity with the help of AI moderation for voice chat. In a new post on the official Call of Duty blog, Activision details its collaboration with a company called Modulate on this new initiative.

It’s the next step in Activision’s ongoing battle to curb toxic chat from certain players. Using a new tool from Modulate called ToxMod, Activision plans to moderate voice chat in real-time. The tool will help Activision’s Call of Duty staff listen for things like hate speech, discriminatory language, and harassment so proper action can be taken against offenders, which of course includes the possibility of a ban from playing Call of Duty or other online Activision games.

ToxMod’s AI moderation will be in Call of Duty later this year

Activision says it started rolling out the beta version of this new software tool in its existing games on August 30. This includes Call of Duty: Modern Warfare II and Call of Duty: Warzone. Later this year, though, Activision says it’ll be fully rolled out for Call of Duty: Modern Warfare III. That game launches on November 10, so Activision will have a few months to iron out any kinks.

In the beginning, Activision says, support will only be rolling out in English, but it plans to expand to other languages not long after. That being said, the publisher doesn’t mention when, or which languages it’ll focus on next. According to Modulate’s website, the ToxMod tool analyzes a few different factors to aid in its moderation, considering things like voice tone, perceived intention, and context. It then “escalates” the most toxic conversations so moderators can take action.
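Modulate doesn’t publish ToxMod’s internals, but the flow described above amounts to a score-and-escalate pipeline: combine several signals per conversation, then surface only the worst ones to human moderators. Here’s a rough, hypothetical sketch of that idea; all the names, scores, and thresholds below are invented for illustration and are not Modulate’s actual model or API.

```python
from dataclasses import dataclass

# Hypothetical illustration of a score-and-escalate moderation flow.
# None of these field names or thresholds come from Modulate; ToxMod's
# real model and API are not public.

@dataclass
class VoiceClip:
    speaker_id: str
    tone_score: float     # aggression detected in the audio, 0-1 (assumed)
    intent_score: float   # likelihood the speech is targeted, 0-1 (assumed)
    context_score: float  # banter vs. abuse given the lobby context (assumed)

ESCALATION_THRESHOLD = 0.8  # invented cutoff for human review

def toxicity(clip: VoiceClip) -> float:
    # Combine the factors the article mentions: tone, intent, and context.
    # A real system would use a learned model, not a fixed average.
    return (clip.tone_score + clip.intent_score + clip.context_score) / 3

def triage(clips: list[VoiceClip]) -> list[VoiceClip]:
    # "Escalate" only the worst conversations, so moderators review a
    # small flagged queue rather than listening to all voice chat.
    flagged = [c for c in clips if toxicity(c) >= ESCALATION_THRESHOLD]
    return sorted(flagged, key=toxicity, reverse=True)
```

The design point this sketch captures is that the AI doesn’t ban anyone itself; it just ranks and forwards candidates so that human staff take the final action.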


Toxic text and voice chat have been an ongoing issue in Call of Duty games for many years, so this should hopefully help tone things down a bit. With the software now rolling out in beta, it’ll be interesting to see how things shift.
