Now instead of "git gud" in Call of Duty, players have to write "please improve your gaming skills".
Activision Blizzard announced that since the beginning of the year, incidents of toxic behavior in the voice chat of Call of Duty have decreased by 43%. This has been achieved thanks to the implementation of the ToxMod moderation system, which is based on artificial intelligence and was introduced alongside the release of Call of Duty: Modern Warfare III in November of last year.
ToxMod analyzes text transcriptions of voice chat and flags keywords while weighing context. The system distinguishes typical banter from genuine insults by tracking how players react to comments and gauging their emotions. ToxMod also estimates the age and gender of the speakers for better contextual understanding.
The system cannot block users on its own; instead, it quickly flags suspected violations for review by human moderators. Activision then decides whether to warn the player or mute them, and bans are applied only after repeated offenses. The number of repeat offenders in Modern Warfare III and Call of Duty: Warzone decreased by 67% after the system was enhanced in June 2024.
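The workflow described above — automated scoring that only queues clips for human review, with bans reserved for repeat offenders — can be sketched roughly as follows. This is a hypothetical illustration, not ToxMod's actual implementation: the keyword weights, threshold, offense limit, and all function names are assumptions for the example.

```python
# Hypothetical sketch of a flag-for-review moderation flow: automated scoring
# never bans directly, it only queues suspected violations for human
# moderators, and bans require repeated confirmed offenses.
# All names, weights, and thresholds are illustrative, not ToxMod's real API.

from dataclasses import dataclass

FLAG_THRESHOLD = 0.7      # assumed score above which a clip goes to review
REPEAT_OFFENSE_LIMIT = 3  # assumed confirmed offenses before a ban

# Toy keyword weights standing in for a real toxicity classifier,
# which would also weigh tone, reactions, and conversational context.
KEYWORD_WEIGHTS = {"trash": 0.4, "idiot": 0.8, "uninstall": 0.5}

@dataclass
class PlayerRecord:
    confirmed_offenses: int = 0
    banned: bool = False

def score_transcript(text: str) -> float:
    """Crude keyword-based toxicity score clamped to [0, 1]."""
    words = text.lower().split()
    return min(1.0, sum(KEYWORD_WEIGHTS.get(w, 0.0) for w in words))

def process_clip(player: PlayerRecord, text: str, review_queue: list) -> None:
    """Flag high-scoring clips for human review instead of acting on them."""
    if score_transcript(text) >= FLAG_THRESHOLD:
        review_queue.append((player, text))

def human_confirms_violation(player: PlayerRecord) -> str:
    """A moderator confirmed the offense: warn or mute first, ban repeaters."""
    player.confirmed_offenses += 1
    if player.confirmed_offenses >= REPEAT_OFFENSE_LIMIT:
        player.banned = True
        return "ban"
    return "warn_or_mute"
```

The key design point mirrored from the article is the separation of duties: the classifier only produces candidates, while the penalty decision (warn, mute, or eventually ban) stays with human moderators.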
Currently, ToxMod operates in all regions except Asia. The system moderates voice chat in English, Spanish, and Portuguese, and with the release of Black Ops 6 the developers will add support for French and German.
In August, language support for text moderation and username screening was expanded from 14 to 20 languages. Text moderation is handled by Call of Duty's partner Community Sift, which has blocked 45 million messages since the launch of Modern Warfare III.
The use of artificial intelligence for moderating player behavior raises fewer controversies than its application for creating game assets. At the end of last year, skins for Modern Warfare III were generated using AI. Amid mass layoffs in the industry, some fear that publishers will seek to replace artists with AI models.
Source: Techspot