Activision announced that it will use artificial intelligence to moderate the behavior of Call of Duty players. It will partner with the company Modulate to combat “toxic behavior,” including discriminatory language, hate speech and more, in the video game.
It will do so starting with the launch of Call of Duty: Modern Warfare III on November 10, through the implementation of ToxMod technology.
As Activision explains, ToxMod is an artificial intelligence system that identifies, in real time, phrases or expressions considered part of “toxic behavior.” “Hate speech, discrimination, sexism and other harmful language will not be tolerated,” the developer reiterates.
“This new initiative will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team,” Activision notes, “including text-based filtering in 14 languages for in-game text (chat and usernames), as well as a robust in-game player reporting system.”
Activision manages and operates the data generated by ToxMod, and violations of the Call of Duty Code of Conduct are subject to enforcement in the United States.
In the initial beta release this week, the voice chat moderation system will analyze chat in English. After the global launch, moderation will expand to additional languages at a later date.
According to developer data, 20% of players did not repeat their behavior after receiving a first warning, while those who did faced account penalties, such as temporary restrictions on features including voice and text chat.
“Utilizing new technology, developing critical partnerships, and evolving our methodologies is key to this ongoing commitment,” Activision stresses. “As always, we look forward to working with our community to continue to make Call of Duty fair and fun for everyone.”