Call of Duty enlists AI to eavesdrop on voice chat and help ban toxic …

There’s a new AI-powered sheriff in town, and its name is ToxMod.
Call of Duty is joining the growing number of online games combatting toxicity by listening to in-game voice chat, and it’s using AI to assist the process. Activision announced a partnership with AI outfit Modulate to integrate its proprietary voice moderation tool—ToxMod—into Modern Warfare 2, Warzone 2, and the upcoming Modern Warfare 3.
Activision says ToxMod, which begins beta testing on North American servers today, is able to “identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more.”
Modulate describes ToxMod as “the only proactive voice chat moderation solution purpose-built for games.” While the official website lists a few games ToxMod is already being used in (mostly small VR games like Rec Room), Call of Duty’s hundreds of thousands of daily players will likely represent the largest deployment of the tool to date.
Call of Duty’s ToxMod AI will not have free rein to issue player bans. A voice chat moderation Q&A published today specifies that the AI’s only job is to observe and report, not punish.
“Call of Duty’s Voice Chat Moderation system only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model,” the answer reads. “Activision determines how it will enforce voice chat moderation violations.”
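To make that division of labor concrete, here’s a minimal sketch in Python of what an observe-and-report flow like the one Activision describes could look like. Everything in it—the ToxReport shape, the category names, the severity threshold, route_report—is a hypothetical illustration, not Modulate’s or Activision’s actual code or API.

```python
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    # Illustrative labels based on the behaviors Activision names in its Q&A
    HATE_SPEECH = "hate_speech"
    DISCRIMINATORY_LANGUAGE = "discriminatory_language"
    HARASSMENT = "harassment"


@dataclass
class ToxReport:
    """Hypothetical shape of a report the AI submits; the AI itself never issues bans."""
    match_id: str
    speaker_id: str
    category: Category
    severity: float  # 0.0-1.0, rated by an evolving model per the Q&A


def route_report(report: ToxReport) -> str:
    """Reports only: enforcement decisions are left to Activision's human moderators."""
    if report.severity >= 0.8:
        return "queue:priority_human_review"
    return "queue:standard_human_review"


if __name__ == "__main__":
    example = ToxReport("match-123", "player-456", Category.HARASSMENT, 0.9)
    print(route_report(example))  # -> queue:priority_human_review
```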
So while voice chat complaints against you will, in theory, be judged by a human before any action is taken, ToxMod looks at more than just keywords when flagging potential offenses. Modulate says its tool is unique for its ability to analyze tone and intent in speech to determine what is and isn’t toxic. If you’re curious how that’s achieved, you won’t find a crystal-clear answer, but you will find a lot of impressive-sounding claims (as we’re used to from AI companies). The company says its language model has put in the hours listening to speech from people with a variety of backgrounds and can accurately distinguish between malice and friendly riffing. Interestingly, Modulate’s ethics policy states ToxMod “does not detect or identify the ethnicity of individual speakers,” but it does “listen to conversational cues to determine how others in the conversation are reacting to the use of [certain] terms.”
Terms like the n-word: “While the n-word is typically considered a vile slur, many players who identify as black or brown have reclaimed it and use it positively within their communities… If someone says the n-word and clearly offends others in the chat, that will be rated much more severely than what appears to be reclaimed usage that is incorporated naturally into a conversation.”
Modulate also offers the example of harmful speech toward kids. “For instance, if we detect a prepubescent speaker in a chat, we might rate certain kinds of offenses more severely due to the risk to the child,” the site reads.
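As a hedged illustration of that kind of contextual weighting—and only an illustration, since the inputs, weights, and numbers below are invented rather than Modulate’s—a severity adjustment in that spirit might look like this in Python:

```python
def adjust_severity(base_severity: float,
                    listeners_reacted_negatively: bool,
                    apparent_reclaimed_usage: bool,
                    minor_in_chat: bool) -> float:
    """Hypothetical contextual weighting: raise severity when listeners are
    clearly offended or a child appears to be present, lower it when usage
    looks reclaimed and conversational. All weights are made up."""
    severity = base_severity
    if listeners_reacted_negatively:
        severity += 0.3
    elif apparent_reclaimed_usage:
        severity -= 0.4
    if minor_in_chat:
        severity += 0.2
    return round(max(0.0, min(1.0, severity)), 2)  # clamp to [0, 1]


# A slur that clearly offends others, with a prepubescent speaker in the chat
print(adjust_severity(0.6, True, False, True))   # 1.0
# The same term used conversationally, with no negative reaction from the lobby
print(adjust_severity(0.6, False, True, False))  # 0.2
```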
In recent months, ToxMod’s flagging categories have gotten even more granular. In June, Modulate introduced a “violent radicalization” category to its voice chat moderation that can flag “terms and phrases relating to white supremacist groups, radicalization, and extremism—in real-time.”
Here’s how Modulate says it built out the category and the signals it watches for:
“Using research from groups like ADL, studies like the one conducted by NYU, current thought leadership, and conversations with folks in the gaming industry,” says the company, “we’ve developed the category to identify signals that have a high correlation with extremist movements, even if the language itself isn’t violent. (For example, ‘let’s take this to Discord’ could be innocent, or it could be a recruiting tactic.)”
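To illustrate the idea that individually innocent phrases only matter in combination with other signals, here’s a toy sketch. The phrase lists and the and-only logic are assumptions made for illustration, not Modulate’s actual detection method.

```python
# Toy illustration only: a real system would use trained models and vetted term
# lists (e.g. informed by ADL-style research), not hard-coded strings like these.
RECRUITMENT_STYLE_PHRASES = {"take this to discord", "join our private server"}
EXTREMISM_LINKED_TERMS = {"<placeholder extremist slogan>"}


def radicalization_signal(utterances: list[str]) -> bool:
    """Flag only when a recruitment-style phrase co-occurs with an
    extremism-linked term; either one alone is treated as innocent."""
    text = " ".join(u.lower() for u in utterances)
    recruitment = any(p in text for p in RECRUITMENT_STYLE_PHRASES)
    extremism = any(t in text for t in EXTREMISM_LINKED_TERMS)
    return recruitment and extremism


print(radicalization_signal(["gg", "let's take this to discord"]))  # False: innocent on its own
```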
Modulate is clearly setting its goals high, though for Call of Duty’s purposes, it sounds like ToxMod will simply be the middleman between potential offenders and a human moderation team. While the inner workings of AI decision-making are inherently opaque, Activision says its enforcement will ultimately abide by Call of Duty’s official Code of Conduct. That’s not dissimilar to how Riot and Blizzard have been handling voice chat moderation in Valorant and Overwatch 2, though Riot has also been gathering voice chat data for over a year to develop its own AI language model.
ToxMod will roll out worldwide in Call of Duty at the launch of Modern Warfare 3 on November 10, starting with English-only moderation and expanding to more languages at a later date.