Intel Is Using AI to Fight Toxic Voice Chat in Games

SAN FRANCISCO - The rise of online gaming and livestreaming has made gamers and creators more connected than ever. But it's also opened the floodgates for toxic behavior.

And while there are plenty of manual and automatic tools for moderating foul play in chat rooms and comments sections, curbing toxicity becomes much more difficult when it comes to actual voice chat.

That's where Intel and Spirit AI come in.

The computing giant is teaming up with Spirit, which already provides automated text chat moderation via its Ally software, to bring that same degree of filtering to voice chat so that developers can ensure a safe, inclusive environment for their online games.

Intel's voice integration with Spirit's technology is still in its early stages, but I saw a few demos that seem promising. Intel started by having Spirit's AI transcribe a produced NPR radio segment, just to show that the technology can transcribe a clean voice recording nearly word for word.

Of course, actual in-game chat is rarely crystal clear, and it's often far from polite. That's why Intel then showed Spirit's AI processing a voice clip from a heated League of Legends match, where it automatically flagged terms such as "mentally retarded" that could be construed as hate speech.
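Neither company has published an API for the integration, but the basic flow shown in the demo, transcribing the audio and then scanning the transcript for flagged phrases, can be sketched roughly as follows. Everything here (the phrase list, function names and the assumption of a separate speech-to-text step) is illustrative rather than Spirit AI's actual software:

```python
import re

# Illustrative term list -- a real system would rely on Spirit AI's own
# models and policies, not a hand-maintained list of patterns.
FLAGGED_PHRASES = [
    r"\bmentally retarded\b",   # the phrase flagged in Intel's demo
]

def flag_transcript(transcript: str) -> list[str]:
    """Return any flagged phrases found in a chat transcript.

    The transcript is assumed to come from a separate speech-to-text
    step, like the one Spirit's AI performed on the NPR segment.
    """
    return [
        pattern
        for pattern in FLAGGED_PHRASES
        if re.search(pattern, transcript, flags=re.IGNORECASE)
    ]

# Example with an already-transcribed line of voice chat:
print(flag_transcript("you're all mentally retarded, uninstall"))  # prints the matching pattern(s)
```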

While Intel's Spirit voice integration still seems a ways out from wide availability, the company noted that it will ultimately be up to developers to decide how to implement it.

Kim Pallister, Intel's CTO for VR, gaming and esports, noted that one game maker might choose to issue a warning or ask the harassed player whether they want the offender banned, while another could set up a system that auto-bans the moment foul language is detected.

"Developers will have to learn over time what works best for their game community," said Pallister.

The company did note that there are some obvious challenges to overcome when trying to automatically detect and act on real-world speech, including telling the difference between an offensive word and a game-specific term. But even if Intel and Spirit AI's detection technology takes a while to reach developers, it could eventually go a long way toward making online gaming a better place.

"Somebody has to be rolling up the sleeves and working on these kinds of problems if we’re going to make gaming welcoming to everyone," said Pallister.

Be sure to check out our GDC 2019 hub page for all of the latest gaming news and hands-on impressions straight out of San Francisco.

Michael Andronico

Mike Andronico is Senior Writer at CNNUnderscored. He was formerly Managing Editor at Tom's Guide, where he wrote extensively on gaming, as well as running the show on the news front. When not at work, you can usually catch him playing Street Fighter, devouring Twitch streams and trying to convince people that Hawkeye is the best Avenger.