PRESS RELEASE

from Modulate, Inc.

Modulate Unleashes a Game-Changer for Player Safety: ToxMod's New Player Risk Categories

SOMERVILLE, MA / ACCESSWIRE / August 21, 2023 / Modulate, creators of purpose-built, AI-driven voice technology that improves the health and safety of online gaming communities, today announced a revolutionary expansion to the scoring capabilities of its ToxMod voice chat moderation software, with the activation of its first two Player Risk Categories: violent radicalization and child grooming.


Until now, ToxMod's detection has focused on what Modulate calls "Utterance Risk Categories," which detect individual instances of toxic speech in categories including sexual harassment, hate speech, bullying, and more. In detecting Utterance Risk, ToxMod leverages voice-aware AI to analyze emotion, nuance, meaning, and more, but stays focused on a single interaction of interest. In contrast, Modulate's new violent radicalization and child grooming Player Risk Categories analyze patterns of behavior over time to identify players who are exploiting, manipulating, or grooming victims over a more prolonged period.

Not only do these two new Player Risk Categories help studios pinpoint some of the most harmful behaviors on their platforms, they also help identify the individuals whose history of behavior indicates true malicious and manipulative intent. For example, the violent radicalization Player Risk Category allows moderators to differentiate between a player using extremist language simply to get a rise out of others and a player who consistently uses extremist language in a potential attempt to recruit or radicalize others. Similarly, if a player hasn't said anything overtly sexual to an underage player yet but is consistently asking suspicious or suggestive questions, ToxMod's child grooming Player Risk Category can now detect that as well.
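ToxMod's internal scoring is proprietary, but the distinction between utterance-level and player-level risk can be illustrated with a short sketch. The Python below is a hypothetical aggregator, not Modulate's implementation: every name and threshold in it (UtteranceFlag, PlayerRiskTracker, the 30-day window, the three-occurrence minimum) is an assumption introduced for illustration. It shows how per-utterance flags, the kind produced by Utterance Risk Categories, can be accumulated over a rolling window so that a one-off remark carries no player-level risk while a consistent pattern does.

```python
from collections import defaultdict, deque
from dataclasses import dataclass


@dataclass
class UtteranceFlag:
    """Hypothetical per-utterance result, standing in for a single
    Utterance Risk detection (one flagged moment in voice chat)."""
    player_id: str
    category: str     # e.g. "extremist_language"
    score: float      # model confidence, 0.0-1.0
    timestamp: float  # seconds since epoch


class PlayerRiskTracker:
    """Toy longitudinal aggregator: player-level risk rises only when
    flags in the same category recur across a rolling time window."""

    def __init__(self, window_seconds: float = 30 * 24 * 3600,
                 min_occurrences: int = 3):
        self.window = window_seconds           # assumed 30-day window
        self.min_occurrences = min_occurrences
        self.history = defaultdict(deque)      # player_id -> recent flags

    def observe(self, flag: UtteranceFlag) -> None:
        q = self.history[flag.player_id]
        q.append(flag)
        # Evict flags that have aged out of the rolling window.
        while q and flag.timestamp - q[0].timestamp > self.window:
            q.popleft()

    def player_risk(self, player_id: str, category: str) -> float:
        flags = [f for f in self.history[player_id]
                 if f.category == category]
        if len(flags) < self.min_occurrences:
            # A single heated remark stays an utterance-level issue.
            return 0.0
        # Repetition over time, not any one utterance, drives the score.
        return sum(f.score for f in flags) / len(flags)
```

Under this toy model, a player who uses extremist language once to provoke a lobby never accrues player-level risk, while one who repeats it across sessions does; a production system would of course weigh far richer signals than flag counts.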

"At Modulate, we have always been driven by a deep commitment to player safety and inclusivity in online gaming," said Mike Pappas, CEO and co-founder of Modulate. "These Player Risk Categories are the culmination of years of work with top studios, researchers, and of course, our own talented team to ensure we can empower studios to spot those rare but insidious users who are looking to take advantage of the rest of their player base."

With the addition of the violent radicalization and child grooming Player Risk Categories, ToxMod continues to lead the way on voice safety for online platforms. Player Risk Categories join a host of other recently launched features aimed at giving moderation teams of all sizes the tools they need to quickly, accurately, and proactively prevent toxicity in games, including proximity voice chat compatibility, extremist language utterance detection, and player report correlation.

About ToxMod

ToxMod is gaming's only proactive, voice-native moderation solution. Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to respond quickly to each incident by supplying relevant and accurate context. In contrast to reactive reporting systems, which rely on players to make the effort to report bad behavior, ToxMod is the only voice moderation solution in games today that enables studios to respond proactively to toxic behavior and prevent harm from escalating.
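As a rough illustration of that triage, analyze, and respond flow, the sketch below wires three stages together. It is an assumption-laden toy, not ToxMod's pipeline: the function names, the keyword screen standing in for a triage model, and the 0.8 escalation threshold are all invented for the example.

```python
from dataclasses import dataclass, field


@dataclass
class Incident:
    clip_id: str
    transcript: str
    toxicity: float                      # 0.0-1.0 severity estimate
    context: dict = field(default_factory=dict)


def triage(transcript: str) -> bool:
    # Stage 1: cheap first pass over all voice chat; a keyword screen
    # stands in here for a lightweight model.
    watchwords = ("you're trash", "uninstall the game")
    return any(w in transcript.lower() for w in watchwords)


def analyze(clip_id: str, transcript: str) -> Incident:
    # Stage 2: deeper pass on flagged clips only; a real system would
    # weigh emotion, tone, and who is speaking to whom, not keywords.
    toxicity = 0.9 if triage(transcript) else 0.1
    return Incident(clip_id, transcript, toxicity,
                    context={"channel": "team_voice", "speakers": 2})


def escalate(incident: Incident, threshold: float = 0.8) -> None:
    # Stage 3: confirmed incidents go to human moderators with the
    # surrounding context attached, rather than waiting on a report.
    if incident.toxicity >= threshold:
        print(f"moderator queue <- {incident.clip_id} "
              f"score={incident.toxicity} context={incident.context}")


if __name__ == "__main__":
    clips = [("c1", "nice shot!"),
             ("c2", "You're trash, uninstall the game")]
    for clip_id, transcript in clips:
        if triage(transcript):
            escalate(analyze(clip_id, transcript))
```

The point of the staging is cost: the cheap screen runs on everything, the expensive analysis only on flagged clips, and human moderators only see what crosses the threshold.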

About Modulate

Modulate builds intelligent voice technology that combats online toxicity and elevates the health and safety of online communities. ToxMod, Modulate's proactive voice moderation platform, empowers community teams to make informed decisions and protect players from harassment, toxic behavior, or more insidious harms - without relying on ineffective player reports. ToxMod is the only voice-native moderation solution available today and has processed millions of hours of audio for AAA game studios, indie developers, and console platforms alike. Modulate's advanced machine learning frameworks have helped customers protect tens of millions of players against online toxicity to ensure safe and inclusive spaces for everyone.

Visit Modulate at https://www.modulate.ai to learn more, and follow Modulate on LinkedIn and Twitter.

Press Contact

Mike Tom
Modulate
mike.tom@modulate.ai

SOURCE: Modulate, Inc.


