Nate Valenta, Community Manager for Heroes of the Storm, announced that the game will incorporate new technology to tackle the growing problem of abusive players. The exact details of the technology are unknown, likely to prevent players from finding ways to circumvent it.
Nate Valenta claims that the new technology will speed up responses to reports of offensive behavior. This is another step toward easing player frustration: just the previous week, Blizzard held a community address and a Reddit AMA as part of a plan to communicate more openly with fans, a rather unusual move for the company.
The toxicity problems that plague Heroes of the Storm seem to affect most other Blizzard games as well, though Hearthstone may be the exception, since the game offers few ways to be toxic at all.
Overwatch, on the other hand, has also been dealing with rising toxicity. Jeff Kaplan, the game director for Overwatch, has previously spoken about using A.I. (artificial intelligence) to tackle the problem. The idea is to create a program that can detect toxic behavior before it is even reported.
To build a system capable of this, Blizzard will have to use deep learning tools so the A.I. can learn exactly what is and what isn't toxic behavior and harassment. To teach the A.I. about toxicity, Blizzard plans to use the existing player reports of in-game abuse.
If the A.I. can successfully ignore wrongful player reports and accurately detect abusive language, moderators can then review the much smaller sample of flagged reports and use their judgment to reach a decision.
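The pipeline described above, training a classifier on accumulated player reports and surfacing only high-confidence flags to human reviewers, can be sketched in miniature. Note that this is purely an illustrative assumption: Blizzard has not disclosed its method, and the tiny Naive Bayes model, the toy data, and every name below are hypothetical stand-ins for whatever deep learning system they actually build.

```python
from collections import Counter
import math

def train(reports):
    """reports: list of (message, label) pairs, label 'toxic' or 'clean'.
    Stands in for the corpus of historical player reports."""
    counts = {"toxic": Counter(), "clean": Counter()}
    totals = Counter()
    for message, label in reports:
        counts[label].update(message.lower().split())
        totals[label] += 1
    return counts, totals

def classify(message, counts, totals):
    """Return the more likely label under a word-frequency
    Naive Bayes model with add-one smoothing."""
    vocab = len(counts["toxic"]) + len(counts["clean"])
    scores = {}
    for label in counts:
        score = math.log(totals[label] / sum(totals.values()))  # prior
        n = sum(counts[label].values())
        for w in message.lower().split():
            score += math.log((counts[label][w] + 1) / (n + vocab))
        scores[label] = score
    return max(scores, key=scores.get)

# Toy training set standing in for accumulated player reports
reports = [
    ("uninstall the game you idiot", "toxic"),
    ("report this idiot feeder", "toxic"),
    ("good game well played", "clean"),
    ("nice save on the objective", "clean"),
]
counts, totals = train(reports)
print(classify("what an idiot", counts, totals))  # prints "toxic"
```

In a real deployment the classifier's confident "toxic" flags would be queued for moderators, while messages it scores as clean (including wrongful reports) would be filtered out, which is exactly the reduction in review workload the article describes.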
It is important to note that most of this is speculation, since there is no detailed information about either the A.I. or the methodology Blizzard plans to use to handle a problem that almost every competitive game faces.