However, for the most part this relies on players reporting toxic players, which isn't the most efficient method, and it is why Blizzard could be looking to automate the process. In an interview with Kotaku, Overwatch director Jeff Kaplan revealed that the company has been experimenting with algorithms and machine learning to help spot toxic behavior in players.
According to Kaplan, “We’ve been experimenting with machine learning. We’ve been trying to teach our games what toxic language is, which is kinda fun. The thinking there is you don’t have to wait for a report to determine that something’s toxic. Our goal is to get it so you don’t have to wait for a report to happen.”
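For a rough sense of what "teaching a game what toxic language is" might involve at its simplest, here is a minimal sketch of a text classifier that flags chat messages as toxic, built with scikit-learn on invented placeholder data. It illustrates the general technique only and is not Blizzard's actual system or data.

```python
# Illustrative sketch only: a minimal text classifier for flagging toxic chat.
# The example messages, labels, and threshold are invented for the demo and
# are not Blizzard's actual model or training set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled chat messages: 1 = toxic, 0 = not toxic (placeholder data).
messages = [
    "gg well played everyone",
    "nice shot, that was awesome",
    "thanks for the heals",
    "group up on the point",
    "you are all garbage, uninstall",
    "worst team ever, go die",
    "useless idiot, stop playing",
    "you are trash, leave the match",
]
labels = [0, 0, 0, 0, 1, 1, 1, 1]

# TF-IDF bag-of-words features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Score new chat lines; anything above the threshold could be queued for
# review automatically instead of waiting for a player report.
new_messages = ["you're trash, quit the game", "good luck, have fun"]
for text, prob in zip(new_messages, model.predict_proba(new_messages)[:, 1]):
    print(f"{text!r}: toxic probability {prob:.2f}, flagged={prob > 0.5}")
```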
Kaplan also hopes that this could eventually get to the point where the system can recognize toxic behavior without needing any verbal or written cues. That said, Kaplan has previously stated that the time Blizzard spends addressing toxic behavior in Overwatch is slowing down updates to the game, so hopefully machine learning will allow toxic behavior to be addressed automatically and more efficiently.