Call of Duty, the shooter video game series, has begun using AI to monitor for hate speech during online matches.
The moderation tool, which uses machine learning, can recognise harassment and offensive language in real time, according to publisher Activision.
Michael Vance, Activision's chief technology officer, said it will help make the game “a fun, fair, and welcoming experience for all players.”
Toxic voice chat has long been a problem in online video games, particularly for women and underrepresented groups.
Activision said its existing tools, which include letting gamers report other players and automatically monitoring text chat and offensive usernames, have already led to one million users having their communication privileges restricted.
Call of Duty's code of conduct prohibits bullying and harassment, as well as disparaging remarks based on race, sexual orientation, gender identity, age, culture, religion, and national origin.
A full rollout will begin when the next installment, Modern Warfare III, launches on November 10.