Today I learned about Intel AI sliders that filter online abuse

Last month, during its virtual GDC presentation, Intel announced Bleep, a new AI-powered tool that it hopes will cut down on the amount of toxicity gamers encounter in voice chat. According to Intel, the application “uses AI to detect and edit audio based on user preferences.” The filter works on incoming audio, acting as an additional, user-controlled layer of moderation on top of whatever the platform or service already offers.

It’s a noble endeavor, but there’s something darkly funny about Bleep’s interface, which itemizes all the different categories of abuse people can encounter online, each paired with a slider to control how much of it users want to hear. Categories range from “Aggression” to “LGBTQ+ Hate”, “Misogyny”, “Racism and Xenophobia” and “White Nationalism”. There is even a toggle for the N-word. Bleep’s page notes that it has not yet entered public beta, so all of this may still change.

Filters include “Aggression”, “Misogyny” …
Credit: Intel

… and an “N-word” switch.
Credit: Intel

With most of these categories, Bleep seems to give users a choice: do you want to filter out none, some, most, or all of this offensive language? As if picking from a buffet of toxic internet slurry, Intel’s interface gives players the ability to dial a bit of aggression or name-calling into their online games.
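To make the slider model concrete, here is a minimal, purely hypothetical sketch of how per-category preferences like these might map to a filtering decision. The category names come from Intel’s demo, but the `FilterLevel` enum, the severity thresholds, and the `should_bleep` function are illustrative assumptions, not Bleep’s actual implementation.

```python
from enum import Enum

class FilterLevel(Enum):
    """Hypothetical slider positions mirroring the none/some/most/all choices."""
    NONE = 0   # hear everything in this category
    SOME = 1
    MOST = 2
    ALL = 3    # filter the category entirely

# Hypothetical per-category preferences, named after categories shown in Intel's demo.
preferences = {
    "Aggression": FilterLevel.SOME,
    "Misogyny": FilterLevel.ALL,
    "Racism and Xenophobia": FilterLevel.ALL,
    "White Nationalism": FilterLevel.ALL,
}

def should_bleep(category: str, severity: float, prefs: dict) -> bool:
    """Toy decision rule: bleep a detected utterance when its estimated severity
    (0.0-1.0, assumed to come from some upstream classifier) crosses the threshold
    implied by the user's slider setting. Purely illustrative, not Intel's logic."""
    thresholds = {
        FilterLevel.NONE: 1.1,   # never bleep
        FilterLevel.SOME: 0.8,   # only the most severe
        FilterLevel.MOST: 0.4,
        FilterLevel.ALL: 0.0,    # always bleep
    }
    level = prefs.get(category, FilterLevel.NONE)
    return severity >= thresholds[level]

print(should_bleep("Aggression", 0.9, preferences))  # True
print(should_bleep("Aggression", 0.5, preferences))  # False
```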

Bleep has been in the works for a couple of years – PCMag notes that Intel discussed this initiative back at GDC 2019 – and Intel is building the software with the AI moderation specialists at Spirit AI. But moderating online spaces with artificial intelligence is no easy feat, as platforms like Facebook and YouTube have shown. Although automated systems can recognize overtly offensive words, they often fail to account for the context and nuance of certain insults and threats. Online toxicity comes in many, constantly evolving forms that even the most advanced AI moderation systems can struggle to spot.

“While we recognize that solutions like Bleep don’t erase the problem, we believe it’s a step in the right direction, giving players a tool to control their experience,” Intel’s Roger Chandler said during the GDC demonstration. Intel says it hopes to release Bleep later this year, and adds that the technology relies on hardware-accelerated AI speech detection, suggesting that the software may rely on Intel hardware to run.
