Radical extremists are exploiting online gaming communities on Discord, Twitch, and Steam.

Gaming-adjacent platforms such as Discord, Twitch, and Steam are becoming increasingly popular avenues for extremist groups to recruit and radicalize new followers. Their interactive, lightly moderated environments offer fertile ground for propagating far-right ideologies, hate speech, misogyny, racism, and conspiracy theories.

Extremist actors exploit the chat, voice, and livestream features of these platforms, targeting young, impressionable users in "hyper-masculine" gaming communities. Real-time voice chat and private channels on Discord let extremists build rapport with minimal oversight, while livestreams on Twitch and meme sharing in Steam communities spread extremist propaganda and normalize hateful content.

Online gaming culture, often centered on first-person shooters and competitive play, gives recruiters a ready hook: shared enthusiasm for games and "hyper-masculine" identities, with fan bases of aggressive first-person shooters seen as prime candidates. Extremists have also adapted existing games, or created game-style propaganda of their own, to attract recruits and disseminate extremist narratives.

These challenges stem from moderation, human and AI-powered alike, that is weaker on these platforms than on mainstream social media. Private chats and voice channels are hard to monitor and regulate, and some content is technically lawful yet harmful, which makes enforcement by platforms harder still.

Addressing the problem requires enhanced moderation, policy reform, cross-sector cooperation, and user awareness initiatives. Strengthening moderation, pairing improved AI tools with well-trained human moderators, is crucial for detecting extremist content and accounts earlier and more reliably, and platform policies need updating to better regulate harmful but legally ambiguous content.
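As a rough illustration of the flag-and-review pattern this implies, here is a minimal sketch in Python. Everything in it is a hypothetical stand-in: score_extremism() proxies for a trained classifier, and the thresholds and example phrases are invented for illustration, not drawn from any platform's real pipeline.

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Message:
    channel: str
    author: str
    text: str

def score_extremism(text: str) -> float:
    """Hypothetical stand-in for a trained classifier (e.g., a fine-tuned
    toxicity model). Returns a confidence score in [0, 1]."""
    flagged_phrases = ("join our movement", "recruit for the cause")  # invented examples
    return 1.0 if any(p in text.lower() for p in flagged_phrases) else 0.0

AUTO_REMOVE_THRESHOLD = 0.95   # high-confidence hits are removed automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # ambiguous hits wait for a trained moderator

review_queue: Queue[Message] = Queue()

def triage(message: Message) -> str:
    """Route a message: auto-remove, queue for human review, or allow."""
    score = score_extremism(message.text)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "removed"
    if score >= HUMAN_REVIEW_THRESHOLD:
        review_queue.put(message)  # humans make the final call on gray areas
        return "queued"
    return "allowed"

if __name__ == "__main__":
    print(triage(Message("general", "user123", "Join our movement today")))  # "removed"
```

The point of the two thresholds is the division of labor the paragraph describes: automation handles clear-cut cases at scale, while borderline, legally ambiguous content goes to human moderators.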

Collaboration between platforms, law enforcement, and organizations working to prevent violent extremism is essential for developing effective detection, reporting, and intervention mechanisms. Just as important are educating users, especially younger gamers, to recognize and report extremist behavior and propaganda, and pushing platforms toward transparency and regular assessment of how their content-moderation policies affect the spread of extremist material.

Recent research by the Psychological Defence Research Institute at Lund University identified six ways extremists use games and related technology: reframing reality, projecting authority, psychographic targeting, hacking and phishing, social propaganda, and interactive propaganda.

Discord, one of the platforms under scrutiny, says it invests heavily in safety efforts to protect users, maintaining advanced safety tools and proactive detection and moderation systems that it continues to improve. The company says it acts decisively on policy violations, removing content, banning bad actors, shutting down violative servers, and proactively reporting violations to law enforcement.

Concerns have nevertheless been raised about Discord's role in extremist recruitment, notably by Dr. William Allchorn, a senior research fellow at Anglia Ruskin University. Discord's response was not available when the article was first published; the company subsequently provided a statement reiterating its commitment to safety and to combating extremism on its platform.
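Discord's internal detection systems are not public, but server operators can layer their own proactive moderation on top via the bot API. The sketch below uses the community discord.py library; looks_extremist() is again a hypothetical placeholder for a real classifier, and the token is a dummy.

```python
import discord

intents = discord.Intents.default()
intents.message_content = True  # privileged intent needed to read message text
client = discord.Client(intents=intents)

def looks_extremist(text: str) -> bool:
    # Hypothetical placeholder: a real deployment would call a trained classifier.
    return "join our movement" in text.lower()

@client.event
async def on_message(message: discord.Message):
    if message.author.bot:  # ignore other bots, including ourselves
        return
    if looks_extremist(message.content):
        await message.delete()  # remove the violating content
        # A production bot would also log the event for moderator review
        # and escalate repeat offenders (timeouts, bans, reports).

client.run("YOUR_BOT_TOKEN")  # dummy token placeholder
```

In practice such a bot would feed a moderator audit log rather than deleting silently, mirroring the remove, ban, and report escalation path Discord describes.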

MPs have criticized the UK Online Safety Act as not up to scratch on misinformation, and some reports suggest it is unclear whether the Act covers misinformation at all.

In summary, extremists exploit the interactive, lightly moderated nature of gaming-adjacent platforms to recruit and radicalize new followers, and countering them requires enhanced moderation, policy reform, cross-sector cooperation, and user awareness initiatives.

  1. AI tools and human moderators on gaming-adjacent platforms such as Discord, Twitch, and Steam must be strengthened to detect extremist content and accounts earlier.
  2. Collaboration between platforms, law enforcement, and organizations working to prevent violent extremism is essential to develop effective detection, reporting, and intervention mechanisms.
  3. The UK Online Safety Act's coverage of misinformation needs reconsideration, particularly as it applies to gaming-adjacent platforms and their role in spreading extremist propaganda.
