
Foreign X accounts continue to intimidate British Muslims, having previously instigated riots in 2024.

The U.S.-based X account End Wokeness has been named by members of parliament as a significant actor in spreading misinformation that helped incite violence ahead of the 2024 riots, prompting further criticism.

Foreign X accounts, linked to the 2024 riots, continue to pose a threat to Muslim communities in the United Kingdom.


In the digital age, the fight against hate speech and disinformation has become a global concern, particularly in relation to Islamophobia. Despite international calls for action and government enforcement efforts, the issue remains a challenge, especially on platforms such as Telegram.

Recently, Dame Chi Onwurah, the chair of the Science, Innovation and Technology Committee (SITC), expressed concern over a post that called for violence and made a direct threat to the Prime Minister. The post has not been removed and remains active, sparking calls for intervention from the UK's regulatory body, Ofcom.

The issue of disinformation extends beyond individual posts. For instance, the US-based account End Wokeness was identified by MPs as a key player in the spread of disinformation ahead of last summer's violence. End Wokeness has reportedly initiated a hate campaign against Rotherham's first female Muslim mayor and has been linked to the spread of fake news about the identity of the Southport attacker in July 2024.

Telegram, criticised for lax moderation, has allowed far-right extremist groups, such as "Deport Them Now," to organise and spread Islamophobic content. This has led to real-world violence in places like Spain.

Law enforcement agencies are responding to specific incidents, but the response varies across countries. In Australia, despite serious Islamophobic attacks, there have been no dedicated taskforces or new laws addressing Islamophobia at the same level as other forms of hate. Similarly, in the United States, far-right figures continue to propagate anti-Muslim conspiracy theories online.

International bodies, such as the United Nations, are urging online platforms to actively curb hate speech and harassment as part of a broader global call to counter rising anti-Muslim bigotry. Governments are urged to enforce laws and policies that mitigate discrimination and hate crimes.

The UK's Online Safety Act, introduced in October 2023, gives tech companies a duty to take down illegal content and stop it from appearing. Ofcom has the power to fine platforms up to £18m or 10% of their qualifying global revenue. However, the Act does not include measures to counter the algorithmic amplification of legal but harmful content, such as misinformation.

The committee found the legislation to be "woefully inadequate" in addressing the issue of misinformation and social division. Any move to fine platforms or beef up Ofcom's powers could set Starmer's government on a collision course with Donald Trump's administration.

Radio Genoa, an account claiming to be run from Italy, has built up 1.4 million followers on a platform by publishing a steady stream of anti-migrant and anti-Muslim content. The account, like many others, has faced no public measures specific to its actions.

The ongoing struggle against Islamophobic disinformation and hate on social media platforms requires clearer policies, better platform moderation standards, and dedicated legal frameworks. The need for effective action is underscored by the potential for events like the 2024 riots to be repeated if social division and the spread of misinformation are not addressed effectively.

Former equalities minister Anneliese Dodds stated that there are "major regulatory gaps" in how Britain tackles online harm. The fight against hate speech and disinformation is a complex and ongoing challenge, but it is one that must be addressed to protect religious freedoms and human rights.

  1. The fight against hate speech, disinformation, and misinformation on social media platforms, including posts inciting violence, has become a pressing concern that extends well beyond Islamophobia.
  2. The UK's Online Safety Act addresses illegal content but fails to counter the algorithmic amplification of legal but harmful content such as misinformation, underlining the need for clearer policies, stronger platform moderation standards, and dedicated legal frameworks to protect religious freedoms and human rights.
