Digital Landscapes as Perilous Terrain for Activist Content
In the ever-evolving landscape of social media, TikTok has become a prominent platform for creators worldwide. However, recent findings suggest that its automated moderation and content recommendation system may inadvertently or systematically limit the reach of feminist, LGBTQ+, and anti-racist content.
One of the most contentious issues surrounding TikTok is shadowbanning: a discreet sidelining that reduces a video's reach but is difficult to quantify. Platforms often deny the practice, yet it remains a concern for many users, who feel their content is being suppressed without any clear explanation.
Researcher Refka Payssan proposes that this invisibility of sensitive content on TikTok could stem from several factors, including biased moderation AI, compliance with authoritarian regimes, and economic imperatives. Payssan's hypotheses are backed by studies indicating that 73% of pro-LGBTQ+ videos were rendered invisible on TikTok in 2023, and that more than seven million videos were deleted between October and November 2024.
TikTok has faced broader criticism regarding censorship related to political and sensitive topics, with concerns about systematic suppression of minority creators and their content. While TikTok denies such practices, the effect on marginalized and activist content has raised ethical questions.
Activist accounts on TikTok adapt to these constraints by working with the algorithm rather than against it: they modify how they present content to avoid triggering moderation filters, rely on peer-to-peer sharing and community engagement for spontaneous, decentralized distribution, cross-post to other social media sites, use coded language or symbols to slip past automated moderation, collaborate with like-minded creators to amplify reach organically, and raise awareness about censorship itself.
Creators on TikTok use a variety of formats, such as music, dance, trends, tutorials, and cooking videos, to get their messages across. However, accounts can be banned for impersonating others or for promoting violence, sexual abuse, human trafficking, or real acts of torture. TikTok's community principles prioritize defending human rights and include fostering freedom of expression, respecting local context, supporting inclusion, and upholding equality.
Under these principles, political and activist content is not banned on TikTok, but there are limits. Representations of non-consensual sexual acts, imagery of sexual abuse, or physical abuse such as domestic violence are forbidden, though implicit testimony from survivors may be allowed.
In essence, TikTok's algorithmic moderation, while intended to maintain community standards, can limit the reach of feminist, LGBTQ+, and anti-racist content by downranking or hiding it, whether inadvertently or by design. Activists respond by creatively adapting their communication strategies and leveraging community networks to keep amplifying marginalized voices on the platform. This practice, which Puerto Rican political anthropologist Yarimar Bonilla has termed "algorithmic activism," testifies to the resilience and inventiveness of marginalized communities in the face of digital censorship.
Social media entertainment on TikTok thus operates under the same moderation pressures. By combining peer-to-peer sharing, community engagement, and algorithmic activism, creators continue to keep marginalized voices and critical social issues visible on the platform.