Uncovering the Basics of Frazzledrip: A Look at the Unfounded Conspiracy Theory
In the vast digital landscape of YouTube, a platform where more than a billion hours of video are watched every day, the line between fact and fiction can blur. One such instance is the Frazzledrip conspiracy theory, a baseless claim that has circulated on the platform since 2018.
The Frazzledrip theory, an extreme and thoroughly debunked claim, centers on the alleged existence of a snuff film in which Hillary Clinton and her aide Huma Abedin supposedly rip off a child's face and drink the child's blood as part of a Satanic ritual sacrifice. The theory is linked to other far-right conspiracy theories, such as Pizzagate and QAnon, which falsely allege that prominent Democrats are involved in child trafficking and Satanic cults.
YouTube's recommendation algorithms, which determine the videos shown in search results, suggested videos, the homepage, the trending stream, and subscriptions, have been criticized for contributing to the spread of such misinformation. Designed to keep users engaged, the algorithms were reportedly prone to recommending increasingly extreme videos to hold viewers' attention.
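To make that alleged feedback loop concrete, the short Python sketch below simulates a greedy, watch-time-maximizing recommender. Everything in it is invented for illustration: the catalog, the "extremeness" scale, and the engagement model are assumptions, not YouTube's actual system. It is only a toy model of the escalation dynamic that critics describe.

```python
# Toy illustration of an engagement-driven recommendation feedback loop.
# This is NOT YouTube's actual system: the catalog, the engagement model,
# and the "extremeness" scale are invented purely to show how greedy
# watch-time maximization could drift a viewer toward more extreme content.

CATALOG = list(range(11))  # hypothetical videos, "extremeness" scored 0-10

def expected_watch_time(user_level: int, extremeness: int) -> float:
    """Assumed engagement model: viewers engage most with videos slightly
    more extreme than what they already watch (a hypothesis often raised
    in commentary on recommender systems, not a measured fact)."""
    novelty = extremeness - user_level
    if novelty <= 0:
        return 1.0              # familiar content: baseline engagement
    if novelty <= 2:
        return 1.0 + novelty    # a bit edgier than usual: peak engagement
    return 0.5                  # far too extreme: the viewer clicks away

def recommend(user_level: int) -> int:
    # Greedy policy: serve the video with the highest expected watch time.
    return max(CATALOG, key=lambda video: expected_watch_time(user_level, video))

user_level = 0  # the viewer starts with mainstream tastes
for step in range(5):
    video = recommend(user_level)
    print(f"step {step}: recommended video extremeness = {video}")
    user_level = video  # feedback: watching shifts what feels "normal"
# Recommendations drift 2 -> 4 -> 6 -> 8 -> 10: each greedy step escalates.
```

Because each recommendation shifts the viewer's baseline, the greedy policy escalates step by step. That is the dynamic critics allege, not evidence of how YouTube's system actually behaves.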
The theory gained traction on social media platforms such as YouTube and TikTok, where videos and discussions about the supposed tapes have appeared, further fueling misinformation across the internet. Notably, users of Gab.ai, 4chan, and 8chan flocked to YouTube to share their views on conspiracy theories, linking to YouTube more than to any other website.
The Frazzledrip conspiracy theory has contributed to the spread of violent and baseless allegations against public figures, exacerbating political polarization and distrust in institutions. On YouTube, videos discussing Frazzledrip, often framed as revealing hidden truths or exposing elites, have attracted substantial viewership, adding to the platform's broader challenges in moderating conspiracy content.
Attempts to verify the existence of such a video have consistently failed. Those promoting the theory rely on non-credible sources and discredited individuals, which undermines the reliability of their claims.
The theory exemplifies how conspiracy narratives evolve and intertwine, amplifying falsehoods about Satanic rituals, child abuse, and political corruption. The consequences are tangible, ranging from the harassment of individuals to real-world political effects, as when politicians such as Marjorie Taylor Greene endorsed or initially supported similar claims.
In December 2018, Google's CEO, Sundar Pichai, was questioned by the House Judiciary Committee about the spread of conspiracy theories on YouTube. The gunman behind the 2016 Pizzagate shooting, who acted on a closely related conspiracy theory, was reportedly influenced by YouTube videos about it.
Despite efforts to address these issues, YouTube remains a platform where problematic videos can linger. The company takes a case-by-case approach to content moderation, but it has been slow to identify troubling content and permissive about what it allows to remain. The advertising model, which depends on users watching as many videos as possible, further complicates matters, since sensational content tends to attract more views.
Congressman Jamie Raskin has highlighted the urgent need to regulate YouTube's algorithms so they do not promote propaganda that leads to violence. As the digital landscape continues to evolve, it is crucial that platforms like YouTube take proactive steps to combat the spread of misinformation and promote a safer, more informed online community.