The increasing number of QAnon-linked candidates running for political office may further normalize conspiracies and will complicate tech company efforts to curb harmful online behavior. QAnon, the right-wing conspiracy theory that has linked a cabal of wealthy and well-connected elites to the exploitation of children, continues to spin a sinister web of deceit that has rendered its red-pilled believers numb to reality. The most dedicated QAnon adherents have conducted attacks against perceived enemies, and the threat posed by the movement has attracted the attention of the Federal Bureau of Investigation, which has referred to QAnon’s most ardent followers as a potential domestic terrorism threat. Many national security analysts are concerned that this threat could continue to grow in the lead-up to the November 2020 election, resulting in a spike in violence. Since Q’s first post in October 2017, what was once an isolated fringe movement confined to the far-reaching crevices of the internet has proliferated, yielding new followers, acts of terrorism, and the normalization of its message in certain circles as politicians begin sprinkling conspiracy theories into their campaign rhetoric. At the same time, QAnon has begun to spread overseas. The combination of treating QAnon’s followers as harmless tinfoil-hat-wearing internet trolls and the ubiquity of the internet has allowed the QAnon movement to metastasize.
QAnon’s conspiracy theories have spread across mainstream social media platforms. Private and public Facebook groups and influential figures often tout QAnon’s range of interwoven, but seemingly dissonant, conspiracies. During the COVID-19 pandemic, QAnon’s messaging likely attracted many new followers. When the documentary ‘Plandemic’ went viral on YouTube in May and was shared by thousands of Facebook and other social media users, QAnon sympathizers were instrumental in spreading the documentary’s conspiracies. QAnon zealots have also spread their messages effectively over Twitter and TikTok through a combination of hashtags and rhetoric that serves as attractive clickbait – especially for younger generations who rely on social media for news instead of traditional sources like newspapers. For example, QAnon’s forerunner, the 2016 PizzaGate theory, began trending again on TikTok, in part due to amplification by QAnon’s followers. QAnon’s messages, and the second wave of PizzaGate conspiracies, have also spread because some tech companies rely too heavily on automated recommendation systems that in some cases facilitate radicalization. With few exceptions, Silicon Valley has been too slow to curtail the spread of QAnon’s false prophecies. Reddit was at the forefront of curbing Q’s expansion when the platform started removing QAnon-related content in 2018. Other companies were much slower to respond. In July 2020, Twitter removed 7,000 accounts associated with QAnon and hid another 150,000 accounts from trends and searches. TikTok also recently acted by blocking QAnon-related hashtags, and Facebook announced it would follow Twitter’s lead and take additional action against QAnon users. Finally, YouTube has started providing a context box underneath QAnon videos that links to a Wikipedia page explaining that QAnon is a far-right conspiracy theory.
While recent tech company action against QAnon is worthwhile, fact checks, content removal, and account deletions have come too late to stem the normalization of QAnon’s unfounded ‘theories.’ In July, Vice News documented the scope of the movement’s spread, citing a QAnon festival in Finland and legions of followers from an eclectic collection of countries ranging from the United Kingdom, Canada, France, and Germany to more far-flung and less likely places like Iran, Japan, and Russia. QAnon’s resonance overseas in 2020 has grown in large part due to COVID-19, which has created a fertile landscape in which captive audiences, stuck at home, have only the Internet as a lifeline to the outside world. Cults, which QAnon has arguably developed into, often prey upon the gullible, the powerless, or those in a quixotic search for answers. QAnon’s theories, and the spinoffs touted by Q’s decoders contending that globalists are to blame for nefarious actions, whether abusing children or intentionally creating COVID-19, provide a simple explanation for the complex and often unexplainable.
As QAnon’s contours assume global dimensions, one indicator of the normalization of the movement’s ideas in the United States is the possible ascension of QAnon supporters to elected political office. According to Alex Kaplan of Media Matters, as of July 30, there were 69 current or former U.S. Congressional candidates who support (or give credence to) QAnon. Whether any QAnon-connected individual can win in November is unclear, but the growing political influence of the movement and its supporters may complicate efforts to stall its trajectory, especially for social media companies that have been, in many cases for good reason, reticent to curb elected politicians’ online posts. Finally, QAnon supporters have been retweeted, or have had their memes circulated, by President Trump and his sons Eric and Donald more than 100 times. In doing so, the Trump family, whether intentionally or inadvertently, has facilitated the normalization of QAnon’s conspiracies.