As social media platforms finally crack down on conspiracy theories, right-wing propaganda has found a new home. Conspiracy theorists are flooding TikTok, a popular social media application known for its short videos, with misinformation and alt-right propaganda.
While this may seem unremarkable, as propaganda is not new to social media, the issue is that these videos are reaching a wide range of audiences because of TikTok’s recommendation algorithm. The algorithm encourages users to follow accounts that are near them or similar to their interests. By pushing multiple conspiracy theory accounts, TikTok’s algorithm is spreading extremist misinformation at a rapid rate, according to Media Matters. As a result, far-right conspiracy theories are building massive communities both on and offline.
The videos, known on the platform as “ConspiracyTok,” come from a community that regularly discusses conspiracy theories. According to Media Matters, while some accounts are dedicated to theories about why the earth is flat, others are more harmful and spread misinformation about cultures and identities, including COVID-19. That in itself is damaging, particularly to Asian Americans, who have been discriminated against and targeted because of these theories. But as TikTok’s algorithm promotes these videos, more harm is done as misinformation reaches broader and often more vulnerable audiences.
A majority of TikTok users belong to Gen Z, a group especially subject to influence from social media; as algorithms target young users, misinformation can have dire consequences. Many alt-right TikTok users appear to be taking advantage of this. It only takes one video for a person’s entire feed to be filled with “ConspiracyTok.” TikTok’s account recommendation algorithm suggests users to one another based not only on proximity but on potential shared interests.
This means that if you stumble on even one video by accident, you will most likely see more in the future. Additionally, if a user follows someone, they are more likely to get recommendations for similar accounts. This has caused the massive spread of not only anti-vax misinformation but also QAnon-related theories, COVID-19 denial, and anti-Semitic conspiracy theories.
Of course, TikTok has banned many alt-right users for spreading misinformation and continues to do so, but the problem is that this misinformation is not always easy to catch.
According to Media Matters, many conspiracy theorists pose as harmless users by posting a variety of content. This prevents them from being flagged, as not all of their content is controversial. For example, “Conscious Content” is an account with over 11,300 followers whose bio reads: “Learn and inspire!” With videos about TV shows and other random information, one would never assume that this account belonged to a conspiracy theorist, but a deeper dive shows that the user promotes conspiracies such as the claim that Jeffrey Epstein was an Israeli spy.
This is not an isolated example; other users follow similar patterns of camouflage, Media Matters reported. As TikTok and other social media platforms crack down on alt-right individuals, those individuals are finding ever more inventive ways to push their agenda.
Under any number of guises, alt-right community members are spreading misinformation and pushing conspiracy theories not only about the government but about health and other issues. An easy way to find these users is by noting the hashtags they commonly use on videos, but again the issue comes down to these users cleverly diversifying their feeds to include different types of content and different tags as well. Doing so not only protects them from being flagged as conspiracy accounts but also allows them to reach a broader audience.
If an account describes itself as a lifestyle account, a person who follows similar accounts may follow it without realizing it is meant to spread conspiracy theories. In this way, when the account eventually posts the theories on its agenda, followers will see them without ever having intended to.
This pattern has been documented throughout TikTok’s history, with COVID-19 misinformation a particularly well-documented problem. While early videos focused on COVID-19 and its spread, they now focus on misinformation about the vaccine. “This shot will rearrange your DNA. They’ve planned this for one hundred years, it is the mark of the beast,” said one user, ember_inside_me1, whose account has over 27,500 followers.
According to research published in 2014 by the University of Chicago, about “half of the American public consistently endorses at least one conspiracy theory.” So most people are likely to come across conspiracy theories one way or another; the issue is how harmful they can be to the development of some individuals, especially those who do not recognize them. Studies have found that many youth and young adults get their news from social media, and the spread of misinformation there raises the risk that some will not be able to tell the difference between fact and conspiracy. Because many conspiracists gain the trust of social media users by posting a wide variety of content, their conspiracy theories are more likely to be accepted when shared.
By not addressing the issue more thoroughly, TikTok is failing its users. According to Media Matters, while many extremist users on the app, including those mentioned in this report, have been banned, banning them does not prevent their content from being circulated. TikTok needs to do better, especially to protect its young users. While it is impossible to remove all conspiracy content from social media users’ view, more needs to be done to make users aware of what content they are seeing. TikTok’s algorithm does more harm than good and needs to be redeveloped with the risks it poses in mind.