Videos where the name Derek Chauvin mingles with Bill Gates and Joe Biden are the focus of a new report revealing the murky depths of TikTok’s extremist user base.
TikTok is one of the world’s leading social media platforms, used by its 1.1 billion active users to create engaging, technically impressive short videos.
“Based on a sample of 1,030 videos, this research examined how TikTok is used to promote white supremacist conspiracy theories, produce weapons manufacturing advice, glorify extremists, terrorists, fascists and dictators, direct targeted harassment against minorities and produce content that denies that violent events like genocides ever happened,” said the Institute for Strategic Dialogue (ISD).
Thirty percent of the content promoted white supremacy, and 24% glorified violent extremists. The report noted that footage of the 2019 Christchurch attack, in which 28-year-old Brenton Tarrant murdered 51 people at two mosques, is “easily discoverable” on the app. One recurring trope in this stream is footage of the massacre recreated in video games.
Doing this research is not easy. The true number of posts promoting hate speech, and the number of users interacting with them, remains hidden. TikTok offers researchers no API through which to pull raw data in bulk: “In practice, research on TikTok must therefore be conducted manually in order to comply with the platform’s terms of use.”
In other words, the ISD had no programmatic way into the platform, and its methodology was limited to spending hours upon hours watching TikTok.
Moreover, the report notes that 81.5% of the videos it analysed were still live at the time of publication.
The reason? Users employ simple but effective methods to evade moderation:
- Remaking accounts with almost identical usernames.
- Using alternative hashtag spellings known only to insiders (a simple fuzzy-matching sketch after this list shows how such variants might be flagged).
- Using a profile’s grid layout to spell a message out between posts.
- Smuggling hate speech beneath innocuous mainstream templates: the report cites a dance to a viral song that features a Hitler salute and a cap bearing an extremist symbol.
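The first two tactics, near-identical usernames and alternative hashtag spellings, are in principle detectable with basic fuzzy string matching. The sketch below is purely illustrative, assuming a hypothetical blocklist and threshold; it is not TikTok’s actual moderation pipeline.

```python
# Hypothetical sketch: flagging near-duplicate usernames and hashtag
# variants with edit-distance matching. The blocklist, threshold, and
# example names are invented for illustration; this is not TikTok's
# real moderation system.
from difflib import SequenceMatcher

BLOCKED_TERMS = ["exampleban", "sonnenrad"]  # placeholder blocklist

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two lowercased strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_variants(candidates, threshold=0.8):
    """Yield (candidate, matched_term, score) for names close to a blocked term."""
    for name in candidates:
        for term in BLOCKED_TERMS:
            score = similarity(name, term)
            if score >= threshold:
                yield name, term, score

# A banned account remade as "examp1eban_2", or a hashtag respelled
# "s0nnenrad", still scores close to the original term.
for hit in flag_variants(["examp1eban_2", "s0nnenrad", "cookingtips"]):
    print(hit)
```

Even a naive matcher like this catches common character swaps; the threshold would need tuning to avoid false positives at platform scale.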
Conformity to the mainstream not only makes the videos harder to detect; it also demonstrates “the key role that TikTok’s own features play in extremist content.”
The platform’s engagement rate and time spent per user are very high for an app of its kind; the combination of brevity, catchy music, and appealing editing techniques has proven extremely effective at capturing and holding attention, particularly among young users. Users aged four to fifteen spend an average of 80 minutes a day on the app.
It’s addictive because the algorithm is so catered to you. You’re always just one scroll from the video that could change your life.
— A fifteen-year-old TikTok user
As moderation of extremist views continues across social media platforms, the methods extremists use to encode messages into their content and evade detection will only become more advanced.
Of the videos analysed in the report, 150 featured strategically embedded extremist symbols. Watermarks directing users to white-supremacist Telegram channels were common. One group of posts about Hyperborea, the mythical Aryan homeland, flashed single frames of Joe Biden standing beneath a Sonnenrad, the “black sun” that has become functionally equivalent to the swastika in some white-supremacist circles.
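Single-frame flashes defeat human review, since a moderator watching at normal speed never consciously sees them, but they are tractable for automated frame-level analysis. The following is a minimal sketch assuming OpenCV and a library of reference images of known symbols; it is hypothetical, not any platform’s real detection system.

```python
# Hypothetical sketch: catching single-frame "flash" insertions by
# hashing every frame and comparing against reference images of known
# symbols. File paths, the hash scheme (a simple average hash), and the
# distance threshold are illustrative only.
import cv2  # OpenCV: pip install opencv-python
import numpy as np

def average_hash(img, size=8):
    """64-bit average hash: greyscale, downscale, threshold at the mean."""
    grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(grey, (size, size))
    return (small > small.mean()).flatten()

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return int(np.count_nonzero(h1 != h2))

def scan_video(path, reference_hashes, max_distance=10):
    """Flag any frame whose hash is close to a known-symbol hash."""
    hits = []
    cap = cv2.VideoCapture(path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_hash = average_hash(frame)
        for name, ref in reference_hashes.items():
            if hamming(frame_hash, ref) <= max_distance:
                hits.append((frame_idx, name))
        frame_idx += 1
    cap.release()
    return hits

# Usage (paths are placeholders):
# refs = {"sonnenrad": average_hash(cv2.imread("known_symbol.png"))}
# print(scan_video("suspect_clip.mp4", refs))
```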
It is possible to envisage a future extremist language encoded entirely within the run-of-the-mill, bright-smiled dance and lip-sync content that has produced some of the platform’s highest-earning creators.
TikTok’s moderation techniques must become more sophisticated to keep pace with this trend if the platform hopes to have any effect on the dissemination of extremist ideologies.
Part of this effort must address the “rabbit-hole” effect that certain content produces. A report from the U.S. group Media Matters notes that TikTok’s algorithm actively pushes users down extremist rabbit holes.
The algorithm suggests content based on what similar users have enjoyed, and niche content with a highly engaged audience can exert an outsized pull on those suggestions. One teenager I spoke to while writing this article explained that “It’s addictive because the algorithm is so catered to you.”
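The teenager’s intuition matches the basic shape of collaborative filtering, in which a recommender scores unseen items by the engagement of similar users. The sketch below is a minimal, hypothetical illustration of that mechanism, not TikTok’s proprietary recommender; the users, videos, and engagement values are invented.

```python
# Minimal, hypothetical sketch of user-based collaborative filtering.
# TikTok's real recommender is proprietary and far more complex; all
# data below is invented for illustration.
from math import sqrt

# Each user's engagement (e.g. share of watch time) with videos A, B, C, D.
engagement = {
    "alice": {"A": 1.0, "B": 0.8, "C": 0.0, "D": 0.0},
    "bob":   {"A": 0.9, "B": 0.7, "C": 0.9, "D": 0.0},
    "carol": {"A": 0.0, "B": 0.1, "C": 0.8, "D": 0.9},
}

def cosine(u, v):
    """Cosine similarity between two users' engagement vectors."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def recommend(target, data):
    """Score videos the target hasn't watched, weighted by how similar
    each other user is to the target."""
    scores = {}
    for other, vec in data.items():
        if other == target:
            continue
        sim = cosine(data[target], vec)
        for video, value in vec.items():
            if data[target].get(video, 0) == 0 and value > 0:
                scores[video] = scores.get(video, 0) + sim * value
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Alice hasn't watched C or D. Bob, who behaves much like Alice,
# engaged heavily with C, so C dominates her recommendations.
print(recommend("alice", engagement))
```

In a toy model like this, a single strongly engaged “neighbour” can dominate the scores: the rabbit-hole dynamic in miniature.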
The dynamic is familiar from YouTube, where watching a single Jordan Peterson video can open onto an entire world of content aimed at alienated white men.
This algorithmic dynamic ensures the continued dissemination of extremist content despite whatever disruption moderation may cause to individual accounts and hashtags.
With an estimated 1.1 billion monthly active users and a format proven to draw people in, TikTok bears an enormous responsibility to ensure that the vulnerable young people who form the bulk of its user base are not channelled into feeds filled with hate speech.
As the fifteen-year-old explained to me, on TikTok, “you’re always just one scroll away from the video that could change your life.”
Editor’s Note: The opinions expressed here by Impakter.com columnists are their own, not those of Impakter.com. — In the Featured Photo: TikTok logo. Featured Photo Credit: Jayanti Devi.