Report: Hateful Content Is 'Pervasive' on TikTok
Kristin Snyder is dot.LA's 2022/23 Editorial Fellow. She previously interned with Tiger Oak Media and led the arts section for UCLA's Daily Bruin.
TikTok can help amplify harmful ideologies, according to a new report.
The Global Network on Extremism and Technology found that videos promoting anti-Black rhetoric, ethno-nationalism and white supremacy have become “pervasive” on the Culver City-based app. After analyzing the manifesto written by the shooter who killed 10 Black people in Buffalo, researchers found videos espousing similar ideologies on TikTok.
Alternative social media platforms have long been scrutinized for fostering alt-right ideologies; 4chan, an anonymous message board, has been linked to the Buffalo shooting as well as to a recent shooting in Indiana. But according to the report, larger platforms like TikTok are also rife with such content.
“The role of mainstream platforms like TikTok in promoting these violent ideologies is largely overlooked,” researcher Abbie Richards wrote. “The hateful ideologies which motivate attacks like the one in Buffalo are not unique to alternative platforms and message boards.”
The Chinese-owned app’s rapid growth and large, young user base have allowed “white supremacist and militant accelerationist content” to spread quickly. According to the report, the platform amplifies these harmful ideologies primarily through memes and videos, and content echoing the Buffalo shooter’s ideology is prevalent. Accounts that go viral promoting racist content sometimes urge their followers to join platforms that are less heavily moderated.
TikTok’s community guidelines ban content promoting “violent acts or extremist organizations or individuals.” The report indicates such content still slips through the cracks. Users can easily create new hashtags to replace banned ones, like #Industrialrevolutionanditsconsequences replacing #TedKaczynski and #TedPilled to promote eco-fascist and anarcho-primitivist ideologies. White supremacists often include the number 14 or the phrase “14 words” in their usernames or bios, a reference to a slogan popular among the group.
TikTok was previously used to promote violence leading up to the January 6 insurrection, and anti-Semitic and racist content posted there often reaches millions of users. In just the first three months of 2021, the app removed 300,000 videos featuring “violent extremism.” A new lawsuit also claims that TikTok’s algorithm is more likely to push violent videos to minority users than white users.
Beyond the findings of this report, TikTok has a broader content moderation problem: the company currently faces lawsuits from moderators who claim the work caused them psychological distress, and from parents who say the algorithm recommended dangerous challenge videos to children who died while attempting them.
- TikTok is The Most Downloaded App - dot.LA ›
- Ex-TikTok Employee Criticizes '996' Workplace Culture - dot.LA ›
- TikTok's Algorithm Spreads Misogynistic Videos - dot.LA ›