TikTok’s algorithm pushes misogynistic content to young men, according to a new report.
An investigation by the Observer found that the video-sharing app, which has its headquarters in Culver City, spreads extreme anti-women videos. After creating a new account posing as an 18-year-old man, the Observer found that once the account watched videos aimed at men, including discussions of male emotions and podcasts hosted by men, TikTok's algorithm began suggesting increasingly misogynistic content.
Many videos featured Andrew Tate, a kickboxer turned internet star who has been criticized for misogynistic content. Some of Tate's videos blamed feminism for men’s misery, claimed men have no power and praised his girlfriend for being “well trained.”
Tate's videos have been a subject of debate since comedy duo Cody Ko and Noel Miller discussed the controversial social media figure on their podcast. Many of the podcast's fans expressed disappointment that the duo gave Tate a platform without disavowing his views. A new TikTok trend urges people to check whether people they know follow Tate on Instagram. Other popular social media stars, such as Hasan Piker (known as HasanAbi), have joined the discussion.
The Observer also found that TikTok recommended content from right-wing psychologist Jordan Peterson and men’s rights activists. It also promoted anti-mask videos.
TikTok is not the only social media platform found to promote misogynistic content to some users. Instagram came under fire when users attacked female influencers, and Twitter hosted anti-Amber Heard content during her defamation trial against her ex-husband, Johnny Depp.
Earlier this year, TikTok updated its community guidelines to better protect its users and clarified that videos featuring misogyny and other hateful ideologies would not be promoted on its "For You" page.
“Misogyny and other hateful ideologies and behaviors are not tolerated on TikTok, and we are working to review this content and take action against violations of our guidelines,” a TikTok spokesperson said. “We continually look to strengthen our policies and enforcement strategies, including adding more safeguards to our recommendation system.”
TikTok’s algorithm has come under scrutiny even as more social media platforms try to imitate it. A recent report found it can quickly spread harmful ideologies, such as white supremacy. Currently, the company faces a lawsuit alleging its algorithm directs dangerous videos to children. The company’s content moderators—who review potentially misogynistic content—have also spoken out against unfair working conditions.