Movies, music and video games have long received content ratings to shield kids from mature media. Films featuring sex scenes or gory violence are rated “R,” while albums full of curse words are slapped with the “Parental Advisory” label.
Nothing like that exists in the Wild West of user-generated social media. But TikTok on Wednesday said it is building something similar: a new system to organize content based on thematic maturity. In the coming weeks, the Culver City-based company will roll out an early version, with the goal of preventing “overtly mature themes” from reaching teens. TikTok is calling it “Content Levels.”
“Many people will be familiar with similar systems from their use in the film industry, television, or gaming and we are creating with these in mind while also knowing we need to develop an approach unique to TikTok,” Cormac Keenan, TikTok’s head of Trust and Safety, wrote in a blog post.
The company said it will assign videos a “maturity score” when it detects content that has “mature or complex themes.” As an example, TikTok said frightening or “intense” fictional scenes could receive a maturity score.
That will help block people under the age of 18 from viewing those videos, according to TikTok. The firm shared screenshots showing “age protected” posts flagged as “unavailable” to younger users. For now, the social media giant said it is focused on “safeguarding the teen experience,” but it eventually plans to offer more detailed content filtering options for all users.
[Image: A screenshot showing an “unavailable” post under TikTok’s new Content Levels system. Image courtesy of TikTok.]
TikTok’s new Content Levels come as social media platforms face scrutiny over how their apps can be harmful to kids. Federal lawmakers in Washington have grilled tech executives about child safety, while state attorneys general are investigating social media giants over how their design, operations and promotional features could be bad for kids. News reports and lawsuits have said TikTok has fed teens videos depicting eating disorders, dangerous viral “challenges” and other damaging content.
The company has already taken some steps to separate content for teens and adults. TikTok is testing a new setting to let users restrict livestreams to viewers who are 18 and older. The company also updated content rules aimed at combating harmful content, such as preventing viral hoaxes, shielding the LGBTQ community from harassment and removing videos promoting unhealthy eating.
In addition to the forthcoming maturity scores, TikTok announced Wednesday that it is rolling out a tool for people to filter out videos with words or hashtags they don't want to see in their feeds. The company said it has also worked to avoid flooding users with similar videos on topics that could be problematic when seen repeatedly, such as dieting, sadness and other well-being issues.
A TikTok spokesperson did not detail what the company’s guidelines for maturity scores will look like, such as whether videos containing violence or profanity will be automatically age-restricted. TikTok users won’t be able to appeal their videos’ maturity scores in the first version of Content Levels, the spokesperson added. That could upset some creators, since such restrictions would presumably limit their videos’ reach. The spokesperson said the firm will listen to feedback over the coming months before making adjustments.
But the biggest question of all may be how effective Content Levels will actually be at shielding kids from mature content. Despite the best efforts of parents, plenty of kids still find ways to watch “R”-rated movies and play “M”-rated video games. Teens will likely try to do the same on TikTok.