Users will soon be able to block certain kinds of videos they no longer want to see from their tailored For You Page. A new setting will filter out videos containing words or hashtags you don’t want to appear in your algorithm-driven feed. Similar to muting keywords and hashtags on Twitter, this option gives users more control over their content than before. While you could already “dislike” a video on your For You Page, that was no guarantee you wouldn’t see similar videos again; the filter setting should actually prevent videos on topics you aren’t interested in from reaching you.

TikTok will also be altering content moderation on its end, making progress on flagging mature content that should not reach younger users. Videos may now carry Content Level rankings that warn about or restrict mature content for minors, provided TikTok has an accurate understanding of a user’s age.

In addition, TikTok is taking measures to identify problematic trends and limit how many videos tied to such a trend, specifically content damaging to mental health or otherwise harmful, appear on someone’s For You Page. A single video of this kind might be fine on its own, but the idea is that being fed multiple videos on a problematic topic could be troubling to a user’s experience on the app. The Verge cites dieting and depression-related content as examples. A rough sketch of how these two controls might combine is shown below.
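TikTok hasn’t published any implementation details, but conceptually the keyword filter and the per-topic limit both act as a filter pass over candidate videos before they reach the For You Page. Here is a minimal, purely illustrative sketch in Python; the Video and FeedFilter structures, the topic labels, and the cap of two videos per topic are all assumptions made for the example, not anything TikTok has described:

```python
# Illustrative sketch only: how a user keyword/hashtag filter and a
# per-topic frequency cap could gate a batch of recommended videos.
# All names and thresholds here are hypothetical, not TikTok's actual system.

from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Video:
    video_id: str
    caption: str
    hashtags: set[str]
    topic: str  # e.g. a label assigned upstream by a content classifier


@dataclass
class FeedFilter:
    blocked_words: set[str]       # words the user opted to filter out
    blocked_hashtags: set[str]    # hashtags the user opted to filter out
    topic_cap: int = 2            # max videos per sensitive topic per batch
    _topic_counts: Counter = field(default_factory=Counter)

    def allows(self, video: Video) -> bool:
        caption_words = {word.lower() for word in video.caption.split()}
        # 1. User-chosen keyword/hashtag filter: drop any match outright.
        if caption_words & self.blocked_words:
            return False
        if video.hashtags & self.blocked_hashtags:
            return False
        # 2. Frequency cap: let a sensitive topic through only a few times
        #    per batch, so no single theme (e.g. dieting) dominates the feed.
        if self._topic_counts[video.topic] >= self.topic_cap:
            return False
        self._topic_counts[video.topic] += 1
        return True


if __name__ == "__main__":
    candidates = [
        Video("v1", "my new workout plan", {"fitness"}, "dieting"),
        Video("v2", "extreme diet results", {"weightloss"}, "dieting"),
        Video("v3", "what I eat in a day", {"diet"}, "dieting"),
        Video("v4", "cat knocks over vase", {"cats"}, "pets"),
    ]
    feed = FeedFilter(blocked_words={"extreme"}, blocked_hashtags={"weightloss"})
    print([v.video_id for v in candidates if feed.allows(v)])  # ['v1', 'v3', 'v4']
```

A production system would presumably match on classifier output and normalized text rather than raw caption words, but the shape of the logic, a per-user blocklist plus a per-topic counter, is the idea the announcement describes.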
TikTok content moderation — why it’s important
Parents in California recently filed a lawsuit against TikTok claiming the platform was responsible for the deaths of two young users. According to the New York Times, the suit alleges the girls, ages 8 and 9, died after participating in a dangerous challenge presented to them on the platform.

It should be noted that TikTok has an age restriction barring users younger than 13 from creating accounts; however, it’s possible to lie about your age when setting up a profile. If an impressionable user sees a single video of a problematic trend rather than multiple videos of it, they seem less likely to be persuaded to participate. Of course, if a trend is outright dangerous, TikTok’s content moderation shouldn’t let even one video reach a teenager (or someone even younger).

Perhaps part of the new content moderation will use a user’s viewing habits to determine whether they’re actually old enough to use TikTok, even if their account information suggests otherwise. Given TikTok’s enormous scale, making changes that keep audiences safe should be a priority. It’s encouraging to see some adjustments rolling out, even though some may argue it’s too little, too late. You can read the entire blog post on the incoming TikTok changes here.