TikTok is revamping how it moderates creator content. So far, the platform has more or less stuck to the time-tested method of traditional moderation: see what violates the guidelines based on reports and policies, and take action suitable for the violation. This, however, is not a very streamlined process.
The new account enforcement system was announced earlier this year and introduces several changes, which we will unpack in this article so you can stay in the know.
Let’s look at what this new system is and how it works.
What is the New Account Enforcement System?
This new system works on “strikes.” An account gets a strike for any content that violates TikTok’s Community Guidelines. The system is mainly concerned with repeat offenders: accounts that keep violating a specific policy, or policies within a particular category, even after multiple temporary bans.
Many creators have voiced legitimate fears that the current moderation system is not exactly clear. Creators who unintentionally post policy-violating content are either receiving harsher punishments than they deserve or being removed from circulation altogether without any reliable way to get back.
The new system aims to solve these issues as well.
By focusing more on repeat offenders, TikTok has shown its dedication to protecting well-meaning creators who unknowingly or only rarely violate the Community Guidelines.
How Does the New Account Enforcement System Work?
The new system works in a straightforward way. Here’s a summary (with a rough sketch of the logic after the list):
- Someone posts a policy-violating TikTok video, comment, or stream.
- A strike is issued to the account.
- If the account accrues a certain number of strikes, it is banned permanently.
- Each strike expires after 90 days.
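To make that flow concrete, here is a minimal sketch of a strike ledger with 90-day expiry. The class, method names, and the threshold value are illustrative assumptions; TikTok has not published its internal logic.

```python
from datetime import datetime, timedelta

STRIKE_LIFETIME = timedelta(days=90)  # strikes expire after 90 days
BAN_THRESHOLD = 3                     # hypothetical value; TikTok has not published one

class Account:
    def __init__(self) -> None:
        self.strikes: list[datetime] = []  # timestamps of issued strikes
        self.banned = False

    def active_strikes(self, now: datetime) -> int:
        # Only strikes issued within the last 90 days count.
        return sum(1 for t in self.strikes if now - t < STRIKE_LIFETIME)

    def issue_strike(self, now: datetime) -> None:
        # Record the strike, then check whether the account crosses the threshold.
        self.strikes.append(now)
        if self.active_strikes(now) >= BAN_THRESHOLD:
            self.banned = True  # permanent ban once the threshold is reached

# Usage: two strikes issued; by May, the January strike has already expired.
acct = Account()
acct.issue_strike(datetime(2023, 1, 1))
acct.issue_strike(datetime(2023, 2, 1))
print(acct.active_strikes(datetime(2023, 5, 1)))  # -> 1
```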
Now, the biggest question: what exactly is this certain number of strikes?
Well, that’s where it gets tricky, and for good reason.
Not all violations are equal. Spreading low-harm spam about a casino app, for example, is not comparable to promoting a hateful ideology or glorifying genocide.
Accordingly, different categories have different thresholds, and even different features can have different thresholds. For example, promoting a hateful ideology on a live stream is treated as a greater evil than promoting it in the comment section of a funny cat video.
A couple of things to note (both reflected in the sketch below):
- TikTok clarifies that it will issue a permanent ban on the first strike for severe violations. Severe violations, according to the official announcement, include (but are not limited to) “promoting or threatening violence, showing or facilitating child sexual abuse material (CSAM), or showing real-world violence or torture.”
- Accruing only a few strikes in each individual category or feature, but a large number when taken together, will still result in a permanent ban.
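These two notes turn the ban check into a layered decision rather than a single counter. Here’s a rough sketch of how such a decision could be composed; every category name, feature name, and threshold here is a hypothetical placeholder, not a disclosed TikTok value:

```python
# All category names, feature names, and numbers below are invented
# placeholders; TikTok has not disclosed its actual thresholds.
SEVERE_CATEGORIES = {"violent_threats", "csam", "real_world_violence"}
CATEGORY_THRESHOLD = {"hateful_ideology": 2, "spam": 5}    # per policy category
FEATURE_THRESHOLD = {"live": 2, "comment": 4, "video": 3}  # per product feature
TOTAL_THRESHOLD = 8                                        # cumulative cap overall

def should_ban(active_strikes: list[tuple[str, str]]) -> bool:
    """active_strikes: (category, feature) pairs for unexpired strikes."""
    by_category: dict[str, int] = {}
    by_feature: dict[str, int] = {}
    for category, feature in active_strikes:
        if category in SEVERE_CATEGORIES:
            return True  # severe violations: permanent ban on the first strike
        by_category[category] = by_category.get(category, 0) + 1
        by_feature[feature] = by_feature.get(feature, 0) + 1
    # Crossing the threshold within any one category or feature bans the account...
    if any(n >= CATEGORY_THRESHOLD.get(c, TOTAL_THRESHOLD) for c, n in by_category.items()):
        return True
    if any(n >= FEATURE_THRESHOLD.get(f, TOTAL_THRESHOLD) for f, n in by_feature.items()):
        return True
    # ...and so does a large total spread thinly across many of them.
    return len(active_strikes) >= TOTAL_THRESHOLD
```

The final cumulative check is what the second note describes: it catches accounts that spread violations thinly across many categories and features, which purely per-category counters would miss.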
How Do I Keep My Account Protected Under the New System?
If you’re not a Community Guidelines offender, you don’t need to worry. This is a revamp of the moderation system itself, not of the policies that define what constitutes a violation. If you don’t post content that violates any policy, you’re already in the clear. We’d still recommend going through the guidelines to refresh your memory!
That said, the meat of the discussion is that repeat offenders are the most at risk this time. Still, every creator should know the basics and keep their account in healthy standing.
It’s a two-step process:
- Check regularly for any strikes.
- If there is a strike, wait 90 days for it to expire before you post anything that could even remotely be linked to a violation. In other words, play it extra safe in your content planning and development for the next three months if you receive a strike.
You can check your account’s standing easily. TikTok has rolled out new features in the Security Center section of the app. There, you’ll find an “Account status” option, where you can see whether any of your published content has violations.
As usual, you can appeal any such enforcement. TikTok will be more than happy to reverse a strike that was issued incorrectly.
With this new system, TikTok will also notify you if you’re approaching the threshold (in other words, if you’re on the verge of being permanently banned) so you can remedy the situation if possible.
Why is TikTok Revamping its Account Moderation System?
TikTok is a very active social media platform. It has already captured 18% of the world’s internet users with 1 billion global active accounts. That’s more than Pinterest and Twitter combined!
As a platform that sees over 20 million video uploads per day and can influence what’s trending on Spotify in a matter of hours, its sheer influence on people’s minds cannot be overstated.
And that’s why a platform such as TikTok needs to be safe from content that can hurt people’s sentiments or glorify acts like violence or abuse.
Any popular social platform faces the constant challenge of moderating content and keeping the community free of the poison of harmful material. It’s a never-ending battle. And as repeat offenders grow in number or find new ways to violate policies, TikTok’s system must evolve to protect its users. That’s precisely what this step is: a paradigm shift that will make things more transparent and rigorous.
TikTok says that repeat offenders have a pattern.
Nearly 90% of repeat offenders violate a policy using the exact same feature, and over 75% violate within the same policy category. With this data, it would be foolish for TikTok not to build a new system that addresses the root of the problem and demonstrates the platform’s commitment to its users’ online safety.
Once this pattern is identified, it’s easy to put systems in place that can issue strikes automatically.
And with a streamlined yet transparent system working around the clock, the platform can take action faster and more efficiently.
In Conclusion
TikTok is a wildly popular app. Whether it’s lifestyle, beauty, media, travel, food, or sports, you’ll find high-quality and valuable content in almost any field and category.
What’s also true, sadly, is that offenders will misuse the platform’s reach to hurt or mislead unsuspecting users.
The platform is taking a step in the right direction. Hopefully, this new moderation system will make the process more streamlined, efficient, and most importantly, faster.