TikTok Introduces Account Strikes to Improve Moderation System

Camden Price

2023-02-03


TikTok recently announced an update to its moderation system that introduces account strikes, according to a post on the company's official website. The approach is similar to YouTube's community guidelines strikes, but with a few key differences. The new policy is aimed at improving the platform's moderation system and preventing bad actors from exploiting it.

Under the update, account strikes last for 90 days. If TikTok removes content, such as a video or comment, for violating a community guideline, the account behind it will receive a strike.

This is intended to help users better understand why their content was removed and how they can avoid similar mistakes in the future. Additionally, the new system will help TikTok better detect and prevent inappropriate content from being posted in the first place.

The update also includes measures to help users recover from account strikes, such as offering tips for creating content that complies with the community guidelines. Users can also appeal the decision if they believe their content was mistakenly removed.

Overall, the new policy aims to create a more user-friendly moderation system and to protect the platform against bad actors. Combined with the measures that help users recover from strikes, it should keep the platform safe and enjoyable for everyone.

What do you think of the new account strikes policy? Let us know in the comments!
