All videos now pass through an automated screening system as part of the upload process, which flags potential policy violations for review by a member of the moderation team. A moderator then notifies the user if a violation is confirmed. But at TikTok's scale, that process leaves room for violating content to spread before the review is complete.
The platform is now working to close that gap, aiming to ensure that potentially infringing material does not reach viewers at all.
“Over the next few weeks, we will begin using technology to automatically remove certain types of violating content identified on upload, in addition to removals confirmed by our security team,” a company spokesperson said. “Automation is reserved for content categories where our technology has the highest degree of accuracy.”
Rather than letting potential violations through while review is pending, TikTok's system now blocks them at upload, preventing them from being seen in the first place.
This process may concern some content creators, but TikTok says its detection systems have proven highly accurate.
“We found the false positive rate for automatic removals to be 5 percent, and requests to appeal a video's removal have remained unchanged,” the spokesperson said. “We hope to continue improving our accuracy over time.”
TikTok limits harmful content
Beyond improving the overall experience on TikTok, the platform hopes the update will also support the well-being of its security team by reducing the volume of distressing videos moderators must watch, freeing them to spend more time on highly contextual and nuanced cases.
The platform is also introducing a new display of account violations and reports to improve transparency and discourage users from crossing the line. The new system shows each user the violations they have accumulated, and new warnings appear in different areas of the app as reminders.
Penalties escalate from initial warnings to full bans for repeat violations, while for more serious offenses TikTok removes accounts automatically and can also block the device to prevent new accounts from being created.
These are important measures, especially given the platform’s young user base.
Internal data reported by the New York Times last year showed that about a third of the platform's user base is 14 years old or younger, underscoring the significant risks young users may face on the app.
The platform has faced several investigations on this front, including temporary bans in some regions over its content.