In its newly published Transparency Report, TikTok discloses how many videos were deleted in the second quarter of 2021 because they violated the app’s guidelines.

Like all social media platforms, TikTok faces the challenge of deleting inappropriate content before it spreads through the app. The Community Guidelines and Terms of Service specify which content is allowed on the platform and which is not. In the Community Guidelines Enforcement Report for the second quarter of 2021, TikTok now sets out the measures it has taken to remove unwanted content from the platform.

81 million videos deleted due to problematic content

Between April and June 2021, TikTok reports, 81,518,334 videos were removed for violating the platform’s guidelines, which corresponds to less than one percent of the videos uploaded in that period. A key metric is the time it takes to remove a video from the platform: TikTok is designed to let content go viral quickly and display it to a large number of users, so the company works to proactively delete violating content even before it is reported. According to the report, 93 percent of deleted videos were identified and removed within 24 hours of being posted, 94.1 percent were removed before another user reported them, and 87.5 percent were never seen by any user of the platform.

Decisions are made by automated systems and human moderators

Both here and especially in the detection of hateful behavior, bullying, and harassment, TikTok shows an improvement over the first quarter: 73.3 percent of harassment and bullying videos were removed before being reported, compared with 66.2 percent in the first quarter of the year. According to TikTok, this is due to improved systems that proactively identify hate symbols, words, and other abuse signals and flag them for further review by its safety teams. These teams are trained regularly; after all, without context it can often be difficult to distinguish, for example, between appropriation and insult, or between satire and bullying.

Each platform has to find its own way to handle content moderation properly. Not only the timely deletion of problematic content poses a challenge; the unjustified removal of legitimate content is also a minefield that has to be navigated. A few months ago, the creator Ziggi Tyler went public after TikTok blocked the phrase “Black Lives Matter” in his profile. All of his attempts to include the word “Black” in his bio failed, while phrases like “neo nazi” or “white supremacy” were accepted. Stories like Tyler’s show that even though TikTok is making progress, not every decision it makes is the right one. Other platforms are also struggling with the volume of content uploaded every day: YouTube recently deleted two videos from the #allesdichtmachen campaign, and the case landed before the Regional Court of Cologne, which ruled that YouTube had removed the videos too hastily.

With a user base growing as quickly as TikTok’s in particular, finding the right way to deal with unwanted content is likely to become harder rather than easier for the platform. The app only passed one billion active users at the end of September.
