TikTok has declared war on fake news. The Chinese social network has rolled out a system of warning messages flagging videos that include unverified content. This new feature aims to reduce the spread of videos containing inauthentic, misleading or false content by prompting users to think before they share.
"Caution: Video flagged for unverified content." That's the message that users in North America will now see on certain TikTok videos. The Chinese platform has announced measures to bolster its fight against fake news published and shared by its users. The feature launched in Canada and the US on Wednesday, February 3, and should roll out worldwide in the coming weeks, TikTok said.
In short, if the information in a video cannot be officially validated by fact-checkers, TikTok will flag it with a warning message alerting users to the unverified content. The move is a significant precaution at a time when fake news can quickly go viral, particularly on subjects such as the Covid-19 pandemic and vaccines.
The creator of the video in question will also be notified of the decision to flag their video for unsubstantiated content and reminded of the platform's rules of use. While the video will remain available on the application, users will be alerted to its unverified content when they try to share it. A message will appear reminding them that the video has been flagged, prompting them to pause and consider their next move before choosing to "cancel" or "share anyway." This way of discouraging users from sharing unverified information appears to work: when testing the approach, TikTok saw viewers share flagged videos 24% less often, while likes on this kind of unsubstantiated content also decreased by 7%.
Twitter previously rolled out a similar function around the time of the US presidential election. As a result, the platform saw a reduction in retweets of posts featuring information flagged as dubious.
TikTok has partnered with fact-checkers at PolitiFact, Lead Stories and SciVerify to help assess the accuracy of content, although the social network hasn't stated exactly how many videos are checked each day or how the fact-checkers choose which videos to investigate. Note that videos that infringe TikTok's community guidelines are automatically deleted before they are even published.