Thursday, April 18, 2024

TikTok Removed More Than 300,000 Videos for Spreading Misinformation on US Election

TikTok removed more than 340,000 videos in the United States for violating the platform’s policies on election misinformation.

This emerged from Wednesday’s transparency report, which covered the second half of last year.

A few months before the 2020 presidential election, TikTok announced it would work closely with fact-checking organizations to combat misinformation about COVID-19 and the election.

At the time of the announcement, TikTok was under tremendous pressure from the Trump administration and lawmakers over alleged ties to the Chinese government.

Microsoft and Oracle were among the companies competing to acquire TikTok from its Chinese parent company, ByteDance. A final agreement between Oracle and TikTok is still pending.

“TikTok is a diverse global community driven by creative expression. We work to create an environment where everyone can feel safe, connected, and have fun,” said Michael Beckerman, TikTok vice president and US policy director, in a statement on Wednesday.

“We’re committed to being transparent about how we protect our platform, as this helps build trust and understanding in our community.”

Beyond the videos removed for violating its election policies, TikTok said on Wednesday that more than 441,000 videos were made ineligible for the platform’s recommendation algorithm.

Just before the 2020 election, TikTok released an election guide powered by BallotReady, a voting information tool.

The guide was accessible from the Discover page, as well as through election-related videos, hashtags, and political content. It was visited 18 million times.

TikTok also deleted 1,750,000 accounts “used for automation” during the 2020 US election period, though it is not known whether any of those accounts were used specifically to amplify election-related content.

“It was important to remove those accounts to protect the platform at this critical time,” the transparency report said.
