KARACHI: TikTok has made its community guidelines available in Urdu for users in Pakistan, the company said on Thursday, just two weeks after representatives of the Chinese video-sharing app and the Pakistan Telecommunication Authority (PTA) held discussions over "obscene" content on the platform.
The PTA had issued a final warning to the social media company on July 21, asking it to moderate content "within legal and moral boundaries, in accordance with the laws of the country."
A press release issued today states that the provision of the community guidelines in Urdu is aimed at "maintaining a supportive and welcoming environment for Pakistani users" and at providing more space for fun and creative expression as the app grows increasingly popular in the country.
"To address this issue, TikTok has released updated Community Guidelines in Urdu that will help maintain a favourable and welcoming environment on TikTok for Pakistani users," the company said.
It pointed out that the Community Guidelines provide guidance on "what is and is not allowed on the platform, keeping TikTok a safe place for creativity and fun," and are localised and implemented in accordance with local laws and regulations.
"TikTok teams will remove content that violates the community guidelines and suspend or ban accounts involved in serious or repeated violations.

"Content moderation is done by implementing a combination of moderation policies, technologies and strategies to detect and assess problematic content and accounts, and to apply appropriate sanctions," it added.
Pakistan among the top markets for removed videos
The Chinese app also noted that, according to its latest transparency report, Pakistan was "one of the top 5 markets with the highest number of videos removed from TikTok for violating the community guidelines or terms of service."
“This demonstrates TikTok’s commitment to removing potentially dangerous or inappropriate content reported in Pakistan.”
TikTok said its systems are configured to automatically flag certain types of content that violate the community guidelines, "so it can act quickly and mitigate potential harm."

"These systems take into account things like patterns of behaviour or signals to flag potentially violating content," it added.
Citing the limits of technology alone in enforcing its policies, it stressed that "context may be important in determining whether certain content, such as satire, is in violation".

"In some cases, these teams remove evolving content or trends, such as dangerous challenges or harmful misinformation," it said.
The app stated that content moderation is also based on reports from TikTok users, who use the in-app reporting feature "to flag potentially inappropriate content or accounts on TikTok".