TikTok removes over 25 million videos in Pakistan for violating guidelines in Q2 2025

The platform says 99.7% of removed content was proactively identified, and 96.2% was removed within 24 hours of posting.

TikTok removed more than 25.4 million videos in Pakistan between April and June 2025 for violating its community guidelines, according to the platform’s Q2 2025 Community Guidelines Enforcement Report.

According to the report, 99.7% of the removed content in Pakistan was identified proactively, and 96.2% was taken down within 24 hours of being posted.

Globally, TikTok removed 189.5 million videos during the quarter, representing approximately 0.7% of all uploads. Of these, 163.9 million were removed using automated detection tools, while 7.4 million were later restored after further review.

The company also removed 76.9 million fake accounts and 25.9 million accounts believed to belong to users under the age of 13.

According to the report, 30.6% of removed videos contained sensitive or adult themes, 14% violated safety and civility standards, and 6.1% violated privacy and security policies. Additionally, 45% of content was flagged for misinformation, while 23.8% included AI-generated or edited media.

TikTok said the quarterly report highlights its ongoing efforts to ensure a safe digital environment and maintain transparency. “Regularly publishing enforcement reports reflects our commitment to transparency and community safety,” the company said.

Similarly, during the first quarter of 2025, TikTok removed nearly 25 million videos in Pakistan, according to its Q1 2025 Community Guidelines Enforcement Report, which covers activity from January to March.

According to the report, a total of 24,954,128 videos were removed in Pakistan for violating the platform’s community guidelines. The proactive removal rate in the country remained exceptionally high at 99.4%, and 95.8% of removed videos were taken down within 24 hours of posting.

The report further reveals that 30.1% of all videos removed globally contained sensitive or adult themes, making this the most common reason for enforcement.

Other categories included violations of privacy and security guidelines (15.6%), safety and civility standards (11.5%), misinformation (45.5%), and use of edited media or AI-generated content (13.8%).

TikTok said its quarterly enforcement reports are part of its ongoing commitment to transparency and accountability. The company noted that the reports are designed to help users, regulators and the general public better understand how content moderation is carried out at scale and what types of violations are most frequently addressed.
