by Almira Louise S. Martinez, Reporter
TikTok, a short-form video social media platform, is encouraging its Filipino users to report misinformation and harmful content on the platform ahead of the Philippines’ 2025 midterm elections.
“In a global community, it’s natural for people to have different opinions, but our goal is to operate on a shared set of facts and reality,” Peachy A. Paderna, Philippine Public Policy Manager at TikTok, told reporters on Thursday.
In January, the social media platform launched an in-app Philippine Elections Center in partnership with the Commission on Elections (COMELEC), the National Citizens’ Movement for Free Elections (NAMFREL), and the Legal Network for Truthful Elections (LENTE) to curb the spread of misinformation and promote reliable, trustworthy election-related content.
The Philippine Elections Center houses verified “critical election resources,” such as voting procedures, polling locations, key election dates, and other essential election information.
Peachy Paderna, Philippine Public Policy Manager at TikTok. | photo by Almira Louise S. Martinez, BusinessWorld
According to Ms. Paderna, the platform’s community guidelines are built on three key themes to ensure the safety of its users: balancing harm prevention with expression, embracing human dignity, and ensuring that enforcement actions are fair.
“We also rely on the larger TikTok community to help us spot content that we may not have caught in the initial phase,” Ms. Paderna added.
Although the social media company employs more than 40,000 professionals and uses machine technology to moderate content, Ms. Paderna said users are still encouraged to report harmful content.
Users can find the in-app report button under the share feature of the platform. Violence, hate and harassment, self-harm, nudity, and misinformation are some of the available reasons to file a report.
“We want to make sure that our community of users stays safe even as we promote the diversity of ideas on the platform,” she said. “We do not allow misinformation that may cause significant harm to individuals or society regardless of intent.”
Reported accounts and videos
From July to September 2024, the platform took down 4.5 million videos in the Philippines, of which 99.7% were removed proactively for violating its community guidelines. In addition, 98% of the videos taken down were removed within 24 hours.
“When content is taken down or acted on by our enforcement team, that doesn’t necessarily mean that the [content creator’s] account will be taken down all the time,” Ms. Paderna said.
Getting banned on TikTok depends on the gravity of the violations. “Sometimes all it takes is one post, sometimes it takes multiple posts,” the TikTok executive added.
Violations are categorized by severity as either significant or moderate harm. Content that leads to severe forms of physical harm, such as life-threatening injury or death, falls under “significant harm.” Moderate harm, meanwhile, covers false or misleading content about treatments or prevention of health-related issues that does not pose life-threatening concerns.
Ms. Paderna noted that mass reporting does not increase the chances of a video or account being removed from the platform.
“One thing that we want to remind everybody is that it’s not a matter of people reporting one account,” she said. “We don’t need multiple reports to take down or pay attention to a violation.”
“We want to ensure that actions are fair so that when we take enforcement action, it’s always in a fair way, it’s always just, it’s always rational,” Ms. Paderna said.