
Meta Clears Indonesia’s PP Tunas Child Protection Review, While TikTok and Roblox Remain Partially Cooperative

M. Ishaq Firdaus, Apr 13, 2026

Executive Summary

Indonesia’s Ministry of Communication and Digital Affairs (Komdigi) has stated that Meta has met the child-protection requirements assessed under PP Tunas, while TikTok and Roblox are currently categorized as "partially cooperative." According to the regulator, the latter two platforms have submitted written commitments and are still making gradual adjustments to align with the rule.

For companies operating digital platforms, this is an important signal: child-safety governance is no longer a secondary policy issue. It is increasingly becoming part of active regulatory scrutiny in Indonesia.

What Happened

Based on Bloomberg Technoz reporting, Komdigi’s Director General for Digital Space Oversight, Alexander Sabar, said Meta had fulfilled its child-protection obligations and was declared compliant with the applicable PP Tunas policy requirements.

By contrast, TikTok and Roblox were described as only "partially cooperative." The regulator said both companies had submitted written commitments and were in the process of implementing gradual adjustments to comply with Government Regulation No. 17/2025 on Electronic System Governance for Child Protection (PP Tunas).

Why This Matters

This is more than a platform-specific enforcement update. It shows that Indonesia is beginning to translate child-protection regulation into platform-level compliance expectations. For digital businesses, that means regulators may increasingly assess not only stated policies, but also the operational effectiveness of safety controls, governance structures, and implementation timelines.

Bitlion View

From Bitlion’s perspective, this development is an early sign of a more operational enforcement environment for child-related digital compliance in Indonesia. The distinction between "compliant" and "partially cooperative" matters because it suggests regulators are willing to evaluate each platform's progress in concrete terms rather than treating all platforms the same.

For organizations in technology, social platforms, gaming, digital communities, and other user-generated ecosystems, this means child protection should be treated as a governance issue, not only a content-moderation issue.

What Companies Should Do Next

  1. Review child-safety controls across product, policy, moderation, and reporting workflows.
  2. Assess regulatory readiness against the requirements and intent of PP Tunas.
  3. Document implementation status so the company can show evidence of action, not just policy language.
  4. Strengthen cross-functional ownership between legal, compliance, policy, product, trust and safety, and engineering teams.
  5. Prepare for regulator scrutiny that may increasingly focus on practical outcomes and remediation progress.

Closing Note

Meta’s reported compliance status, compared with TikTok and Roblox’s partially cooperative position, illustrates how Indonesia’s digital child-protection framework is beginning to shape platform accountability in practice. For companies, the message is straightforward: regulators are not only asking whether commitments exist, but whether they are being implemented in a credible and timely way.

