Apr 13, 2026
Indonesia’s Ministry of Communication and Digital Affairs (Komdigi) has stated that Meta has met the child-protection requirements assessed under PP Tunas, while TikTok and Roblox are currently categorized as "partially cooperative." According to the regulator, the latter two platforms have submitted written commitments and are still making gradual adjustments to align with the regulation.
For companies operating digital platforms, this is an important signal: child-safety governance is no longer a secondary policy issue. It is increasingly becoming part of active regulatory scrutiny in Indonesia.
According to reporting by Bloomberg Technoz, Komdigi’s Director General for Digital Space Oversight, Alexander Sabar, said Meta had fulfilled its child-protection obligations and was declared compliant with the requirements of PP Tunas.
By contrast, TikTok and Roblox were described as only "partially cooperative." The regulator said both companies had submitted written commitments and were in the process of implementing gradual adjustments to comply with Government Regulation No. 17/2025 on Electronic System Governance for Child Protection (PP Tunas).
This is more than a platform-specific enforcement update. It shows that Indonesia is beginning to translate child-protection regulation into platform-level compliance expectations. For digital businesses, that means regulators may increasingly assess not only stated policies, but also the operational effectiveness of safety controls, governance structures, and implementation timelines.
From Bitlion’s perspective, this development is an early sign of a more operational enforcement environment for child-related digital compliance in Indonesia. The distinction between "compliant" and "partially cooperative" matters because it suggests regulators are willing to evaluate progress in concrete terms rather than treating all platforms uniformly.
For organizations in technology, social platforms, gaming, digital communities, and other user-generated ecosystems, this means child protection should be treated as a governance issue, not only a content-moderation issue.
Meta’s reported compliance status, compared with TikTok and Roblox’s partially cooperative position, illustrates how Indonesia’s digital child-protection framework is beginning to shape platform accountability in practice. For companies, the message is straightforward: regulators are not only asking whether commitments exist, but whether they are being implemented in a credible and timely way.