[TMT] TECH Alert: Protecting Children in the Age of Invisible Frontiers: Age Verification and Child Safety under the Online Safety Act 2025

INTRODUCTION

As technology evolves at an unprecedented pace, so too do the harms that accompany it – children, in particular, remain among the most vulnerable in an increasingly dangerous digital world. On 1 January 2026, the Online Safety Act 2025 (“OnSA”) came into force, placing enforceable duties on Licensed Service Providers to manage harmful content and safeguard the vulnerable from exploitation and abuse.

Yet the Grok controversy that followed shortly after the enactment of OnSA laid bare what a broad legislative framework alone cannot achieve: duties enshrined in law must be matched with clear, practical guidance on how to fulfil them.

On 12 February 2026, the Malaysian Communications and Multimedia Commission (“MCMC”), exercising its powers under section 80 of OnSA, published for public consultation two (2) draft codes designed to give OnSA’s duties practical force, namely:

(i) the Draft Risk Mitigation Code; and
(ii) the Draft Child Protection Code.

 

The consultations, which closed on 13 March 2026, were intended to solicit views from industry players, civil society organisations, consumer groups and members of the public on, inter alia, the overall regulatory approach and the practical implementation of the draft codes.

As at the date of publication of this article, MCMC has not released the feedback received.

Nevertheless, the outcome of these consultations will inform the final form of these instruments and, with them, the enforceable compliance obligations that Licensed Service Providers in Malaysia will be required to meet.

In this article, we will briefly recap the key areas covered by the Draft Child Protection Code and examine the potential challenges that the proposed U-16 ban may face.


DRAFT CHILD PROTECTION CODE AND ITS CHILD PROTECTION IMPERATIVE
Drawing from the regulatory approaches of various jurisdictions, including Australia and the United Kingdom, the Draft Child Protection Code (“CPC”) is intended to supplement the duties imposed under section 18 of OnSA, specifying the measures that Licensed Service Providers must implement to ensure the safe use of their services by child users.

The CPC zeroes in on five principal areas, each carrying its own set of compliance obligations for Licensed Service Providers, which are as follows:

 

Content Moderation
Licensed Service Providers are required to proactively detect and remove harmful content before it reaches children, maintain child-friendly reporting channels, and respond promptly to removal requests from the MCMC or enforcement agencies.

Parental Controls
Licensed Service Providers are required to make available parental control features that enable parents to monitor and adjust their child’s platform activity in line with the child’s age, developmental stage, and evolving capacity.

Privacy and Safe Settings
Licensed Service Providers are required to set default privacy and safety settings for child accounts to the most restrictive level, restrict direct messaging between children and unknown adults, prevent exposure to manipulative design features, and implement clear child-friendly reporting mechanisms for harmful content.

Search and Recommendation Systems
Licensed Service Providers are required to ensure that search and recommendation algorithms are designed and operated to actively filter harmful content from child users’ feeds, with child users and their parents given accessible tools to manage personalised recommendation systems.

Age Verification and U-16 Ban
Where a service is likely to be accessed by child users, Licensed Service Providers are required to implement effective age verification measures to ensure that only users aged 16 years and above are permitted to register, use, and access any feature of the service appropriate for their age. Critically, this verification must be conducted against Government-issued records, moving well beyond the self-declaration methods that have proven easy to circumvent.

The CPC further requires that any personal data collected for the purpose of age verification be processed strictly in accordance with the Personal Data Protection Act 2010, ensuring that the introduction of verification mechanisms does not inadvertently create new data privacy risks for the children they are designed to protect.

 

POTENTIAL CHALLENGES TO THE U-16 BAN

Of the five principal areas covered by the CPC, none is more significant or far-reaching than the mandatory age verification requirement. This requirement constitutes a blanket prohibition on access to platforms by individuals under the age of 16, comparable to the measure implemented in Australia.

A measure of such sweeping breadth naturally invites examination through the lens of fundamental liberties. In Australia, the U-16 ban was met with a constitutional challenge when the Digital Freedom Project commenced legal action in the High Court in November 2025, arguing that it violates the implied freedom of political communication under the Australian Constitution.

While the constitutional frameworks of Malaysia and Australia differ materially, such proceedings illustrate the kinds of questions that a Malaysian U-16 ban may give rise to. In Malaysia, freedom of speech and expression is guaranteed to every citizen under Article 10(1)(a) of the Federal Constitution. That said, Article 10(2)(a) expressly permits Parliament to restrict this right by law, on grounds including public order or morality. A measure designed to protect children from online harm could plausibly engage either of these grounds, meaning that the constitutional case against the U-16 ban is not, on its face, straightforward.

The more pointed questions, in the Malaysian context, touch on both constitutionality and legality; and on these fronts, two distinct concerns arise.

The first concerns scope. Section 18 of OnSA, the provision under which the MCMC seeks to implement the ban, envisions measures to ensure the “safe use” of services by child users. This formulation is more naturally read as contemplating regulated access rather than outright prohibition. A blanket ban may therefore exceed what section 18 actually authorises, raising the question of whether the MCMC has acted beyond its enabling provision.

The second concerns form. Article 10(2)(a) of the Federal Constitution makes clear that any restriction on free speech must be imposed by Parliament. A Code is a subordinate instrument; it derives its authority from a parent Act and is issued by a regulatory body. The question that follows is whether a measure that restricts a constitutionally guaranteed right can be validly imposed through a Code, or whether it demands a more express parliamentary mandate in the form of primary legislation.

 

CLOSING REMARKS

The OnSA and the draft codes proposed by the MCMC reflect Malaysia’s commitment to keeping pace with the evolving digital landscape, particularly in safeguarding children from online harm. The proactive steps taken in this regard are to be acknowledged.

That said, good law must be more than well-intentioned; it must also be constitutionally and procedurally sound. The legal questions surrounding the U-16 ban warrant careful consideration as the CPC moves towards finalisation, for how they are resolved will determine not only whether the ban achieves its protective purpose, but whether it does so within the boundaries that the law prescribes.

 

If you have any queries, please contact Senior Associate Harvey Ng Yih Xiang (nyx@lh-ag.com), Associate Kamilia Cheng (kca@lh-ag.com), or their Team Partner, G. Vijay Kumar (vkg@lh-ag.com).
