The United Arab Emirates (“UAE”) has introduced a new and wide-reaching framework for protecting children online under Federal Decree-Law No. 26 of 2025 Regarding Child Digital Safety (the “CDS Law”).
The CDS Law entered into force on 1 January 2026 and provides a one-year compliance grace period, with full enforceability expected from January 2027.
The CDS Law aligns with the UAE’s increased focus in recent years on responsible digital governance and is consistent with global trends toward stronger regulation of online child safety.
Key takeaways
Like other online safety regulations, such as the EU Digital Services Act and the UK and Australian Online Safety Acts, the CDS Law has extra-territorial reach. It applies to online platforms and other internet service providers that operate in, or target users in, the UAE. It signals a clear regulatory expectation that the safety of children (persons under 18 years of age) will be addressed through governance, product design, age assurance, content controls and privacy-by-default. This represents a significant development in the UAE’s digital regulation and will have far-reaching operational consequences for digital platform operators.
While detailed requirements and penalties are expected to be set out in forthcoming implementing regulations, the core obligations are already in place and should be treated as a near-term compliance priority.
Scope
The CDS Law is drafted broadly and applies to digital platforms operating in the UAE, or directed at users in the UAE. It provides a non-exhaustive list of in-scope digital platforms, including websites facilitating user interaction, online search engines, smart applications and messaging applications/forums, gaming platforms, social media, live-streaming and audio content platforms (podcasts), online streaming and on-demand visual content services, online marketplaces and other e-commerce platforms. This breadth matters for multinational groups, as a platform that is not established in the UAE may still be exposed if it targets the UAE market.
A digital platform classification system will be issued via Cabinet Resolution, which will classify digital platforms based on a risk assessment (taking into account factors such as content, type, reach, and impact). This classification system will be pivotal, as it will ultimately determine the extent of obligations applicable to the different risk clusters of digital platforms. For organisations, this points to a future “tiered” model of compliance: higher-risk services should expect more stringent requirements (including stronger age assurance and more robust content controls).
The CDS Law also places obligations on internet service providers (i.e. Telecommunications and Digital Government Regulatory Authority licensees) and introduces duties for child caregivers (parents/guardians).
Core obligations for digital platforms
Even ahead of implementing regulations, the CDS Law indicates the types of measures regulators expect to see as baseline controls. Important obligations are discussed below.
Children’s data and privacy
Digital platforms are now prohibited from collecting, processing, publishing, or sharing the personal data of children under the age of 13 unless specific and verifiable conditions are met (including explicit, documented parental consent, an easy withdrawal mechanism, and clear privacy policy disclosures).
The CDS Law also restricts the use of such data for commercial purposes, including by prohibiting targeted electronic advertisements to children under 13 and tracking beyond the originally authorised purpose.
Certain education and health platforms may be exempted (subject to safeguards) by forthcoming Cabinet Resolutions.
Businesses should begin mapping where child data is collected, how it is used, and whether advertising technology or profiling practices could indirectly involve minors, then build controls that support age-appropriate design. This must of course be done in alignment with the UAE’s federal Personal Data Protection Law (notwithstanding its pending Executive Regulations).
No “online commercial gaming” for children
The CDS Law prohibits digital platforms from permitting children to participate in, create accounts for, or access online commercial gaming (directly or indirectly). Online commercial gaming is defined as digital games offered with the primary purpose of generating (directly or indirectly) revenue for the operator, including gambling and activities involving the placing of bets and wagers for monetary consideration or anything of value.
Age verification
Digital platforms must adopt “effective and reasonable” age verification mechanisms, standards and procedures. Such measures must take into account the platform’s risk classification and the potential impact of its content on children.
Enhanced child protection controls
The CDS Law’s concept of enhanced child protection measures includes the following obligations on digital platforms (according to their risk classification):
- applying the highest level of privacy-by-default settings to children’s accounts;
- providing age-based controls and restrictions on platform use (including age verification mechanisms);
- activating blocking and filtering tools, content age-classification tools, and disabling features tied to excessive interaction/participation by children (by age group);
- providing guidance tools and software for parental controls, including daily time limits and mandatory rest/disconnection periods from devices;
- enhancing awareness measures about risks from excessive/uncontrolled platform use;
- providing user-friendly tools for reporting harmful content or behaviours; and
- proactively detecting, removing and reporting harmful content or behaviours using technical capabilities (including artificial intelligence systems and machine learning algorithms).
Periodic disclosure
Digital platforms must also periodically disclose their user and content management policies. Detailed disclosure requirements are still being developed and will be set by the forthcoming risk classification system. In addition, digital platforms must comply with implementing orders issued by UAE authorities, and provide them with statistics and periodic reports regarding the measures taken to comply with the CDS Law.
Governance and enforcement
A Child Digital Safety Council (chaired by the Ministry of Family) will also be established to co-ordinate policies, strategies and directions in relation to child online safety. Compliance with the CDS Law will be monitored by UAE authorities, including those responsible for child affairs, media, telecommunications and cybersecurity.
A separate Cabinet instrument is also expected to set out the administrative penalty regime (in addition to the risk of closure or blocking for non-compliance, which is already outlined in the CDS Law).
Practical next steps
The CDS Law goes beyond prior UAE regulations by introducing a dedicated, child-focused framework. Many of its obligations, such as age verification, enhanced child protection controls, and stricter limits on the collection and use of children’s data, have not previously applied in the UAE. Digital platforms will therefore need to familiarise themselves with these requirements and operationalise them as part of their compliance programmes.
The CDS Law is just one new framework among many online safety and online child protection regulatory developments globally. In practical terms, we recommend:
- confirming in-scope services and UAE user exposure;
- mapping the CDS obligations against the compliance you already have in place for your services under the EU DSA, UK OSA, AU OSA and similar regulatory frameworks;
- running a gap assessment against the CDS Law’s expected controls (age assurance, implementing privacy-by-default settings, parental tools, reporting and moderation); and
- aligning legal and operational functions so that controls are implemented, documented and explained to users and regulators in the Terms of Service, a dedicated FAQ, or on a child safety centre page for your services. This can include: (1) updating Terms of Service and privacy notices and consents; (2) implementing age-verification measures; (3) deploying high privacy-by-default settings for child accounts; (4) deploying parental controls; (5) implementing proactive content monitoring and reporting tools; (6) reviewing marketing practices to ensure compliance; and (7) preparing for platform classification and ongoing disclosure and transparency requirements.
We also strongly recommend actively tracking publication of the implementing regulations (classification criteria, technical standards, and penalties) and regularly benchmarking against developing industry standards in the UAE, so that design decisions taken now remain compatible with the final regime.