Section 9(2): The Well-Being Standard
Section 9(2) of the DPDP Act introduces a broader standard that sits above specific prohibitions: no Data Fiduciary shall undertake processing that is “likely to cause any detrimental effect on the well-being of a child.” This is an intentionally open-ended provision.
It does not limit itself to tracking or advertising. It asks a substantive question about outcomes: could the way you use a child’s personal data cause them harm? This requires a genuine product ethics evaluation, not just a legal checklist review.
Example: A gaming platform uses variable reward schedules (algorithmically optimised loot boxes) designed to maximise in-app purchase conversion. Even if the platform does not “track” the child in the technical sense of monitoring cross-platform behaviour, the underlying system uses the child’s session and engagement data to optimise for spending behaviour in ways that could cause psychological harm. Under Section 9(2)’s well-being standard, this processing would be questionable regardless of parental consent. For gaming companies, short-video apps, and social platforms, this provision demands a genuine board-level conversation about product design — not just a compliance review.
Section 9(3): The Triple Lock of Absolute Prohibitions
Section 9(3) of the DPDP Act imposes three unconditional prohibitions on processing children’s personal data. These are not defaults that can be toggled off with parental consent. They are absolute bans:
- No tracking: You cannot track a child’s location, device usage, browsing patterns, or activity across your own platform or third-party services, regardless of parental approval.
- No behavioral monitoring: You cannot build profiles based on a child’s behavior, preferences, engagement patterns, or usage history. Personalized recommendation engines, engagement-optimization algorithms, and behavioral analytics for users identified as children are all prohibited under this provision.
- No targeted advertising: You cannot serve personalised ads to children based on their personal data, inferred interests, or behaviour. Advertising to child users, if any, must be purely contextual — based on the content being viewed, not the user viewing it.
These prohibitions reflect a deliberate legislative philosophy: that certain processing activities are inherently harmful to children, regardless of whether a responsible adult has authorized them. A parent cannot consent to their child being profiled or targeted, for the same reason a parent cannot consent to their minor child entering a binding commercial contract. The prohibition is on the harm, not merely on the lack of consent.
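The structure of these prohibitions can be made concrete with a minimal policy-gate sketch. Everything here is illustrative: the `ProcessingKind` names and the `is_child` flag are assumptions for this example, not terms defined in the Act. The key design point the sketch encodes is that parental consent is deliberately not a parameter of the check.

```python
# Hypothetical sketch of a Section 9(3) policy gate.
# The ProcessingKind names and is_child flag are illustrative
# assumptions, not terminology from the DPDP Act itself.
from enum import Enum, auto

class ProcessingKind(Enum):
    TRACKING = auto()                # location, device, cross-service activity
    BEHAVIORAL_MONITORING = auto()   # profiling, engagement analytics
    TARGETED_ADVERTISING = auto()    # ads based on personal data or inferences
    CONTEXTUAL_ADVERTISING = auto()  # ads based only on the content being viewed

# Section 9(3)'s bans are unconditional for child users, so there is
# no consent argument here: consent cannot toggle these off.
PROHIBITED_FOR_CHILDREN = {
    ProcessingKind.TRACKING,
    ProcessingKind.BEHAVIORAL_MONITORING,
    ProcessingKind.TARGETED_ADVERTISING,
}

def is_processing_allowed(kind: ProcessingKind, is_child: bool) -> bool:
    """Return False for any absolutely prohibited processing of a child's data."""
    if is_child and kind in PROHIBITED_FOR_CHILDREN:
        return False
    return True
```

In a real system the `is_child` determination would itself be a significant compliance problem (age assurance), but the gate above captures the legislative shape: contextual advertising remains available for child users while the three prohibited categories are hard-blocked.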
For platforms that monetize through behavioral advertising (which describes most ad-funded consumer apps), this means a fundamentally separate product experience for child users is not a nice-to-have. It is a legal requirement.
Exemptions: Where Rule 10 Provides Relief
Rule 10 of the DPDP Rules 2025 establishes a framework under which certain categories of Data Fiduciary may be exempt from the Section 9(1) verifiable parental consent requirement and the Section 9(2) well-being prohibition, but only for specific, defined purposes. Further sectoral exemptions may be issued through government notifications under the Rules.
Healthcare providers are exempt from the verifiable parental consent requirement when processing a child’s personal data for medical treatment purposes. A hospital treating a child in an emergency cannot wait for a parent to complete a DigiLocker verification flow before accessing health records; the exemption exists because the alternative would cause harm.
Educational institutions processing student data for educational purposes have similar operational relief. A school using a learning management system for curriculum delivery, attendance tracking, and assessment can process student data for those academic purposes without the full parental consent mechanism, though all other DPDP obligations (data security, purpose limitation, data minimization, retention policies) continue to apply in full.
These exemptions are narrow and purpose-specific. A hospital’s exemption covers treatment, not its marketing or patient loyalty database. A school’s exemption covers academic use, not third-party ed-tech vendor integrations serving personalised content.
The exemption follows the purpose, not the organisation. Any processing that falls outside the specific exempted purpose requires full Section 9 compliance.
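The purpose-following logic of Rule 10 can be sketched in a few lines. The category and purpose strings below are illustrative assumptions, not the wording of the Rules; the point is that the lookup is keyed on the purpose of processing, so the same organisation gets relief for one purpose and full Section 9 obligations for another.

```python
# Hypothetical sketch of Rule 10's purpose-scoped exemptions.
# The category and purpose strings are illustrative assumptions,
# not the DPDP Rules' own wording.
EXEMPT_PURPOSES = {
    "healthcare_provider": {"medical_treatment"},
    "educational_institution": {"curriculum_delivery", "attendance", "assessment"},
}

def parental_consent_required(fiduciary_category: str, purpose: str) -> bool:
    """The exemption follows the purpose, not the organisation:
    any purpose outside the exempted set needs full Section 9 compliance."""
    return purpose not in EXEMPT_PURPOSES.get(fiduciary_category, set())
```

Under this sketch, a hospital processing data for treatment is exempt, while the same hospital running a marketing database is not — mirroring the hospital and school examples above.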