7th - Last date to pay TDS. Talk to an expert on +91 8939 121 121
11th - Last date to file GSTR-1. Talk to an expert on +91 8939 121 121
20th - Last date to file GSTR-3B & Professional Tax. Talk to an expert on +91 8939 121 121
DPDP Compliance Is Now Live. Is Your Business Ready? Read More

Children's Data Under DPDP: Parental Consent, Age Verification, and the Rules Every Business Must Follow

A 14-year-old in Bengaluru downloads a gaming app. Within minutes, the app has collected their name, age, device identifier, real-time location, and the contact list from their phone. The platform uses this data to serve personalized ads, track in-app behaviour to optimise engagement loops, and share data with third-party ad networks. Until recently, none of this was technically illegal in India. 

That has changed. Section 9 of the DPDP Act 2023 and Rule 10 of the DPDP Rules 2025 impose the most operationally demanding compliance regime in the entire Act. If you build products used by children — or products where children might realistically end up as users — this is not a section you can delegate to your legal team and revisit in six months. 

This is the fifth article in the DPDP compliance series at The Startup Zone. Earlier articles covered the DPDP Act fundamentals, consent management under the DPDP Act, significant data fiduciary obligations, and data breach reporting. This one covers the framework that catches ed-tech founders, gaming studios, social platforms, and consumer app builders most off-guard. 

Who Is a “Child” Under the DPDP Act?

The Act draws a single bright line: anyone under 18 is a child. No exceptions, no graduated age bands, no platform-specific carve-outs. 

This is meaningfully more protective than most comparable global frameworks: 

  • India (DPDP Act 2023, Section 9): under 18 
  • European Union (GDPR Article 8): under 16, with member states able to lower to 13 
  • United States (COPPA): under 13 
  • United Kingdom (UK GDPR / Age Appropriate Design Code): under 18 for design standards 

The Indian threshold sweeps far wider than COPPA and most EU implementations. A 17-year-old on a social media platform, a 15-year-old shopping on an e-commerce site, a 10-year-old playing an online game: all fall under Section 9’s full protective regime. 

For most consumer-facing digital products in India, the realistic answer to “could a child be a user?” is yes, which means these rules apply to you. 

Section 9(1): The Verifiable Parental Consent Requirement 

Section 9(1) of the DPDP Act is unambiguous: before processing any personal data of a child, a Data Fiduciary must obtain verifiable consent from the child’s parent or lawful guardian. The word “verifiable” carries the full legal and technical weight of this provision. 

A child clicking “I am over 18” does not count. A child typing their parent’s email address does not count. The law requires you to independently confirm that the person giving consent is actually the parent or guardian and that they are identifiable as an adult. 

The Two Verification Pathways Under Rule 10(1) 

Rule 10(1) of the DPDP Rules 2025 requires a Data Fiduciary to adopt appropriate technical and organisational measures to obtain verifiable parental consent, and to exercise due diligence to confirm that the individual identifying themselves as the parent is an identifiable adult. The rule designates Aadhaar-linked DigiLocker tokens as the authoritative credential for independent identity verification. 

Pathway 1: Existing verified parent account 

If the parent already uses your platform and their identity and age have been verified during their own registration, you can rely on that existing verification. The parent can authorise the child’s account from their own verified account. This is the lower-friction option and the preferred path for platforms with a dual adult-child user base. 

Pathway 2: Independent DigiLocker verification 

If the parent is not on your platform, their identity and age must be independently verified via Aadhaar-linked DigiLocker. The parent authenticates through DigiLocker, which issues a verified token confirming their identity as an adult. Your platform captures consent against that token, creating a verifiable, timestamped, auditable record of approval. 

In practice, building this requires three layers: an age-gating mechanism at registration to identify minor users, a parent verification flow that confirms identity without destroying the onboarding conversion rate, and a consent capture system that produces a durable audit record. For an early-stage startup, this is not a two-day engineering sprint. 

Example: An ed-tech platform onboards a 12-year-old student. The student enters their details and date of birth. The platform detects the user is under 18 and pauses registration. A verification request is sent to the parent’s registered mobile number. The parent authenticates via DigiLocker, which confirms they are an adult and issues a verification token to the platform. The parent reviews a plain-language notice of what data will be collected, for what purpose, and under what terms, and provides explicit consent. Only after this sequence completes does the student’s account become active, and the entire consent flow is logged with timestamps for any future audit.

Section 9(2): The Well-Being Standard 

Section 9(2) of the DPDP Act introduces a broader standard that sits above specific prohibitions: no Data Fiduciary shall undertake processing that is “likely to cause any detrimental effect on the well-being of a child.” This is an intentionally open-ended provision. 

It does not limit itself to tracking or advertising. It asks a substantive question about outcomes: could the way you use a child’s personal data cause them harm? This requires a genuine product ethics evaluation, not just a legal checklist review. 

Example: A gaming platform uses variable reward schedules and algorithmically optimised loot boxes designed to maximise in-app purchase conversion. Even if the platform does not “track” the child in the technical sense of monitoring cross-platform behaviour, the underlying system uses the child’s session and engagement data to optimise for spending behaviour in ways that could cause psychological harm. Under Section 9(2)’s well-being standard, this processing would be questionable regardless of parental consent. For gaming companies, short-video apps, and social platforms, this provision demands a genuine board-level conversation about product design, not just a compliance review. 

Section 9(3): The Triple Lock of Absolute Prohibitions 

Section 9(3) of the DPDP Act imposes three unconditional prohibitions on processing children’s personal data. These are not defaults that can be toggled off with parental consent. They are absolute bans: 

  1. No tracking
    You cannot track a child’s location, device usage, browsing patterns, or activity across your own platform or third-party services, regardless of parental approval. 
  2. No behavioral monitoring 
    You cannot build profiles based on a child’s behavior, preferences, engagement patterns, or usage history. Personalized recommendation engines, engagement-optimization algorithms, and behavioral analytics for users identified as children are all prohibited under this provision. 
  3. No targeted advertising
    You cannot serve personalised ads to children based on their personal data, inferred interests, or behaviour. Advertising to child users, if any, must be purely contextual: based on the content being viewed, not the user viewing it. 

These prohibitions reflect a deliberate legislative philosophy: that certain processing activities are inherently harmful to children, regardless of whether a responsible adult has authorized them. A parent cannot consent to their child being profiled or targeted, for the same reason a parent cannot consent to their minor child entering a binding commercial contract. The prohibition is on the harm, not merely on the lack of consent. 

For platforms that monetize through behavioral advertising, which describes most ad-funded consumer apps, this means a fundamentally separate product experience for child users is not a nice-to-have. It is a legal requirement. 
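One way the triple lock might be enforced in practice is as a hard gate that denies the three prohibited processing types for any account flagged as a child. This is a sketch only; the `User` model and feature names are illustrative assumptions, not any platform's real data model:

```python
from dataclasses import dataclass

# Section 9(3)'s three absolute prohibitions, expressed as a deny-set.
# Note there is deliberately no consent check here: parental consent
# cannot re-enable these activities, so the ban is unconditional.
PROHIBITED_FOR_CHILDREN = {"tracking", "behavioral_monitoring", "targeted_ads"}

@dataclass
class User:
    user_id: str
    is_child: bool  # set by the age-gating / parental-verification flow

def feature_allowed(user: User, feature: str) -> bool:
    """Deny Section 9(3) processing for child accounts; allow otherwise."""
    if user.is_child and feature in PROHIBITED_FOR_CHILDREN:
        return False
    return True
```

Putting the check in one central gate, rather than scattering age checks through each subsystem, makes it auditable: you can test the gate directly and log every denial.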

Exemptions: Where Rule 10 Provides Relief 

Rule 10 of the DPDP Rules 2025 establishes a framework under which certain categories of Data Fiduciary may be exempt from the Section 9(1) verifiable parental consent requirement and the Section 9(2) well-being prohibition, but only for specific, defined purposes. Further sectoral exemptions may be issued through government notifications under the Rules. 

Healthcare providers are exempt from the verifiable parental consent requirement when processing a child’s personal data for medical treatment purposes. A hospital treating a child in an emergency cannot wait for a parent to complete a DigiLocker verification flow before accessing health records; the exemption exists because the alternative would cause harm. 

Educational institutions processing student data for educational purposes have similar operational relief. A school using a learning management system for curriculum delivery, attendance tracking, and assessment can process student data for those academic purposes without the full parental consent mechanism, though all other DPDP obligations (data security, purpose limitation, data minimization, retention policies) continue to apply in full. 

These exemptions are narrow and purpose-specific. A hospital’s exemption covers treatment, not its marketing or patient loyalty database. A school’s exemption covers academic use, not third-party ed-tech vendor integrations serving personalised content. 

The exemption follows the purpose, not the organisation. Any processing that falls outside the specific exempted purpose requires full Section 9 compliance.
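The purpose-follows-exemption principle can be expressed as a simple lookup. The fiduciary types and purpose labels below are illustrative assumptions for this sketch, not terms defined in the Rules:

```python
# Map each exempt fiduciary category to the specific purposes covered.
# Anything outside these sets falls back to full Section 9 compliance.
EXEMPT_PURPOSES = {
    "healthcare_provider": {"medical_treatment"},
    "educational_institution": {"curriculum_delivery", "attendance", "assessment"},
}

def parental_consent_required(fiduciary_type: str, purpose: str) -> bool:
    """The exemption follows the purpose, not the organisation: consent is
    waived only when this exact purpose is exempt for this fiduciary type."""
    return purpose not in EXEMPT_PURPOSES.get(fiduciary_type, set())
```

The structure makes the legal point visible in code: a hospital's marketing purpose and a school's vendor integration both fail the lookup, so both require the full verifiable-consent flow.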

The Penalty  

Violations of Section 9 attract penalties of up to ₹200 crore under the DPDP Act’s Schedule. The global enforcement record makes clear that this is an area regulators treat as a priority and that penalties are not merely theoretical. 

Under the GDPR, children’s data has been one of the most aggressively enforced areas of data protection law globally: 

  • Instagram (Meta) was fined €405 million (approximately ₹3,686 crore) by Ireland’s Data Protection Commission in 2022 for mishandling children’s data, specifically for default public-profile settings applied to minor users without adequate safeguards. 

  • TikTok was fined £12.7 million (approximately ₹137 crore) by the UK’s Information Commissioner’s Office in 2023 for allowing an estimated 1.4 million children under 13 to use the platform without appropriate parental consent mechanisms. 

Both fines resulted from violations that are structurally similar to what Section 9 of the DPDP Act now prohibits in India: inadequate age verification, insufficient parental consent mechanisms, and behavioral data collection from minor users. India’s Data Protection Board has indicated children’s data will be an enforcement priority, and these precedents will inform how the board approaches investigation and penalty determination. 

What This Means for Ed-Tech Specifically 

India’s ed-tech sector sits at a unique compliance intersection. A school-contracted LMS may benefit from Rule 10’s educational exemption for core academic data, but the same platform’s engagement tracking, parental communication features, and third-party integrations very likely fall outside that exemption’s scope. The line between “educational processing” (potentially exempt) and “behavioral profiling for product improvement” (not exempt and prohibited under Section 9(3)) must be drawn explicitly in every ed-tech company’s data processing documentation. 

For ed-tech platforms that have been designated or are likely to be designated as Significant Data Fiduciaries, children’s data adds a further compliance layer: DPIA requirements under Rule 13, algorithmic due diligence obligations, and annual independent audit scrutiny will all apply to the child-user processing stack. See our SDF compliance guide for the full picture. 

Practical Compliance Steps 

Step 1: Audit your user base 
Assess whether children could realistically be among your users. For consumer apps in gaming, social media, ed-tech, OTT entertainment, and e-commerce, the answer is almost certainly yes. Document this assessment. 

Step 2: Implement age gating 
Build a mechanism to identify when a user is under 18 at the point of registration or first data collection. Self-declaration is not sufficient on its own; it must be paired with a downstream verification flow for users identifying as minors. 

Step 3: Build parental verification flows 
Design and develop a system using either the existing account pathway or the DigiLocker pathway under Rule 10(1). Capture consent against a plain-language notice and generate a timestamped, auditable record. This is an engineering and UX project, not just a legal one. 

Step 4: Redesign your processing stack for child users 
Disable tracking, behavioral monitoring, and targeted advertising for all accounts identified as belonging to children under Section 9(3). Verify that recommendation engines, engagement algorithms, notification systems, and behavioral analytics do not operate for identified child-user segments. 

Step 5: Evaluate against the Section 9(2) well-being standard 
Review all product features (engagement mechanics, gamification, notification cadence, monetization flows, and variable rewards) against a single question: could this use of a child’s personal data cause a detrimental effect on their well-being? 

Step 6: Document everything 
Maintain records of age verification outcomes, parental consent (including the exact notice shown and the consent response), and ongoing evidence of compliance with the Section 9(3) triple-lock prohibitions. This documentation is your primary defence in an enforcement inquiry. 
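One way to make those records durable and tamper-evident is a hash-chained, append-only log. This is a sketch under the assumption that such a log satisfies your audit needs; the DPDP Rules do not mandate any particular record format, and the field names here are illustrative:

```python
import hashlib
import json
from datetime import datetime, timezone

def consent_audit_entry(child_id: str, parent_token: str, notice_text: str,
                        consent_given: bool, prev_hash: str = "") -> dict:
    """Build a timestamped consent record whose hash chains to the previous
    entry, so any later edit to the history is detectable."""
    entry = {
        "child_id": child_id,
        "parent_token": parent_token,
        # Hash of the exact notice shown, so the notice version is provable
        "notice_sha256": hashlib.sha256(notice_text.encode()).hexdigest(),
        "consent_given": consent_given,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry
```

Storing the hash of the exact notice text, rather than a free-text description, is what lets you later prove which disclosure the parent actually saw when consenting.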

Children’s Data Compliance Checklist 

Assessment 

  • User base audit completed – realistic assessment of children’s use documented 
  • All processing activities mapped separately for child-user data 
  • Section 9(2) well-being evaluation conducted for all product features

Technical Infrastructure 

  • Age gating mechanism implemented at registration 
  • Parental verification flow built: Pathway 1 (existing verified account) or Pathway 2 (DigiLocker under Rule 10(1)) 
  • Consent capture system generates timestamped, auditable records 
  • Section 9(3) tracking disabled for child accounts 
  • Section 9(3) behavioural monitoring disabled for child accounts 
  • Section 9(3) targeted advertising disabled for child accounts 
  • Recommendation and engagement algorithms confirmed non-operational for child accounts

Legal and Documentation 

  • Plain-language parental consent notice drafted and legally reviewed 
  • Exemption scope assessed, with healthcare and educational purposes clearly bounded in documentation 
  • Third-party vendor data processing agreements reviewed for child-data clauses 
  • Parental consent records retention policy defined  

Ongoing Governance 

  • Privacy-by-design review process implemented for all new features involving child users 
  • Regular audit of child-account processing against Section 9(3) prohibitions 
  • Board-level awareness of Section 9(2) well-being obligation documented

Closing Note: The Work Ahead 

There is, to put it plainly, a great deal still to figure out. 

The Data Protection Board is newly operational, SDF designations have not yet been issued, sectoral exemptions are still being determined through government notifications, and the precise contours of obligations like “verifiable parental consent” and “detrimental effect on well-being” will only be sharpened through enforcement decisions, adjudications, and eventual judicial interpretation. Indian businesses, especially startups, are being asked to build compliance infrastructure around a framework that is still resolving its own operational details in real time. 

That is not an excuse for inaction. If anything, it is the argument for acting now, while the window is open: mapping your data flows, auditing your user base, restructuring your consent mechanisms, and building the governance foundations. 

For guidance on building child-safe data practices into your product architecture from the ground up, The Startup Zone works with startups to design DPDP-compliant systems that do not sacrifice product quality for legal safety. 

Next in series: “DPDP Penalties and Enforcement: How the Data Protection Board Works and What Happens When You Fail to Comply” 

Connect With Our Experts
