For Digital Data Protection, India Must Embrace Consultation

The rules under the Digital Personal Data Protection Act must be pragmatic and technology-driven, and policymakers must foster collaborative discussion among stakeholders, especially on the question of the most vulnerable demographic: India's children

The Digital Personal Data Protection Act, 2023 (DPDP Act), together with the draft rules released by the Ministry of Electronics and Information Technology, Government of India, on January 3, 2025, marks a watershed moment in protecting individual privacy in the digital age.

The rules outline a clear compliance framework that cuts across sectors. An important feature is the requirement that ‘verifiable parental consent’ (VPC) be obtained before processing children’s data, a provision that reflects the government’s growing focus on protecting the interests of minors in a rapidly digitising world.

The requirement applies universally, signalling both a commitment to best practices and an intention to build a safer cyberspace for impressionable minds.

The mandate has also fuelled a larger debate about the consequences of mandatory VPC for minors. The aim of child safety is beyond reproach, but the provision's practicality, its legal implications and its effect on other stakeholders deserve scrutiny.

This article therefore examines the legal context, operational aspects and expected effects of the requirement, to paint a broad picture that accounts for child safety without compromising effectiveness or inclusiveness.

Ambitions And Shortcomings

Section 9 of the DPDP Act, operationalised by the draft rules, requires data fiduciaries to obtain explicit consent from a parent or authorised guardian before processing any child's personal data. The provision aims to establish accountability and safeguard children from potentially harmful practices such as behavioural tracking, targeted advertising and other invasive data collection methods.

However, one major concern is the lack of robust means for verifying a guardian's identity. As drafted, the rules rely mainly on self-declarations, which are susceptible to manipulation.

Without means to confirm the truth of such claims, the provision may become symbolic rather than substantive. Furthermore, exclusions for fiduciaries that adhere to mandated safe processing standards create ambiguity, potentially leading to inconsistent implementation and enforcement.

It is also important to consider the legal questions around the scope of these exemptions. Without precise definitions, the concept of "safe processing standards" may be interpreted subjectively, compromising consistency in enforcement.

Furthermore, the rules do not address how they will interact with sectoral regulations such as the Juvenile Justice Act, or with existing legislation such as the Information Technology Act, 2000, creating overlaps and potential conflicts that make compliance more difficult.

Operational Hurdles

The requirement for VPC poses significant operational challenges, particularly for startups and MSMEs. Implementing VPC requires collecting additional information from parents or guardians, increasing compliance burdens and inadvertently expanding the scope of data processing, in tension with the Act's own principle of data minimisation.

Age verification further complicates matters. The reliance on self-declared ages leaves systems vulnerable to minors providing false information to bypass restrictions. Industries such as gaming and social media, where anonymity is common, are particularly susceptible to such loopholes.

Chitra Iyer, co-founder of Space2Grow, notes that this lack of reliable verification mechanisms risks exacerbating the digital divide, particularly for underprivileged children who may lack access to requisite resources or documentation.

For startups and MSMEs, compliance costs are daunting. Implementing verification technologies, hiring legal teams, and conducting regular audits represent significant expenditures.

While large corporations may absorb these costs, smaller entities could face insurmountable hurdles, potentially stifling innovation and deterring new entrants.

Logistics apart, the VPC provision raises more fundamental questions about privacy and the extent of government intervention in citizens' activities. Requiring online activity to be linked to government-issued IDs could enable broad surveillance, which sits uneasily with the principles of data minimisation and limited retention.

Further, the rule hits industries where children are key drivers of demand, such as education services (both offline and online) and entertainment enterprises, which depend on reaching these young users to thrive.

The ban on targeted advertising for children forces companies to reorient their marketing strategies, creating immediate compliance issues. Even though stricter privacy norms will ultimately increase consumer trust, the short-term dislocations cannot be ignored.

Moreover, privacy advocates argue that linking children’s data with parental information creates new vulnerabilities. Should these databases be compromised, the resulting exposure of sensitive familial data could have far-reaching consequences, eroding trust in digital platforms. This risk underscores the need for stringent cybersecurity protocols to complement regulatory frameworks.

Charting A Balanced Path

A proper balance needs to be struck: one that safeguards children's privacy without overburdening businesses or restricting freedoms. The rules must specify concrete methods for verifying a guardian's identity.

Documentary verification or a digital mechanism such as parental authorisation tokens could work best, and the cooperation of industry itself can help keep these mechanisms cost-effective.

The framework should provide for scaled compliance obligations to recognise the resource constraints of smaller businesses. Simplified consent processes and exemptions for small-scale data processing can reduce operational burdens without compromising privacy protections. 

The draft rules should not rest on self-declarations; they should define accepted methods of age verification. Existing digital infrastructure, such as Aadhaar-based authentication, can be leveraged for this purpose, provided adequate safeguards against misuse are in place.

Stakeholder engagement, drawing in industry leaders, civil society and legal experts, is necessary for refining the draft rules. A transparent and inclusive consultation process can help address the practical challenges while ensuring that the final framework is fair and equitable.

Public awareness programmes can educate parents and guardians about their rights and responsibilities under the DPDP Act. Such campaigns encourage better compliance with the law while reducing ambiguity around the consent process.

A Need for Pragmatism

The draft DPDP rules represent a laudable effort to bolster data protection in India, particularly for children, who are a vulnerable demographic. However, the VPC requirement underscores the tension between privacy safeguards and practical feasibility. Without credible mechanisms to verify guardianship, the provision risks ineffectiveness and potential exploitation.

Policymakers must address these gaps by embracing pragmatic, technology-driven solutions and fostering collaborative discussions among stakeholders. By doing so, India can craft a regulatory framework that not only upholds privacy, but also promotes innovation and economic growth.

As the February 18, 2025 deadline for public feedback approaches, stakeholders must actively contribute to shaping a digital landscape that balances privacy, practicality and progress.

(Yashawardhana is a research fellow at the India Foundation. He has a BA LLB (Hons.) from Jindal Global Law School)
