The Missing Element Of DPDP Rules: Privacy

“A data protection law should not just govern whether consent is given; it should also regulate how it is framed”

Data is power. But in the digital world, the people generating it have the least of it.

The Digital Personal Data Protection (DPDP) Act 2023 and its 2025 draft rules were supposed to give Indians greater control over their personal data. In theory, they do. They lay down rules on consent, data security, and the right to withdraw consent. But for a law that claims to empower users, it does surprisingly little to stop companies from using dark patterns to nudge people into giving up their data in the first place.

The Act assumes that if a user consents to data collection, the process is fair. But anyone who has used an Indian e-commerce app, fintech service, or social media platform knows that consent online is rarely given freely; it is extracted, nudged, or even tricked out of users.

The DPDP Act, in its current form, does not adequately address these manipulative practices. It regulates how data is handled, but not how, or how much of it, is taken.

(Illustration from ASIA’s 2025 policy note on how DPDP rules allow manipulative consent design. Image by ASIA, Policy Note: Analysing DPDP rules 2025)

The biggest flaw in the DPDP Act is its transactional approach to privacy. It treats consent as a one-time legal agreement, a checkbox that, once ticked, grants companies near-total freedom to collect and use personal data.

But privacy is not just about whether a company collects data, it is about how, why, and under what conditions. Users should not have to fight to protect their privacy, but the regulation may place the burden on them.

A popular shopping app shows a cookie banner. The “Accept all” button is large and highlighted. To reject, you have to click “Settings”, then uncheck multiple boxes one by one. It’s easier to say yes than to protect your data.

The law treats this as valid consent, but it’s anything but fair. Platforms are free to design tricky, deceptive interfaces that push users toward sharing data while making refusal difficult. 

A key provision of the DPDP Act is that withdrawing consent should be as easy as giving it. The intent is clear: users should be able to take back control of their data without unnecessary hurdles.

But does this happen in practice? Not really. Many platforms comply with this rule on paper but use design tactics that frustrate and delay users who want to opt out. The difficulty is intentional. Companies design exit processes to be inconvenient, knowing that most users won’t follow through.


(Meta’s 2023 “consent” wall forces users to accept data tracking or pay, an example of how dark patterns turn consent into a choice users can’t really refuse. Source: @johnnyryan on X, November 2023)

In a 2024 report by the Advanced Study Institute of Asia, we found that nearly all of the top 25 apps employed at least one form of deceptive design. These patterns are the norm, found across India’s digital landscape on platforms like Zepto, Swiggy, Ola, Netflix and more.

These tactics frequently operate in subtle ways. For instance, when users sign up for apps, they are often automatically enrolled in data sharing agreements, marketing messages, and tracking mechanisms. These settings are rarely explained clearly, and opting out requires effort.

Another widespread tactic is basket sneaking, where extra charges, like delivery insurance, platform fees, or tips, are automatically added at checkout. If a user isn’t paying attention, they end up paying for services they never actively selected. These aren’t just local issues. Even in Europe, with its stricter privacy rules, global platforms have continued to use dark patterns in their consent flows: Amazon was fined €35 million, Facebook €60 million, Google €150 million and TikTok €5 million.

The DPDP Act makes no explicit mention of these deceptive practices, leaving a massive loophole for companies to continue extracting data through design rather than genuine user agreement. A data protection law should not just govern whether consent is given; it should also regulate how it is framed.

Right now, the Act allows companies to shape user decisions while technically complying with consent rules. This is where the line between persuasion and manipulation blurs. Platforms are not merely guiding users, they are overwhelming them with urgency, visual noise, and engineered defaults.

According to the same report, even colour choices and interface layout are optimised to drive reactive behaviour. Consent, in this ecosystem, is not a meaningful choice; it’s a designed outcome. This weakens the entire framework.

Platforms relying on AI-based personalisation also face operational hurdles. Once user data is embedded in learning models, deleting it isn’t simple. Small businesses may not have the infrastructure to manage deletion, tracking, or notices, raising serious compliance and cost issues.

Governance becomes patchy, uneven, and hard to enforce. Some argue self-regulation is enough. But self-regulation only works when the rules are clear, incentives are aligned, and bad actors face real consequences. Right now, none of that exists. Good actors are left without guidance, and the rest continue to collect data using dark patterns.

(Farheen Yousuf is a Policy & Trust Analyst and Shivani Singh is a Programme Coordinator for Law & Critical Emerging Technologies at the Advanced Study Institute of Asia-ASIA. Views are personal)

