Explainer

India’s new Digital Personal Data Protection Rules, 2025: A Detailed Reading

The final DPDP Rules are detailed, comprising 23 rules and seven schedules, and cover a wide range of topics, including consent notices, data breach protocols, and the powers of the new Data Protection Board. Here’s our breakdown.

ON NOVEMBER 13, 2025, the Government of India notified the Digital Personal Data Protection Rules, 2025 (‘DPDP Rules’) under the Digital Personal Data Protection Act, 2023 (‘the Act’). This long-awaited step comes two years after the Act’s passage and marks a landmark moment in India’s digital privacy regime. The Rules translate the Act’s broad mandates into concrete procedures for businesses, government agencies, and data principals (individuals). They arrive at a crucial moment - as India’s digital economy and services increasingly run on personal data - and under the shadow of landmark privacy jurisprudence. The Supreme Court’s K.S. Puttaswamy (Retd.) vs Union of India (2017) judgment recognised privacy and informational self-determination as fundamental rights. In that spirit, the DPDP framework seeks a balance between individual autonomy and legitimate public and commercial uses of data. The Rules will determine how effectively that balance is struck in practice.

The Rules build on a draft published on January 3, 2025 (which drew extensive comments) and reflect some public input: the Gazette makes clear that “objections and suggestions” on the draft were “considered” before finalisation.

Framework and Commencement

The Rules come into force in phases. By design, Rules 1, 2 and 17-21 take effect immediately upon notification; Rule 4 (consent manager registration) kicks in after one year; and the remaining provisions (notably Rules 3, 5-16, and 22-23) apply after an 18-month transition. In other words, basic structures (like the Board’s constitution, definitions, and a mandate for digital processes) start now, while most fiduciary obligations are deferred to mid-2027. This staggered timeline gives organisations time to adapt, but also means that legal safeguards for data principals roll out gradually.

Rule 1 formally names the regime: “These rules may be called the Digital Personal Data Protection Rules, 2025.” Rule 2 provides a glossary (e.g. defining “techno-legal measures” and “user account”), but the Act’s definitions largely prevail. Notably, “verifiable consent” is defined by reference to the later rules. With these preliminaries in place, the substantive obligations take effect, beginning with notice and consent protocols.

Notice and Consent

At the core of the DPDP framework is consent. The Rules prescribe how data fiduciaries (entities determining the purpose and means of processing) must obtain and document consent from data principals. Rule 3 dictates the contents and form of the notice given to a principal. The notice must be self-contained and clear, not buried in lengthy terms of service. 

In plain language, it must list at a minimum:

Data categories and purpose: An itemised description of the personal data to be collected or processed, and the specific purpose(s) of such processing. For example, if an app collects location and health data to offer personalised services, both the exact data fields and how they serve the service must be explicitly spelt out.

Service description: A clear description of the goods or services enabled by the processing. This prevents vague boilerplate like “to improve user experience.”

Withdrawal link and rights: The notice must specify a direct communication link (website, app, etc.) by which the principal can withdraw consent. The ease of withdrawal must match the ease of giving consent. The notice must also tell the principal how to exercise her rights under the Act (access, correction, etc.) and how to lodge complaints with the Board.

Rule 3 builds transparency into consent: principals should have a fair, digestible account of what they’re agreeing to, without hunting through multiple documents. This aligns with global practice: for instance, the GDPR similarly mandates that information be presented concisely and clearly at the time of data collection.
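For readers building compliance systems, Rule 3’s checklist can be pictured as a simple structured record. The Python sketch below is purely illustrative - the Rules prescribe no schema, and every field name is our assumption:

```python
from dataclasses import dataclass

@dataclass
class ConsentNotice:
    """Illustrative minimum contents of a Rule 3 notice; field names are assumed."""
    data_categories: list      # itemised personal data to be collected/processed
    purposes: list             # the specific purpose(s) of processing
    service_description: str   # goods/services enabled by the processing
    withdrawal_link: str       # direct link through which consent can be withdrawn
    rights_info: str           # how to exercise rights under the Act
    complaint_info: str        # how to lodge a complaint with the Board

    def is_complete(self) -> bool:
        # A notice missing any minimum element would not satisfy Rule 3.
        return all([self.data_categories, self.purposes, self.service_description,
                    self.withdrawal_link, self.rights_info, self.complaint_info])
```

A platform could run such a completeness check before any consent screen ships, catching vague or missing elements early.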

Consent Manager Regime

A unique feature of India’s framework is the concept of a Consent Manager (Rule 4). This envisions an independent intermediary platform that helps data principals give and manage consent across multiple services. Rule 4(1) empowers an entity meeting certain criteria (set out in the First Schedule) to apply to the Data Protection Board for registration as a Consent Manager.

Key conditions (Part A of Schedule 1) include: incorporation in India; substantial net worth (at least ₹2 crore); adequate technical, operational and financial capacity; and sound management credentials. Thus, only financially robust Indian companies can be consent managers. On registration, the Board publishes the manager’s particulars.

Part B of Schedule 1 then imposes strict obligations on registered Consent Managers. These include:

Data-blind processing: A Consent Manager must enable a principal to give or withdraw consent without itself reading or retaining the underlying personal data. In other words, it merely routes consents or tokens between parties, acting as a neutral “switchboard.”

Audit trail: It must keep (and share with the principal) a record of all consents given, denied or withdrawn, and of the notices associated with each consent. These records must be maintained for at least seven years.

Security and accountability: It must adopt reasonable security safeguards to prevent data breaches. It must act in a fiduciary capacity towards principals and avoid any conflict of interest with data fiduciaries. For example, a consent manager’s directors or shareholders cannot simultaneously hold controlling stakes in client companies (without Board approval).

Transparency: It must publish on its platform information about its owners, directors, major shareholders and other data (subject to Board directives), and make its service available via a website/app as the “primary means” for consent transactions.

No subcontracting: It cannot outsource its core obligations.

Audit and oversight: It must conduct internal audits of its controls, procedures and compliance, and report the results to the Board periodically.

These strictures aim to ensure that consent managers are strong, neutral actors that enhance individual control. They mirror, in spirit, elements of the EU’s accountability principle (Article 5(2) GDPR), though the Indian rules are more prescriptive about provider structure and conflicts. The Rules thus create a layered consent architecture, but one that will require the Board to carefully vet applicants and enforce Part B obligations over time.
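Part B’s record-keeping duty can be pictured as an append-only log that stores only references and tokens, never the underlying data. The sketch below is an assumption-laden illustration (the Rules prescribe no format; all names are invented):

```python
from datetime import datetime, timedelta

RETENTION_YEARS = 7  # Part B of Schedule 1: consent records kept for at least seven years

def record_consent_event(log, principal_id, fiduciary, action, notice_ref, when):
    """Append one consent event to an append-only log.

    Stores only opaque references - never the underlying personal data -
    so the manager stays 'data-blind'. Field names are invented for
    illustration; the Rules prescribe no particular record format.
    """
    assert action in {"given", "denied", "withdrawn"}
    log.append({
        "principal": principal_id,   # opaque identifier, not the data itself
        "fiduciary": fiduciary,
        "action": action,
        "notice": notice_ref,        # reference to the associated notice
        "timestamp": when,
        "retain_until": when + timedelta(days=365 * RETENTION_YEARS),
    })
    return log
```

The point of the sketch is the shape of the record: the manager routes and remembers consents, but the personal data itself never passes through its storage.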

State Processing and Security

The DPDP Act explicitly recognises that the State, acting through legislation or policy, may process personal data to deliver benefits. Rule 5 implements this by requiring such state-driven processing (e.g. issuing subsidies, licenses, certificates) to comply with standards in the Second Schedule.

Those standards echo basic data protection principles: the data must be used lawfully and only for the stated purpose (e.g. specific welfare scheme); collection should be limited to what is necessary; accuracy must be maintained; retention must not exceed the need or legal requirement; and appropriate security safeguards (encryption, access controls, etc.) must be in place. In essence, when public authorities handle personal data, they must act lawfully, transparently, and securely - consistent with constitutional guarantees.

Rule 6 imposes a general security safeguard duty on every data fiduciary. It requires “reasonable security safeguards” to prevent breaches. At minimum, this includes: encrypting or tokenising data; strict access control to computers and networks; maintaining logs and monitoring access for intrusion detection; retaining logs and data backups for at least one year to investigate breaches; and contractual safeguards for any third-party processors. 

These provisions are quite granular. For example, the Rule explicitly calls for “virtual tokens” or masking of data to protect it, and mandates one-year log retention (unless a law requires longer). By comparison, many other laws simply require “appropriate technical and organisational measures” without specifying exactly what. India’s Rules clearly want concrete measures in place.
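As an illustration of what tokenisation and masking might look like in practice (the Rules name the goal, not any algorithm; keyed HMAC pseudonymisation is one common technique, chosen here purely as an assumption):

```python
import hashlib
import hmac

def tokenise(value: str, secret: bytes) -> str:
    """Replace an identifier with a keyed 'virtual token' (HMAC-SHA256).

    The token is stable for a given secret, so records can still be
    linked, while the raw value need not be stored next to them.
    """
    return hmac.new(secret, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask(value: str, visible: int = 4) -> str:
    """Mask all but the last few characters, e.g. for support-desk screens."""
    return "*" * max(0, len(value) - visible) + value[-visible:]
```

In a real deployment the secret key would itself need protection (a hardware module or managed key store), which is exactly the kind of operational detail Rule 6 leaves to the fiduciary.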

Data Breach Reporting

A cornerstone of modern privacy laws is timely breach notification. Rule 7 brings a robust regime - on becoming aware of any personal data breach, a data fiduciary must promptly inform each affected data principal “in a concise, clear and plain manner.” The notice must describe the nature and timing of the breach, its likely consequences, and the steps already taken or to be taken to mitigate harm. Critically, it must also provide business contact details of a person who can answer follow-up questions for the affected principal.

At the same time, the fiduciary must notify the Data Protection Board. Within 72 hours of knowing about the breach, it must furnish the Board details including: the breach description (nature, extent, timing, location), its likely impact, the steps taken or proposed to mitigate risk, and findings on the cause or perpetrators. 

If 72 hours cannot be met, the Board can allow more time upon request, but reasons for delay must be given. This is broadly comparable to the GDPR’s Article 33 (which likewise mandates notifying authorities “not later than 72 hours” after awareness), and it is stronger than the GDPR’s Article 34 framework that only requires informing principals if there is a high risk. Here, every affected individual must be notified “without delay”, suggesting a very low threshold.

For example, the Rules do not carve out “insignificant” or anonymised breaches from notification. Any personal data breach will trigger the duty. This could be burdensome for large-scale systems (imagine minor glitches), but it underscores the priority placed on individual awareness. Overall, the breach rules signal that India intends to foster prompt transparency with victims.
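The 72-hour clock itself is simple arithmetic, sketched below (illustrative only; recall that the Board may extend the window on a reasoned request):

```python
from datetime import datetime, timedelta

BOARD_WINDOW = timedelta(hours=72)  # Rule 7: detailed report due to the Board

def board_deadline(aware_at):
    """Latest time for the detailed report to the Data Protection Board."""
    return aware_at + BOARD_WINDOW

def is_late(aware_at, filed_at):
    """True if the filing missed the 72-hour window (absent a Board extension)."""
    return filed_at > board_deadline(aware_at)
```

The clock runs from awareness of the breach, not from its occurrence, which is why incident-detection logs (Rule 6) matter for establishing the start time.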

Data Retention and Erasure

Rule 8 deals with retention. It instructs certain classes of data fiduciaries to erase personal data when it is no longer needed. If a fiduciary in the categories listed in the Third Schedule stops hearing from a principal or fulfilling the purpose, it must erase the data unless law requires longer retention. In practice, the Third Schedule sets fixed “clock” periods for different sectors (e-commerce platforms, social media, online gaming, etc.) - typically three years from the last user interaction or from the Rules’ commencement, whichever is later.

For instance, an e-commerce entity with over 20 million users must erase a customer’s data three years after the customer last engaged, unless the customer logs in to maintain the relationship. In addition, all fiduciaries must keep processing logs and traffic data for at least one year (post-processing) for accountability and audit. Crucially, Rule 8 requires the fiduciary to warn the principal at least 48 hours before erasure, giving the user a chance to intervene and continue the purpose or exercise rights.

These retention rules are more prescriptive than the GDPR, as the GDPR mandates that personal data be “kept in a form which permits identification” no longer than necessary (Article 5(1)(e)), but leaves specific periods to data controllers’ judgment (subject to other laws) rather than carving out fixed terms. India’s approach, using class-based schedules, provides clarity but reduces flexibility. It will require principals and fiduciaries to track these timelines carefully.
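Tracking these timelines reduces to date arithmetic, sketched here (illustrative; we approximate three years as 3×365 days, whereas a real implementation would follow calendar years and the applicable Schedule entry):

```python
from datetime import datetime, timedelta

# 3*365 days approximates the Third Schedule's three-year period; a real
# implementation would follow calendar years and any sector-specific terms.
RETENTION_PERIOD = timedelta(days=3 * 365)
WARNING_LEAD = timedelta(hours=48)  # Rule 8's advance warning to the principal

def erasure_due(last_interaction, rules_commencement):
    """Erasure falls due three years from the later of the last user
    engagement or the Rules' commencement."""
    return max(last_interaction, rules_commencement) + RETENTION_PERIOD

def warning_time(last_interaction, rules_commencement):
    """When the 48-hour pre-erasure warning must go out."""
    return erasure_due(last_interaction, rules_commencement) - WARNING_LEAD
```

Note that any fresh user interaction resets the clock, so the fiduciary must recompute the due date on every engagement.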

Data Principal Rights and Grievances

The Rules also operationalise rights given by the Act (such as access, correction and erasure). Rule 9 requires every fiduciary to prominently publish, on its website/app and in its communications, the contact information of the Data Protection Officer or other person who can answer principals’ questions about their data. This mirrors GDPR Article 13’s requirement to name a contact point.

Rule 14 focuses on how principals exercise rights. While Rule 14(1) mandates that fiduciaries (and consent managers) publish the means by which a principal may make a rights request (along with any identifiers needed, like an account ID or mobile number), Rule 14(3) requires a grievance redressal system with a standard timeframe: all data fiduciaries and consent managers must commit to resolving principal grievances within 90 days, and must publish this timeline. Finally, Rule 14(4) allows a principal to nominate a representative (or multiple representatives) to exercise their rights on their behalf, using the same contact methods and identifiers.

In practical terms, this means a platform must clearly tell users how to submit an access request, how to file a complaint, and that an answer will come within at most 90 days. If a user cannot act personally (due to illness, for example), she can appoint someone. These obligations place on fiduciaries the burden of building customer-friendly privacy workflows.

For comparison, GDPR similarly provides a right to appoint an agent, and sets a one-month deadline (though extendable) for handling requests. India’s 90-day window is somewhat more lenient.

The Rules aim to make individual rights real. Every notice and communication must tell users how to use their rights, and fiduciaries must have processes in place to handle them. The Rules do not, however, define new rights beyond the Act - they only ensure that contact points and timelines are in place.

Special Cases: Children and Vulnerable Persons

The Rules give special protection to minors and persons with disabilities. Rule 10 stipulates that no child’s personal data may be processed without verifiable parental consent. A fiduciary must adopt technical and organisational checks to ensure that the person consenting is indeed a parent (or guardian). For example, the parent’s identity and age details must be verified from reliable records held by the fiduciary or via government-backed age/identity tokens (such as Aadhaar or DigiLocker). The Rules illustrate scenarios (cases 1-4) of how a platform might enable a parent to verify herself before allowing a child to use the service.

In essence, mere parental declaration is insufficient: the service must verify that the adult is who she claims. This is broadly aligned with Article 8 of GDPR on child consent, though the technical token-based verification is a local innovation in India’s ecosystem.

Rule 11 extends the consent requirement to persons with certain disabilities who cannot decide for themselves. If someone identifies as a lawful guardian of a person with disability, the fiduciary must check that such a guardian appointment is court-certified or by a designated authority under disability laws. No data may be processed until this verification is done. 

There are also exemptions carved out for children’s data. Rule 12 notes that the restrictions of Sections 9(1) and 9(3) of the Act (which generally prohibit processing children’s data without the above consents) do not apply when a child’s data is processed by certain entities or for certain purposes, subject to conditions. 

These carve-outs are spelt out in the Fourth Schedule.

Part A (Data Fiduciaries): For instance, a clinical establishment, mental health hospital or healthcare professional may track or monitor child data if it’s strictly for providing healthcare to the child (like tracking vital signs). An educational institution may track a child’s behaviour or location if it’s for education or safety. Even an individual caring for children in a daycare is exempt from the strict consent rule if using data for child safety. In each case, the processing is limited to what’s necessary.

Part B (Purposes): Separately, certain purposes override consent requirements. For example, any action by the State “in the interests of a child” under law (e.g. child welfare, custody issues) can involve child data. Further, providing government subsidies or services in a child’s interest is allowed. There are also narrow exemptions like creating a child’s email account for communication, or tracking a child’s real-time location for safety.

This approach recognises that in contexts like healthcare, education or child safety, subjecting every data point to parental consent is impractical. However, these exemptions must still be “necessary” and limited to their protective purpose.

Significant Data Fiduciaries

Another distinctive feature of India’s law is the Significant Data Fiduciary (akin to “large platforms”). The Rules add extra obligations for them.

Rule 13 mandates that each Significant Data Fiduciary must, annually, conduct a Data Protection Impact Assessment (‘DPIA’) and a compliance audit. The findings must be reported to the Board, highlighting any gaps. Moreover, these major fiduciaries must “observe due diligence” to verify that their technical measures (including algorithms) do not endanger principals’ rights. In effect, tech giants and major digital services will have to audit themselves and explain how their systems protect privacy.

By comparison, the GDPR requires mandatory DPIAs for high-risk processing (under Article 35), but does not prescribe annual audits or Board reporting. India’s approach is more hands-on and periodic. The effectiveness of this rule will depend on the Board’s follow-up: will the Board critically review the DPIAs or simply file them? In any case, the Rules signal a focus on oversight of dominant players, a recognition that scale often entails greater risk.

Transfers and Exemptions

Under Rule 15, personal data may leave India’s borders only if the data fiduciary meets conditions as specified by the central government. These conditions may include, for example, adequacy certifications or contractual safeguards (though the Rules leave details to future orders). 

Notably, Rule 15 mirrors Section 17(3) of the Act, preserving strict territorial control. This is a significant difference from GDPR: the EU allows data transfer outside under defined conditions (adequacy, standard clauses, binding rules), but Indian law puts the onus on the government to later form those conditions.

Rule 16 echoes the Act’s broad research and archival exemption. It provides that the Act’s obligations do not apply to personal data processing for research, archiving, or statistical purposes, if done according to the Second Schedule standards. In other words, researchers may process data without individual consent as long as they use “appropriate technical and organisational measures” to protect it. This exemption reflects a policy choice to promote non-commercial scientific use, akin to some conditions in GDPR Recital 159.

The Data Protection Board and Governance

Rules 17-19 lay out the Data Protection Board’s constitution. A Search-cum-Selection Committee headed by the Cabinet Secretary (with the Law and MeitY secretaries and two experts) will recommend a Chairperson. A similar committee, led by MeitY’s Secretary and including legal and expert members, will recommend the other four members. The government then appoints them. Thus, both the Chair and members are chosen through a predominantly government-led process.

The Data Protection Board of India will thus consist of a Chairperson and four other members. Members’ salaries and perks are fixed by Schedule 5: the Chair gets ₹450,000/month and members ₹400,000/month. The Board’s procedures (under Rule 19) follow standard collegial norms: quorum is one-third, and decisions are by majority (with the Chair casting a tie-break vote). The Board can also adopt “techno-legal measures” to conduct all its business digitally (under Rule 20), reflecting the Act’s directive for a fully electronic office.

The Board can hire staff (with government approval) and set their terms per the Sixth Schedule (under Rule 21). Appeals from Board orders go to the designated Appellate Tribunal (under Rule 22). Appeals must be filed digitally, with a fee similar to that of telecom appeals, payable via UPI. The Tribunal is not bound by strict civil procedure; it will follow natural justice and its own rules, and will also be a “digital office” with e-hearings allowed. These provisions aim to expedite cases and reduce paperwork (a nod to modernising the justice system), though the Tribunal’s real test will be staffing it with competent judges and avoiding delays.

Finally, Rule 23 broadens government powers. It allows the Central Government to direct any Data Fiduciary or intermediary to “furnish such information” for certain Act-specified purposes. The Seventh Schedule clarifies those purposes: data use by the State in the interest of national security or integrity, fulfilment of legal obligations, or designating significant fiduciaries. In effect, the State can require disclosure of personal data for sovereignty, security, or public duty, subject to applicable law.

Alarmingly, Rule 23(2) goes further: if the government believes disclosure of its demand would harm sovereignty or security, it can bar the fiduciary from informing the data principal about that demand. This means a minister could secretly order the release of your data and forbid the company from even informing you that it had happened.

Such powers raise legitimate concerns. There is no requirement in the Rules for judicial oversight of these information demands. Nor is there a stated requirement that any such state processing meet constitutional standards (beyond “applicable law”). Any government processing of personal data must align with the constitutional safeguards prescribed in Puttaswamy - but the Rules themselves do not spell out how that alignment will be ensured. This leaves a major gap: without clear checks (judicial warrants, ex post review), Rule 23 risks being an open-ended permit for surveillance or data requisition.

Comparative Reflections

India’s DPDP Rules show both convergence with and divergence from global norms. Like the EU’s GDPR, they emphasise accountability and technical safeguards: encryption, access controls, breach logs, regular DPIAs, etc. The breach notification regime is explicitly modelled on GDPR’s 72-hour rule, although India goes further by obligating notifications to individuals for all breaches (GDPR requires it only for “high risk” breaches). Both frameworks require data controllers to document their compliance and to allow data subject requests. For instance, GDPR Article 15 guarantees a right of access (the data subject shall have the right to obtain from the controller confirmation as to whether personal data are being processed, and access to that data), and it enumerates the details that must be provided. The DPDP Act likewise gives access and correction rights (though the Rules focus on practical procedures for those rights rather than restating them in full). The “right to be forgotten” under GDPR (Article 17) - deleting data “no longer necessary” - finds a rough echo in Rule 8’s erasure timelines.

Yet there are sharp differences too. GDPR imposes very high fines for non-compliance (up to €20 million or 4 percent of global turnover). By contrast, the DPDP Rules (and Act) rely more on remedial directions by the Board and relatively modest penalties, making India’s framework appear less punitive and more focused on securing compliance than on deterrence.

This raises questions about enforcement “teeth”: if fines or sanctions are weak, will companies view compliance as optional? The Board’s power to audit and suspend registration exists on paper, but effective deterrence requires hard financial consequences.

On data localisation, India’s stance is also more restrictive: the DPDP Act generally forbids data transfer without government-approved conditions (a form of soft localisation), whereas the GDPR allows transfers provided safeguard mechanisms (adequacy decisions, standard contractual clauses, binding corporate rules) are in place. Rule 15 thus reflects India’s insistence on greater national control. Global companies seeking to move data across borders will have to await India’s yet-to-be-defined conditions.

The Rules’ treatment of children also follows a global template: parental consent, verification of age, and carve-outs for child-benefit programs. One gap is algorithmic decisions. While the GDPR gives individuals rights against solely automated decision-making (Article 22), the DPDP Act and Rules are silent on automated profiling or the “explainability” of algorithms. This is a missed opportunity, given global concerns over AI’s impact on rights.

On institutional design, India’s Board will be digital and empowered, but it remains to be seen whether it will be truly independent. Unlike the GDPR’s independent Data Protection Authorities in each Member State, the Indian Board’s members are executive appointees. There is no requirement that they be full-time or immune from external pressure. The Board must strive to develop a culture of impartiality, backed by enough resources to scrutinise both private and government actors.

Gaps, Risks and Implementation

Putting this all together, several gaps and challenges emerge. First, notice and consent assume informed individuals; but India’s literacy and tech fluency gaps are real. Will long consent forms and notifications (even in plain language) actually empower average users? The Rules require “clear and plain” language, but do not prescribe any standardised template. Without user-friendly design (like layered notices), organisations may still bury complex terms beyond the average person’s understanding.

Second, the Consent Manager regime is ambitious but untested. It could increase portability of consent across apps, but it also adds bureaucracy. Only highly capitalised firms can be managers, which may limit competition to a few large players or government-run platforms. Data principals will need to trust these intermediaries entirely (since they hold consent records). Any conflict-of-interest could undermine neutrality. The Board will need to vet applications carefully and enforce Part B obligations strictly, or else consent managers could morph into surveillance hubs under industry influence.

Third, on grievance redressal under Rule 14: while a 90-day turnaround is generous on paper, firms and government entities have been slow with RTI and consumer complaints. We must scrutinise whether the Board enforces this timeline or allows extensions. Also, while a principal can appeal Board orders (under Rule 22), that appellate route is untested in data protection matters. The Tribunal’s procedures are lighter than traditional courts’, and it will need technical expertise and independence to resolve data disputes fairly.

Fourth, the powerful State access provisions pose a deep normative tension. On one hand, the Act explicitly permits state processing for welfare and security, and on the other, privacy as a fundamental right requires strict judicial safeguards for surveillance or data requisition. The Rules’ silence on checks (Rule 23(2) even allows secrecy orders) is troubling. Courts may have to fill this gap. The Rules do not operationalise the test laid down in the Puttaswamy judgment, that any interference with privacy must meet tests of legality, necessity, and proportionality. If the Home Ministry demands call records under Rule 23, does it need a warrant? The Act suggests it does (“any law…”) but the Rules do not specify. This creates uncertainty and potential abuse. 

From an implementation standpoint, many businesses, especially small and medium enterprises, will struggle to build compliance frameworks. They must appoint data protection officers, set up security measures (encryption, intrusion logs, backups), conduct DPIAs (if significant), report breaches quickly, etc. This is easier said than done for firms lacking legal departments or IT teams. The Rules do not offer a light-touch option for startups (unlike some GDPR leniency for micro-enterprises). The government may need extensive guidance, templates and support (perhaps through standard contracts or codes of practice) to help compliance. 

Looking Ahead: Reform and Ethos

In terms of reforms, the most pressing need is to strengthen institutional safeguards. The Data Protection Board must mature into a truly independent authority, meaning that it must be transparent about its own operations. Board meetings and decisions should be documented and published. Public representation or at least third-party observations could enhance legitimacy. One hopes major decisions (like fining a State agency, if ever warranted) are made by quorum consensus and not just by virtue of the appointees’ backgrounds.

Once the Board is operational, we should see consistent enforcement of penalties and ideally larger fines for serious breaches. If India’s fines remain token or if the Board never uses its suspension powers, the law’s credibility will suffer. The scope of government data use should be narrowed or at least bounded by procedural safeguards. If Rule 23 requests always bypass user notice, courts may have to intervene more often in privacy cases.

Equally important is for constitutional values to permeate data policies. Whenever governments or companies collect or retain personal data, they should ask: is this necessary, is it minimal, and what rights does the individual have? Training civil servants, police, and bureaucrats on the new Rules will be vital.

Finally, public engagement will shape outcomes. Individuals must learn about their new rights - how to ask for data deletion, and how to complain about misuse. Academic and industry feedback (via MeitY’s consultative processes) should be solicited to refine the Rules.

Conclusion

The Digital Personal Data Protection Rules, 2025, in many respects, bring Indian law into line with leading global standards, while reflecting India’s distinct priorities (like data localisation and State access). 

But a law’s true test is in implementation. That will require not just well-written rules, but an institutional ethos of transparency and accountability - echoing the constitutional promise that privacy must be upheld. In the months ahead, it will fall upon the Data Protection Board, the courts, the legislature and the governed alike to ensure that these rules do not become mere paper promises and instead offer real protection for every Indian’s personal data.