THE STATE CABINET OF TELANGANA is proposing to enact the Telangana Gig and Platform Workers (Registration, Social Security and Welfare) Bill, 2025, mandating transparency in the algorithmic management of gig workers. The law would prohibit arbitrary termination, requiring a seven-day notice period before deactivation except in safety-related cases. By compelling algorithmic disclosure in task allocation, bonuses, and ratings, Telangana’s proposed framework is expected to become India’s most comprehensive legislative measure for gig worker protection, surpassing similar initiatives in Karnataka, Jharkhand, and Rajasthan.
This state-level intervention reveals two crucial insights. On the one hand, it suggests that contemporary labour protection increasingly entails the regulation of technology; on the other, it underscores that the national law governing technology and data, namely the Digital Personal Data Protection (‘DPDP’) Act, 2023, does not incorporate labour law principles to protect the interests of gig workers, arguably the most vulnerable stakeholders in India’s expanding digital economy.
The Origin and Promise of the DPDP Act
The DPDP Act was one of the major outcomes of the Puttaswamy judgement (2017), which held that privacy is a fundamental right rooted in human dignity and autonomy. The Act draws on the recommendations of the Srikrishna Committee Report, incorporating the principles of free and fair consent, data minimization, and the mechanism of data fiduciaries and significant data fiduciaries.
This essay argues that the Act fails to protect the most vulnerable actors in the digital economy: gig workers. While the Act incorporates a graded conception of privacy and data protection, it does not address the data protection requirements of gig workers. It then elaborates the different data protection measures required within the DPDP framework to safeguard their interests.
The illusion of consent in platform work
Section 6 of the DPDP Act specifies that “consent” must be free, specific, informed and unambiguous. However, these standards do not hold for gig workers, who are bound by stringent contractual prescriptions. For instance, when a gig worker clicks “I agree” to the terms and conditions of an aggregator platform like Ola or Uber, they concede the aggregator's absolute control over both the labour process and their personal data, which the platform then uses for profit in the market.
This makes gig workers more vulnerable than consumers. While a consumer logs into Netflix or Facebook to access a service, a gig worker logs into Uber or Swiggy to access a means of livelihood. This dependency on the platform for livelihood warrants stronger protection for gig workers.
This is where labour law and data protection law intersect. Labour law counterbalances the power imbalance between employers and workers through the collective bargaining framework, ensuring that workers' voices are heard and that the employer's social power is checked through protective measures. The absolute contractual positioning of gig workers creates only an illusion of fair consent. Data protection law, for its part, is supposed to correct the asymmetry of information between the individual and data processors. But the DPDP Act does not address the asymmetry in information and bargaining power between the aggregator and the gig worker. Its reliance on the consent of gig workers therefore leaves them more vulnerable.
Why the ‘Data Fiduciary’ model fails Gig Workers
Academics M.J. Taylor and J.M. Paterson have hailed the data fiduciary model as an alternative to the consent-centric model of data protection. Under this model, aggregators act in a fiduciary capacity, handling the data of gig workers for the workers' benefit. This fiduciary capacity implies that platforms are not mere intermediaries collecting and processing data but bear an absolute obligation to ensure fairness and transparency in handling workers' data.
However, the fiduciary model fails in the context of gig workers. The power asymmetry between aggregators and gig workers compels workers to accept any form of control over their data, so the notion of fairness is flawed in such a relationship from the outset. Moreover, aggregators, as profit-oriented corporate entities, function by extracting and commercialising data. This contradicts their obligation to act as fair data fiduciaries for gig workers.
‘Data Minimization’ in theory and practice
The data fiduciary model is centered on the principle of data minimization, accepted in both the General Data Protection Regulation (‘GDPR’) and the DPDP Act. This principle prohibits the indiscriminate and unlimited use of data for any purpose. In the landmark case of Maximilian Schrems v Meta Platforms Ireland Ltd (2024), the Court of Justice of the European Union explained that the principle precludes personal data from being aggregated, analysed and processed for any purpose without restrictions of time and data types.
While the principle of data minimization is a sound procedural safeguard in theory, in practice the collection of gig workers' data is pervasive. Gig workers' daily routines are surveilled, and the data collected is used, and often retained post-contract termination, for targeted advertising or street mapping. Take Uber's privacy policy, for instance, which allows third parties like Meta to access worker data for targeted advertising without notifying the worker each time such data is shared.
It is for this reason that the issue of additional protection for workers' data was raised in the ILO Code of Practice in 1997, which noted that workers' data should be processed only for “reasons directly relevant to the employment...” In practice, this would mean that gig workers should have the right to refuse the collection of data, or its transfer to third parties, for purposes not directly connected to their work. What data is collected, how much, and for how long should be determined by what is directly relevant to employment.
Some contemporary frameworks apply these principles in limited form. For instance, the California Privacy Rights Act, enacted in 2020, extended to employees the right to request information about the personal data collected and used by their employers, a right previously available only to consumers. This marks a move towards transparency, aligning more closely with the ILO’s 1997 Code and the fiduciary conception under the GDPR. It is a best practice that can be transplanted into the DPDP Act.
DPDP and Algorithmic Management
To understand working conditions within the platform economy, we need to examine the mechanics of remote control, which algorithmic management makes possible. Algorithmic management refers to automated or semi-automated decision-making tools that rely on data collection and surveillance of workers. It determines who gets tasks, who earns bonuses, and who is penalised or removed from the platform. In the absence of regulation, these decisions are opaque, swift and unexplained.
The GDPR acknowledges the risks of such practices by providing for the right not to be subject to a decision based solely on automated processing which significantly affects individuals (under Article 22(1)). However, this right includes exceptions for contract performance, legal authorisation or explicit consent (under Article 22(2)).
For gig workers, relying on consent defeats the core logic of this protection. The ILO Code, from the lens of worker protection, removes the exceptions to declare that “decisions concerning a worker should not be based solely on the automated processing of that worker’s personal data” and that “personal data collected by electronic monitoring should not be the only factors in evaluating worker performance” (under Articles 5.5 and 5.6).
The recent EU Directive on Improving the Working Conditions of Platform Workers, 2024 further reinforces this protection by establishing the first EU-level rules on algorithmic management in the workplace. It requires transparency, human oversight, and accountability in automated decision-making systems used by platforms, thereby operationalising Article 22 GDPR principles in platform work. By introducing a presumption of employment and regulating the use of artificial intelligence in managing workers, the Directive moves closer to the protective spirit embodied in the ILO Code, rejecting the idea that automated decisions and opaque algorithmic evaluations can dictate workers’ livelihoods.
The DPDP Act, on the other hand, contains no comparable safeguards against automated decision-making or algorithmic control. For gig workers, this absence means their pay, task allocation, and working conditions are governed by opaque systems that they cannot challenge. It leaves a fundamental feature of their work relationship unaddressed by India’s data protection framework.
Conclusion
The DPDP Act recognises a graded scheme of data protection, prescribing higher thresholds of protection based on the nature of data and the vulnerability of the data principal. Yet, its framework, designed around the consumer/data-subject binary, fails to respond to the evolving nature of work and value creation in the digital economy.
Gig workers stand at the centre of this new economy, their data serving simultaneously as the means and the product of work. However, the law continues to treat them merely as individual data principals, ignoring the structural dependence and power asymmetry that characterise their relationship with digital platforms.
The Act’s consent-based model and its fiduciary framework are inadequate to address the realities of algorithmic management, where data-driven control determines work allocation, evaluation, and termination. The absence of safeguards against automated decision-making within the DPDP Act stands in sharp contrast to global standards under the ILO Code of Practice, 1997 and the EU Directive on Improving the Working Conditions of Platform Workers, 2024. Telangana's proposed legislation demonstrates a state-level attempt to regulate algorithmic control through protection against arbitrary termination and mandatory disclosure of task allocation. These developments reveal an emerging consensus: contemporary labour protection must necessarily entail the regulation of technology.
It is in this context that the DPDP Act, it is argued, must move beyond its abstract, consent-centric model towards a labour-sensitive approach that embeds labour protection within data protection. The egalitarian approach of labour law cannot be set aside when the implications of the DPDP Act bear so closely on the autonomy of workers. The Telangana Bill shows that protecting gig workers in the platform age means safeguarding their work, dignity, and livelihood through the regulation of algorithms. Only then can we ensure that neither the gig worker nor their data remains invisible in India’s digital economy.