Shedding light on the recent phenomenon of artificial intelligence-based recruitment tools being found to exhibit bias against women, SAKSHAM MALIK explores the question of how the Indian judiciary would deal with the impending legal challenge of such algorithmic discrimination. He does this by providing an analysis of the judiciary’s track record of adjudication in matters dealing with various sorts of economic discrimination against women.
DISCRIMINATION persists, yet it evolves. Deeply embedded in patriarchy, gender-based economic discrimination in India does not break away from this pattern. In the fight for women's financial autonomy, the targets keep multiplying.
While, on the one hand, archaic problems like unequal access to property rights and denial of maintenance have long existed, on the other, novel forms of discrimination like algorithmic bias are now taking centre stage.
In 2018, Amazon had to scrap its artificial intelligence (AI) based recruiting tool that exhibited bias against hiring women. Similar instances are prevalent across industries, and will soon require legal scrutiny.
In addition to the legislature, Indian courts have always had a role to play in the cause of financial equality for women. In the past, the response of the higher judiciary, specifically, has been celebrated on some occasions and frowned upon on others.
In order to understand whether the judiciary is prepared for the latest form of gender-based economic discrimination, that is, algorithmic bias, we need to analyse its past decisions. First and foremost, we analyse the nature of landmark cases decided by Indian courts to date pertaining to economic discrimination against women.
Thereafter, we will look at the trends pointing to the increasing prevalence of gender bias within AI, the manner in which it impacts economic equality, the complex legal issues that may pop up, and whether the Indian judiciary is prepared to deal with them.
How the higher judiciary has dealt with cases of economic discrimination against women
Cases pertaining to economic discrimination against women have taken numerous forms over the years. The Supreme Court and various High Courts have dealt with many such matters, which can broadly be classified under the categories of:
i) unequal employment opportunities,
ii) unequal access to resources, including property and maintenance, and
iii) unfair treatment at work.
While these categories and the cases discussed thereunder are not exhaustive, they broadly represent the factual scenarios that have primarily invited judicial scrutiny in India.
We need to acknowledge that economic discrimination against women is not always direct and conspicuous, in the form of discrimination in employment and wages. Women have long been denied financial autonomy through systematic limits on their access to financial resources, including land and maintenance, and through unfair treatment at work. The courts have taken cognisance of these issues several times in the past.
The Supreme Court has upheld the right of a daughter to an equal share in ancestral property in Hindu families, in Scheduled Tribes in Bihar, and in Syrian Christian communities in Kerala. The interests of Muslim women have also come up before the apex court, which upheld Muslim women's absolute right to receive maintenance under Section 125 of the Criminal Procedure Code.
Further, unfair treatment or discrimination at work can take multiple forms, as can be observed in the nature of cases that have come up for adjudication before the Supreme Court of India. In the past, the court has delivered progressive judgments protecting the rights of women in multifarious cases by i) providing maternity leave to muster roll workers, ii) famously providing guidelines to prevent sexual harassment at the workplace, laying down the foundation of India’s workplace sexual harassment prevention law, iii) striking down guidelines made by Air India requiring air hostesses to retire at marriage or their first pregnancy, and iv) declaring regulations of a company which required women in its packing and labelling department to retire upon marriage as unconstitutional.
The aforementioned judgments represent a host of different types of cases on economic discrimination against women. A quick perusal of the same presents some pertinent insights.
First and foremost, most cases pertaining to equal remuneration and discrimination in hiring have usually been decided against government entities, departments and the legislative backing for discriminatory policies, with the exception of Mackinnon Mackenzie. Even though discrimination in wages and hiring is rampant in the private sector as well, cases concerning that sector are minimal.
Further, archaic issues relating to women's rights in the succession of property are still prevalent. The case of Madhu Kishwar & Ors. vs. State of Bihar & Ors. (AIR 1996 SC 1864) was decided in 1996, and instances of denying ancestral property to women had come up before courts even earlier. Although a 2005 amendment to the Hindu Succession Act gave daughters an equal share in ancestral property as sons, instances of denial of property rights have come up in court as recently as last year.
An observation critical to this analysis is that the nature of cases decided by the Supreme Court represents the evolving nature of economic gender-based discrimination to a very limited extent. Issues of algorithmic bias or bias within AI are non-existent in the country’s case literature. It will be interesting to see, therefore, how the judiciary deals with these questions that are bound to come up before it in the next few years.
One of the most pressing concerns presented by algorithmic bias worldwide, including in India, is that of discriminatory hiring. Pursuant to Amazon scrapping its flawed AI-based hiring system, public discourse on gender bias in algorithmic hiring started taking shape.
Organisations like the Observer Research Foundation and INDIAai have noted increased discrimination in the process of recruitment on the basis of sex, even when algorithms are used to minimise bias. The issue becomes particularly worrying in the Indian context, where discriminatory hiring on the basis of sex is already rampant, especially in the private sector.
A 2018 World Bank report highlighted that more than a third of all job ads in India explicitly specified the preferred gender of the candidate, which was mostly male. To be precise, 6 in 10 such jobs preferred male candidates.
Over the years, the Indian legislature and judiciary have not been particularly successful in tackling this issue in the private sector. With the advent of artificial intelligence in an already discriminatory hiring process, the task to ensure equal opportunities for women is going to become even more complex and difficult.
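To see concretely how a hiring algorithm can discriminate even when it never looks at a candidate's sex, consider a minimal sketch in Python. The records, field names and screening rule below are all hypothetical; the point is only that a proxy feature correlated with sex (here, an uninterrupted-career flag standing in for career breaks) lets a rule trained on biased historical outcomes reproduce that bias.

```python
# Toy illustration with synthetic data: the "sex" column is dropped before
# learning, yet a correlated proxy feature carries the bias through.

# Synthetic historical records: (sex, continuous_career, hired)
history = [
    ("M", True,  True), ("M", True,  True), ("M", False, True),
    ("F", False, False), ("F", True,  True), ("F", False, False),
]

# "Blind" training data: the sex column is removed before learning.
blind = [(career, hired) for (_sex, career, hired) in history]

def learn_rule(data):
    """Naive rule: shortlist a feature value only if candidates with that
    value were hired more often than not in the historical data."""
    hire_rate = {}
    for career, hired in data:
        yes, total = hire_rate.get(career, (0, 0))
        hire_rate[career] = (yes + int(hired), total + 1)
    return {k: yes / total > 0.5 for k, (yes, total) in hire_rate.items()}

rule = learn_rule(blind)

# Applicants identical except for the proxy feature, which in this
# synthetic data correlates with sex (women here have career breaks).
applicants = [("M", True), ("M", True), ("F", False), ("F", False)]
shortlisted = [sex for sex, career in applicants if rule[career]]
print(shortlisted)  # the rule never saw "sex", yet only men pass
```

Because discrimination of this kind hides inside a facially neutral feature, simply deleting protected attributes from the data set does not make a hiring algorithm fair.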
To inform our opinion on the higher judiciary’s future approach to cases involving issues of algorithmic bias in hiring, we need to look at its past decisions. Unfortunately, the guidance here is very limited. Algorithms or AI are primarily being used for hiring in white-collar jobs by corporations with significant financial resources.
The higher judiciary, in the past, has not provided judicial guidance on issues pertaining either to bias in AI and its repercussions or discriminatory hiring in white-collar jobs in the private sector. Therefore, we are limited to cases involving discriminatory hiring by government entities in order to form an opinion on the judiciary’s preparedness.
A 1991 judgment of the apex court and a 2021 judgment of the Kerala High Court provide some relevant insight in this context.
Supreme Court: Employer cannot seek intimate personal details of candidates while hiring
In Mrs. Neera Mathur vs. Life Insurance Corporation of India (AIR 1992 SC 392), the petitioner applied for the post of assistant at LIC. During her recruitment, she had to file a declaration seeking details of pregnancy, marital status, menstrual cycle, conception, last delivery, abortion and miscarriage. Additionally, she also had to undergo a medical examination. During her probation period, she took maternity leave to deliver a baby. While on leave, she was discharged without cause.
She challenged this decision legally. After the High Court refused to interfere with the termination, she approached the Supreme Court.
LIC claimed that she was discharged because, firstly, she had given a false declaration during recruitment and, secondly, her work during probation was not satisfactory.
The court directed LIC to reinstate the petitioner, observing that no material on record indicated that her work was not satisfactory. Further, it directed the deletion of the columns in the declaration regarding the pregnancy, menstrual cycle, abortions and so on of candidates, observing that seeking such information is embarrassing and humiliating for applicants.
The court observed: “The modesty and self-respect may perhaps preclude the disclosure of such personal problems like whether her menstrual period is regular or painless, the number of conceptions taken place; how many have gone full term etc.”
While the core issue in the case did not pertain to discriminatory hiring, the court discussed an ancillary issue concerning it. The practice of seeking personal details from female candidates is aimed at discriminating against women, with the primary objective of avoiding the increased expenses associated with maternity leave.
While Indian private companies do not often seek explicit disclosures of pregnancy and menstruation status, they do refrain from hiring women altogether to avoid maternity leave liability, as revealed by a 2018 study. Fortunately for the cause of gender equality, the court here took a dim view of discrimination in recruitment on the basis of potential pregnancies and associated disclosures.
Kerala High Court: Protective labour provisions cannot be used to deny women employment

In a 2021 case, the petitioner approached the Kerala High Court to challenge a public sector undertaking's (PSU's) recruitment notification for the post of safety officer, which effectively excluded women, on the ground that it was discriminatory and violated her rights. The PSU contended that Section 66(1)(b) of the Factories Act, 1948 provides that no woman shall be required to work in any factory except between 6 AM and 7 PM.
However, the Kerala High Court ruled that the section is meant to protect women against exploitation by requiring them to work at night against their consent, and it could not be relied on to deny the petitioner an appointment.
The High Court also took note of the fact that the Factories Act was enacted at a time when requiring a woman to work at night was seen as exploitative and violative of her rights. However, the world has moved forward to a point where women take up demanding roles to contribute to the economy. The PSU was, therefore, directed to consider the application of the petitioner for the position of safety officer.
The court, in the instant case, rejected the notion that women need to be kept away from supposedly hazardous work for their own safety, and instead gave due consideration to a woman's autonomy in choosing her professional avenues. Further, the court acknowledged that changing times require the modernisation of laws, and that archaic rationales cannot be applied to solve present-day issues.
Considering that algorithmic bias in hiring is a recent trend, it is hoped that courts will take a similar stance, give due consideration to the evolving nature of social structures and economic discrimination, and suggest solutions accordingly.
How will the judiciary deal with cases of algorithmic bias in hiring?
Predicting the judicial treatment of algorithmic bias in hiring is difficult for various reasons. First and foremost, the law is silent on several issues that govern the matter. The manner in which prospective employers may use applicants' data is not yet governed by any legislation. The Personal Data Protection Bill, 2019 has still not been passed and is constantly subject to scrutiny.
A comprehensive data protection law that takes note of algorithmic bias would go a long way in preventing personal details like sex, marital status, number of children and age from being used as data points that affect hiring decisions. Further, it is not clear whether big data collected by tech companies like Facebook and its family of apps can be sold to prospective employers to give them access to personal details like menstrual cycles, productivity, health markers, etc. However, it is certainly a possibility and should be anticipated by policymakers.
The judiciary, too, would benefit from a data protection law, since its decisions could then be reasoned not only from principles of common law and natural justice but also from actual legislative intent.
Another difficulty arises due to the paucity of cases pertaining to discriminatory hiring in white-collar jobs in the private sector being brought before courts. While laws like the Equal Remuneration Act, 1976 and the new Code on Wages, 2019 exist to prevent such practices, judicial enforcement has been scarce because proving bias in the private recruitment process is tough. With the introduction of artificial intelligence in hiring with the supposed intent of reducing bias, establishing that the recruitment process is discriminatory against women will become even more difficult.
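The evidentiary difficulty is partly statistical: a claimant would first need numbers showing that a screening process treats groups differently. A toy audit sketch in Python follows, on entirely synthetic records with hypothetical field names; a selection-rate gap like this one suggests disparate impact but is not, by itself, legal proof of discrimination.

```python
# Hypothetical audit: compare shortlisting rates by sex on synthetic
# applicant records (sex, shortlisted) -- for illustration only.
applicants = [
    ("M", True), ("M", True), ("M", False), ("M", True),
    ("F", False), ("F", True), ("F", False), ("F", False),
]

def selection_rate(records, sex):
    """Fraction of applicants of the given sex who were shortlisted."""
    group = [shortlisted for s, shortlisted in records if s == sex]
    return sum(group) / len(group)

male_rate = selection_rate(applicants, "M")    # 3 of 4 shortlisted
female_rate = selection_rate(applicants, "F")  # 1 of 4 shortlisted
ratio = female_rate / male_rate                # gap between the groups
print(round(ratio, 2))
```

With an opaque AI screening tool, even assembling such records is hard: an applicant rarely knows who else applied or how the model scored them, which is precisely why proving bias becomes harder once an algorithm mediates the decision.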
However, there are reasons to be optimistic. Even though courts have not dealt with issues of discriminatory hiring in the private sector or algorithmic bias in hiring, their treatment of cases of discrimination in government jobs provides reason to be hopeful.
The cases discussed in this article highlight that i) courts are stern in upholding principles of equality in cases of economic discrimination and specifically, discriminatory hiring, ii) courts are prepared to give due consideration to changing social circumstances, which may include social structures or technologies while deciding cases that concern autonomy of women and iii) constitutional principles, including equality, are constantly upheld and provide guidance to courts in deciding cases of gender discrimination. Having said that, dealing with complex and novel legal issues of algorithmic bias is certainly going to prove challenging for the higher judiciary in the absence of meaningful precedent.
(Saksham Malik is a Delhi-based lawyer and consultant working in the areas of competition law, technology laws, and human rights laws. The views expressed are personal.)