Analysis

UP Uses Facial Recognition Technology to Mete Out Discriminatory Treatment

Muazzam Nasir and Ashish Kumar

Facial recognition technology has a structural flaw: its datasets are humanly designed. If historical criminal data is fed into the algorithm, its decision-making is altered, embedding a preliminary bias rooted in prevalent societal structures. With the Lucknow police announcing its intention to deploy FRT to pre-empt potential crimes against women, the state can evade legal checks and balances. Women are being indirectly coerced to forgo their right to privacy and free speech in order to avail themselves of the security of the state, write MUAZZAM NASIR and ASHISH KUMAR.

—–

IN 2018, Amazon infamously defended the use of Rekognition, its facial recognition technology (FRT), by law enforcement. It argued that FRT cannot be dismissed merely because its application is presently imprecise, drawing an analogy: you do not throw away the oven just because the pizza gets burnt at a wrong temperature setting.

But two years down the line, Amazon did indeed have to throw away the oven: it banned the use of Rekognition by US police for 12 months.

India has now borrowed the oven, and the pizza is burning in its most populous state – Uttar Pradesh. Recently, the Lucknow police announced its intention to deploy Artificial Intelligence-driven FRT to pre-empt potential crimes against women. The proposed intervention is part of the state government's intermittent advances towards the use of AI for veiled surveillance. But FRT is inherently flawed and lacks objective legislation, which indirectly allows the state to evade legal checks and balances. Besides, its use is prima facie violative of individual anonymity and unconstitutional.

DISCORDANT NOTES

So what is FRT? It is a form of biometric technology that works on images captured by a device with photographic capability. It is driven by an algorithm built around an "artificial neural network" (ANN), and the ANN lies at the heart of FRT-based classification.


An ANN is an "information processing model" that matches a newly fed image against available data. For instance, suppose a dataset of dogs of different breeds is collected and fed into an ANN. The algorithm creates a digital faceprint of each dog by measuring its facial features: the distance between the eyes, the width of the nose, the shape of the cheekbones and the length of the jawline. Thereafter, if we supply a new image of a dog, the ANN predicts its breed by comparing it with the existing dataset. It makes this prediction by assessing how close the geometry of the new faceprint is to the faceprints already on record.
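To make the matching step concrete, here is a minimal sketch in Python. It assumes the neural network has already reduced each image to a short faceprint vector; the breed labels and numbers are invented for illustration, and real systems use far higher-dimensional embeddings.

```python
import numpy as np

# Hypothetical enrolled dataset: label -> faceprint vector produced by the ANN
enrolled = {
    "labrador": np.array([0.12, 0.87, 0.33, 0.54]),
    "beagle":   np.array([0.71, 0.22, 0.48, 0.09]),
    "husky":    np.array([0.15, 0.80, 0.40, 0.60]),
}

def closest_match(query):
    """Return the enrolled label whose faceprint is geometrically
    closest to the query faceprint, together with the distance."""
    best_label, best_dist = None, float("inf")
    for label, faceprint in enrolled.items():
        dist = np.linalg.norm(query - faceprint)  # Euclidean distance in faceprint space
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

# A new image, already converted to a faceprint by the same network
new_faceprint = np.array([0.14, 0.83, 0.37, 0.57])
label, dist = closest_match(new_faceprint)
print(f"Predicted breed: {label} (distance {dist:.3f})")
```

The prediction is only ever "the closest face already in the dataset", which is why the composition of that dataset matters so much.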

FRT is thus a delicately woven mechanism, but it has a structural flaw: the datasets are humanly designed. If historical criminal data is fed into the algorithm, its decision-making abilities are altered, inducing a preliminary bias based on prevalent societal structures.
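A toy simulation, with entirely invented figures, illustrates the point: if one group has historically been policed more intensively, any risk score trained on the resulting records inherits that skew even when the underlying behaviour of the two groups is identical.

```python
import numpy as np

rng = np.random.default_rng(0)
population = 10_000

# Two groups with the SAME underlying rate of the behaviour being policed...
true_rate = {"group_a": 0.05, "group_b": 0.05}
# ...but group_b has historically been policed twice as intensively,
# so its incidents were twice as likely to end up in the records.
recording_rate = {"group_a": 0.5, "group_b": 1.0}

for group in true_rate:
    incidents = rng.random(population) < true_rate[group]
    recorded = incidents & (rng.random(population) < recording_rate[group])
    # A "risk score" learned from these records reflects the policing skew,
    # not the underlying behaviour.
    print(f"{group}: true incidents = {incidents.sum()}, "
          f"recorded (training data) = {recorded.sum()}")
```

Group B ends up with roughly twice as many recorded incidents, so an algorithm trained on those records will treat it as twice as "risky" despite identical behaviour.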

INDIRECT DISCRIMINATION

In America, the bias has been racial and gendered. The US Department of Justice has noted that Black Americans are roughly twice as likely to be arrested as white Americans. The American Civil Liberties Union revealed that Rekognition incorrectly matched 28 members of the US Congress to criminal mugshots, and the false matches disproportionately involved people of colour. Further, FRT algorithms have performed worst on dark-skinned women: a lighter-skinned woman was about 60% more likely to be accurately identified than her darker-skinned counterpart.

Parallels can be drawn with the subconsciously discriminatory criminal justice system in India. The Habitual Offenders Act, 1952, pre-emptively identifies 237 castes as criminal-by-birth. Death sentences have been disproportionately meted out to religious minorities and marginalised communities: more than 70% of those condemned to death belong to these groups.

Recently, Muslims constituted the largest group of dissidents during the protests against the amendment to India's citizenship law. The Uttar Pradesh police consequently used FRT to identify and arrest more than 1,100 protesters, the largest demographic among them being Muslim. The algorithm infused into FRT carries societal implants of discriminatory treatment. This feeds into the notion of indirect discrimination, wherein outwardly neutral policies continue to have a disproportionately negative impact on specific communities.


In Madhu Kanwar v. Northern Railway, the Delhi High Court affirmed that indirect discrimination violates the prohibition on discrimination under Article 15. The Supreme Court's jurisprudence in Sabarimala, Joseph Shine and Navtej Johar extended the scope of Article 15(1) to include "institutional and systemic discrimination". Gautam Bhatia has noted that such a reading affirms the recognition of indirect discrimination in the text of the Indian Constitution. Thus, FRT can violate the archetypal equality guarantee of Indian law.

LEGISLATIVE VACUUM

In the landmark Puttaswamy judgment, the court laid down the "proportionality test" that any state invasion of privacy must satisfy to be valid. First and foremost, the state action must be sanctioned by law. India does not have objective legislation for FRT.

The closest available mechanism, the Personal Data Protection Bill (PDP), 2019, is still pending in Parliament. Hypothetically, if the PDP were law, FRT would violate it. Under Section 11 of the PDP, the data principal's consent is necessary at the commencement of the processing of personal data.


Further, under Section 11(2)(c), the consent has to be specific: the data principal should be able to "determine the purpose of processing". The use of FRT lacks such informed consent. Even if individuals want to resist participating in the identification process, FRT's use of cameras in public spaces leaves them no practical way to opt out.

In 2017, Google introduced "image recognition" into its AI-based app, Lens. The objective was to use augmented reality to furnish real-time information about public places like restaurants and bars. However, Google could not control the pictures and objects that people captured and scanned. For instance, a third person gathering information about a restaurant could capture any individual having a meal there. This data travelled to Google's cloud, increasing the risk of a privacy breach. FRT is similarly built on an image-recognition mechanism. The lack of control over what gets captured, and over who stores it, is a legitimate concern, and the government is evading it by not enacting legislation.

UNCONSTITUTIONAL CONDITION 

In Puttaswamy, the court ruled that "individual anonymity" is a facet of privacy. According to it, anonymity is not surrendered merely because the individual is in a public space. FRT can violate an individual's anonymity in a two-fold manner. Firstly, the gaze of FRT-based cameras can "trace and identify" an individual. Secondly, its algorithm can link this information to other facets of the individual's anonymous data: FRT may trace an individual's visit to a doctor, for instance, and that trace can then be linked to their previous medical records. This two-fold process leads to the construction of an "activity-based identity" of citizens, violating their individual anonymity.

The degree of invasiveness increases when monitoring is performed by the government and its entities. The creation of a panopticon by the State gives way to self-surveillance and self-censorship. An individual might deviate from his preferred path upon seeing a surveillance camera, or may not exercise his constitutional right to protest for fear of identification. This manufactures a social norm of controlling citizens and eventually creates a chilling effect on free speech.

Further, the use of FRT also imposes an "unconstitutional condition". In Ahmedabad St Xavier's College v. State of Gujarat, the apex court ruled that "any stipulation imposed upon the grant of a governmental privilege which requires the recipient of the privilege to relinquish some constitutional right would be an unconstitutional condition".

In Uttar Pradesh, the use of FRT is being fronted on the grounds of "women's security". Women are being indirectly coerced to forgo their right to privacy and free speech in order to avail themselves of the security of the state. In the Kerala Education Bill case, the court had ruled that citizens cannot be forced to make a compelling choice between a necessity and a waiver of fundamental rights. Thus, the use of FRT for women's security, at the altar of the right to privacy and free speech, would constitute an "illusion of choice and waiver".

STRUCTURAL INCONGRUITY 

FRT is another thorn in the side of the effort to balance law and technology. Its fundamental premise, the algorithm, is riddled with structural incongruity because of dominant social inequality and historical injustice. This has a considerable impact on the manner of its use in the Indian context, which has traditionally been unequal.


This is further exacerbated by a legislative vacuum concerning FRT and data protection in India, which gives the State free rein in its use of the technology. The State can make inroads into individual anonymity and perpetuate the creation of a panopticon.

In his 1996 Declaration of the Independence of Cyberspace, John Perry Barlow told governments: "You have neither solicited nor received our consent. We did not invite you, nor do you know our world." Technological advancements, then, have to be confined within the four walls of citizen consent, and the use of FRT is an affront to such consent.

(Muazzam Nasir and Ashish Kumar are students at Hidayatullah National Law University, Raipur. The views expressed are personal.)