Data on the adoption of FRTs by different agencies in India without any effective safeguards in place is alarming from the standpoint of an individual's right to privacy and anonymity.
On May 19, 2021, S.Q. Masood was riding through the lanes of Shahran, Hyderabad, heading home with his father-in-law, when police officials stopped him. The pair was asked to step aside and remove their masks, in the midst of the brutal second wave of the COVID-19 pandemic, so that the police could take their pictures. Masood refused to remove his mask, but his picture was taken regardless. Many other riders were pulled over and photographed in the same manner. When Masood asked why, the police murmured amongst themselves and photographed his vehicle's licence plate, without answering him.
That day, Masood received no ticket or communication explaining why he had been pulled over. He filed applications under the Right to Information Act and sent a legal notice to the Police Commissioner of Hyderabad, enquiring about the reason for the pictures, the law empowering the police to take them, how they would be stored and who would have access to them, and the safeguards and accountability mechanisms in place, but to no avail: he received no explanation.
"Being Muslim and having worked with minority groups that are frequently targeted by the police, I am concerned that my photo could be matched wrongly and that I could be harassed," Masood, told the Thomson Reuters Foundation. Further looking into this, Masood found out that the Telangana government had been deploying Facial Recognition Technology [FRT] as a pilot exercise for India.
Last year, the High Court of Telangana issued a notice to the state government on a public interest litigation [PIL] filed by Masood, which contested the use of FRT as an invasive form of surveillance in the absence of an enabling law. A bench of Chief Justice Satish Chandra Sharma and Justice Abhinand Kumar Shavili sought responses from the Telangana government and police, and set the matter for further hearing. The government has not responded yet.
As per the digital rights advocacy organisation Internet Freedom Foundation, Telangana hosts eight of the 113 FRT systems deployed in India, the third highest number among all Indian states and union territories, and Hyderabad ranked as the 12th most surveilled city in the world in 2021. The entire state is "the most surveilled place in the world", according to research conducted by the international human rights NGO Amnesty International.
The Telangana government has publicly stated that it has installed over five lakh cameras in Hyderabad, and that it aims to increase this number to ten lakh. These include FRT-enabled CCTV cameras used for real-time surveillance.
The photographs captured are fed into an app called TSCOP, created for the Telangana State Police in 2018. The app scans each photograph, in real time, against a comprehensive data repository maintained under the Crime and Criminal Tracking Network and Systems (CCTNS), which links every police station's database and contains millions of images of known and arrested offenders, and wanted and missing persons, maintained by the Union Ministry of Home Affairs. This surveillance activity is not backed by any law empowering the police to undertake wide-scale surveillance without a warrant or the consent of the individual surveilled.
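Neither TSCOP's matching algorithm nor its thresholds are public. Purely as an illustration of how such one-to-many face matching generally works, the sketch below reduces a captured face to a numerical "embedding" and compares it against every enrolled record, flagging anything above a similarity threshold as a match. The embedding size, threshold, and random vectors are hypothetical stand-ins for a real face-embedding model and database.

```python
# Minimal sketch of one-to-many face matching, the general technique behind
# police FRT systems. This does NOT reflect TSCOP's actual (non-public)
# implementation: the dimensionality, threshold, and random "embeddings"
# are placeholders for a real neural face-embedding model and database.
import numpy as np

rng = np.random.default_rng(0)
DIM = 512            # typical face-embedding dimensionality (assumption)
THRESHOLD = 0.6      # hypothetical cosine-similarity cut-off

def unit(v: np.ndarray) -> np.ndarray:
    """Normalise vectors to unit length so a dot product equals cosine similarity."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Gallery: embeddings of enrolled faces (in reality, millions of database images).
gallery = unit(rng.normal(size=(100_000, DIM)).astype(np.float32))

# Probe: the embedding of one face photographed on the street.
probe = unit(rng.normal(size=DIM).astype(np.float32))

scores = gallery @ probe          # similarity against every enrolled record
best = int(np.argmax(scores))
if scores[best] >= THRESHOLD:
    print(f"flagged as record {best} (score {scores[best]:.2f})")
else:
    print("no match above threshold")
```

The privacy concern flagged in the petition stems from exactly this structure: every face captured is compared against every record in the repository, which is what makes the surveillance bulk rather than targeted.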
Masood has challenged Telangana's wide use of surveillance systems without appropriate regulation and safeguards in place. With the help of the Internet Freedom Foundation, he filed the PIL seeking a declaration that the use of FRTs in Telangana is unconstitutional.
It was argued in court that FRT restricts the right to privacy because it uses biometric data to surveil the activities of individuals, when, as the Supreme Court held in its landmark judgment in K.S. Puttaswamy vs. Union of India & Ors. (2017), the government cannot restrict that right unless the restriction is based on law, and is necessary and proportionate. In Puttaswamy, the Supreme Court also held that privacy is not surrendered merely because the individual is in a public space.
"The continued use of Facial Recognition Technology by the police violates the privacy of individuals which was upheld by the Supreme Court's Aadhaar judgment. The use of such technology without any authorisation from the law should be declared unconstitutional and illegal," argued advocate Manoj Reddy, who appeared on behalf of Masood.
"Whether the use of FRT in a given instance is proportionate or not is of course contingent on various factors, such as the extent and breadth of its use, the manner in which it is implemented. For example, how it collects data, what the information collected is used for, who within law enforcement has access to such information and under what circumstances, etc.)", discusses Siddharth Sonkar, a technology lawyer and author of 'What Privacy Means: Why It Matters and How We Can Protect It'.
The petition also challenges the use of FRT on the ground that the Telangana police retain discretionary powers over its deployment. Reddy cited the judgment of the Court of Appeal of England and Wales in Ed Bridges vs. South Wales Police (2020), in which the court held the police's deployment of FRT unlawful because too much discretion was left to individual officers over whom to target and where to deploy it.
Additionally, the petition contends that the deployment of FRTs amounts to mass surveillance without any legal basis, let alone probable or reasonable cause, infringing the fundamental rights of the residents of Telangana. The government has also put no guidelines in place to prevent arbitrary use of the technology.
The Leaflet spoke with Pranesh Prakash, co-founder of the Centre for Internet and Society, a non-profit organisation that undertakes interdisciplinary research on the internet and digital technologies. Prakash says: "Bulk or mass surveillance and indiscriminate usage of facial recognition technologies violate the right to privacy that is inherent in every one of us. For surveillance to be legal, it has to be targeted. FRT can help engage in targeted surveillance rather than bulk surveillance. However, FRT uses a person's face as a means of biometric identification, and hence could be highly intrusive into a person's privacy. Thus, for lawful usage of FRT, as with any technology that has privacy implications, the government has to use it in accordance with the restrictions imposed upon it by the Constitution."
He adds: "There is nothing inherent in FRT as a technology that makes it pass or not pass the tests of necessity and proportionality as laid out in the K.S. Puttaswamy judgement. Whether FRT is necessary and proportionate or not in any particular situation depends on the circumstances, how it is being used, and what safeguards exist for its use.
As an example: if a visually impaired person were to use FRT that mimicked the human ability to recognize limited numbers of faces of those people a person had encountered (and to forget faces as well), then I don't see why such use of FRT, which mirrors the ability of the average human, is not 'proportionate'."
In Masood's case, it was also argued that the deployment of FRT violates the right to equality guaranteed under Article 14 of the Constitution. "FRT is not 100 per cent accurate anywhere in the world, and this lack of accuracy is also associated with bias against marginalised communities. The inherent bias in the technology violates Article 14," argued Reddy.
Prakash differs on this. "Let me note that the petitioner's argument on Article 14 is weak, as FRT being 'inaccurate' is not in itself violative of Article 14, nor is FRT inherently biased against marginalised communities. While a particular deployment of FRT may be biased, the technology itself cannot be said to be biased."
As Hyderabad-based cybersecurity researcher Srinivas Kodali observed in a podcast by the India Forum, in the U.S.A. the National Institute of Standards and Technology [NIST], which evaluates technology systems, has been testing FRTs as well. The Institute, among others, was able to observe the unfavourable biases against minority communities in the use of such FRTs. In India, however, no data is available on the workings of the algorithms behind these FRTs. It has been an opaque system, with no research on the accuracy of the technology; to measure the presence of any large-scale biases, we need to know how these algorithms work.
Data is available on a few test runs in Telangana where FRT was deployed for voter verification during municipal elections. While no one who tried to vote was excluded after an error caused by inaccurate FRT, the accuracy in those experiments was merely 60-70 per cent. If this technology were actively deployed around the state as the sole means of verification, roughly 30 per cent of the population could be wrongly excluded.
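To see what such an error rate implies at scale, consider a back-of-the-envelope calculation. It assumes, purely hypothetically, that the reported accuracy is a uniform true-accept rate across all voters and that FRT is the only check; the voter figure below is illustrative, not official data.

```python
# Back-of-the-envelope: wrongful exclusions if FRT were the sole voter check.
# Assumes (hypothetically) a uniform true-accept rate equal to the reported
# accuracy; real error rates vary with lighting, cameras and demographics.
registered_voters = 10_000_000        # illustrative figure only

for accuracy in (0.60, 0.70):
    wrongly_rejected = registered_voters * (1 - accuracy)
    print(f"accuracy {accuracy:.0%}: ~{wrongly_rejected:,.0f} voters wrongly rejected")
```

Even at the optimistic end of the reported range, millions of legitimate voters would be turned away.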
"While it is true that there is a possibility that FRTs may be accurate, it is important to bear in mind that our fundamental rights, such as the right to privacy are not probabilistic. To the extent that probabilities suggest FRTs result in arbitrary or incorrect decisions inter alia due to inherent biases, it is ideal to assess beforehand whether such systems are fair, and provide sufficient information about the nature of FRT implemented, nature of information being collected, and the purposes for which it is used.
Until a facial recognition system is adequately assessed for fairness, it is difficult to rule out the possibility of bias. Decisions/determinations made by FRTs, particularly when used by the State, should be susceptible to judicial review under Article 14", opines Sonkar.
Gender Shades, a study conducted by researcher Joy Buolamwini of the Massachusetts Institute of Technology's Media Lab with Timnit Gebru, then at Microsoft Research, analysed three commercial facial recognition systems and revealed near-perfect accuracy in determining the gender of light-skinned men. The error rate, however, rose to 34 per cent for dark-skinned women on datasets from 2017.
While FRTs have improved dramatically in recent years, NIST's Face Recognition Vendor Test found that when the subject is obscured by shadow or not looking at the camera, the error rate in matching pictures against mugshots shoots up to 9.3 per cent. However much FRT accuracy improves in controlled settings, on-the-ground deployment of even the best algorithms clearly shows higher error rates, owing to their sensitivity to external factors.
Telangana is not the only state using such technology. In the name of national security and crime-solving, the deployment and development of FRTs across Indian states has been growing, regardless of the invasive nature of these systems and their potential violation of fundamental rights. FRT is being utilised for everything from law enforcement to welfare distribution, and for seemingly harmless activities like boarding at various airports.
In March 2018, the Delhi Police acquired the Automated Facial Recognition System software as a tool to identify missing children by matching photographs. However, the Delhi Police went on to use facial recognition software to screen the crowd at Prime Minister Narendra Modi's Ramlila Maidan rally for "habitual protestors" and "identifiable faces", using footage filmed at the city's various protest events. Additionally, the Delhi Police is working on arming Police Control Room vans with remote FRTs.
The Delhi Police said in February last year that 137 of the 1,818 arrests made in connection with the Northeast Delhi riots of 2020 were made through the use of FRT, with 94 of the accused identified with the help of their driving licence photos.
The Punjab Police actively uses the Punjab Artificial Intelligence System, built on an ever-expanding database of over 1,00,000 criminal records maintained by the state police.
The Internet Freedom Foundation's Panoptic Project tracks and analyses every FRT project deployed in India. As per Panoptic, Maharashtra has the highest number of FRT systems in place among all states and union territories in India, with 11 FRT systems in the state alone. Coming in at second is Delhi, with 10 FRT systems.
The Union government is increasingly looking to deploy FRTs even at exam centres, with the Union Education Ministry admitting to the use of FRT by the Central Board of Secondary Education. While the Education Minister claimed in 2021 that no facial or biometric data was stored on the server, this fails to address the concerns over the technology's efficacy.
The National Crime Records Bureau is seeking to increase the capabilities of automated FRTs, and wants to build the world's biggest facial recognition system. The system is expected to integrate with the FRTs already deployed by the states, and would be able to recognise a person even with a face mask on.
The case of S.Q. Masood vs. the State of Telangana offers only a glimpse of the tip of the iceberg: the digitisation of India at the cost of individual anonymity.
A Personal Data Protection Bill is still under discussion in Parliament, even as plans for further surveillance of citizens are already being executed across India. The proposed Bill, however, does not even cover the issue of mass surveillance. In fact, according to Kodali, it is not even a privacy bill, since it primarily looks towards penalising the misuse of consumer data by companies. Section 35 of the Bill empowers the government to use data at its discretion for national security reasons.
Prakash observes that for a constitutionally valid regulation safeguarding the privacy of citizens, there has to be a publicly debated law that authorises the use of FRT and lays out, among other things, the conditions under which it may be deployed and the safeguards that must accompany it.
On what such a law must address, Sonkar says that while identifying and tracking criminals is often necessary for preventing and addressing crime, these systems need to be introduced with proper procedural and substantive safeguards, with due regard to the fundamental right to privacy, to ensure the right balance between the two competing interests. Striking this balance requires a legal framework that introduces accountability for arbitrary consequences arising out of the State's use of FRT.
There should also be specific safeguards, such as a requirement to obtain a warrant from a neutral magistrate before deciding to track someone's movements, limits on the duration for which a person can be surveilled, and specific controls on the designations of persons who may access information obtained through FRT.
In the U.S.A., a bill titled 'The Ethical Use of Facial Recognition Act' was introduced in early 2020, amid evidence that the technology had been used at protests and rallies, with a chilling effect on free speech. The bill aims to forbid the federal government from carrying out extensive biometric surveillance without express statutory permission, and to withhold certain federal public safety grants from state and local governments that conduct biometric monitoring.
Furthermore, after public outcry over the system's implementation, the Internal Revenue Service recently stated that it will no longer require facial recognition of taxpayers filing their taxes online.
In 2020, Amazon imposed a moratorium on the use of Rekognition, its FRT, by law enforcement, after having once defended the technology by arguing that one does not throw away the oven if the pizza gets burnt because of a wrong temperature setting.
The European Union, which is one of the largest CCTV markets in the world, has been debating whether or not its police should be allowed to use facial recognition in public spaces and had initially considered a five-year moratorium on its use.
In China, FRTs have been infamously used for mass surveillance systems. Under the guise of development and improvement, FRT systems were deployed as part of the social credit system and are nothing more than 'algorithmic obedience training', according to the American documentary film 'Coded Bias'. China's surveillance system captures up to 6.8 million records each day from cameras surrounding hotels, parks, tourist attractions, and mosques, according to a 2019 database leak.
The criminal database repository of the Facial Recognition System of the International Criminal Police Organization, commonly known as Interpol, contains facial images from over 179 countries along with a biometric software application that enables it to analyse proportions, patterns, shapes and contours of one's facial features.
The rise in the deployment of biometric mass surveillance is evident. Lawmakers have a duty to frame regulations that ensure these systems are built to protect the rights of the people, not to further infringe them by vesting the State with arbitrary and discretionary powers or by involving the private sector in their creation without oversight. Assessing how accuracy, and the lack of it, affects these hazards will be critical for legislators trying to develop safeguards for individuals while preserving the technology's potential advantages.
These systems must be deployed only when they respect an individual's right to privacy and anonymity. Such safeguards are critical because many of these technologies still do not function at very high accuracies, and even the best algorithms underperform in more difficult real-world scenarios. They must be designed in such a manner that they do not harm democracy, because the current deployment has the potential to create a society that is perpetually under mass surveillance.