Deploying DigiYatra in the current legal landscape, where no comprehensive data protection law exists, risks violating users' fundamental rights. As long as there are no concrete limits protecting those rights, it may strengthen the already substantial surveillance capacity of the State.
---
As the COVID-19 pandemic subsides, the world is back to travelling, and air travel has surged worldwide. Recently, several Indian airports witnessed massive overcrowding, which compelled the Union Ministry of Civil Aviation (MCA) to explore options to mitigate the problem.
The Facial Recognition Technology (FRT)-based DigiYatra platform, which promises 'hassle-free' check-in at airports, has been proposed as a potential solution to overcrowding. However, it has the potential to violate fundamental rights and, in the current legal landscape, may become another tool of mass surveillance.
DigiYatra is a free mobile application that passengers can download. Once installed, the application generates a unique DigiYatra ID upon linking the user's Unique Identification number (Aadhaar) with a selfie. The passenger can then submit this ID while buying a ticket, upon which the concerned airport receives the passenger's data.
To check in at the airport, instead of standing in a queue, the passenger scans a QR code at the e-gate. The FRT system then matches the passenger's face against the enrolled data, and on successful authentication the passenger is allowed to proceed. In essence, DigiYatra seeks to replace the staff who verify passengers' identities.
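To make the flow concrete, here is a minimal sketch, in Python, of how an enrolment-and-match pipeline of this kind generally works. The identifiers, embedding vectors and matching threshold below are illustrative assumptions, not details of the actual DigiYatra system, whose internal architecture has not been published at this level.

```python
# Illustrative sketch only: a toy model of the enrolment / e-gate flow described
# above. All names, thresholds and data structures are assumptions made for
# explanation; they do not reflect the real DigiYatra implementation.
import math
import uuid
from dataclasses import dataclass


@dataclass
class Enrolment:
    digiyatra_id: str
    face_embedding: list[float]  # vector produced by some face-recognition model


def enrol(aadhaar_number: str, selfie_embedding: list[float]) -> Enrolment:
    """Create a unique ID after the user links an Aadhaar number and a selfie."""
    # In a real system the Aadhaar number would be verified upstream;
    # here we only generate an opaque identifier.
    return Enrolment(digiyatra_id=str(uuid.uuid4()), face_embedding=selfie_embedding)


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def egate_check(enrolment: Enrolment, live_embedding: list[float],
                threshold: float = 0.9) -> bool:
    """Match the live camera capture against the enrolled selfie embedding."""
    return cosine_similarity(enrolment.face_embedding, live_embedding) >= threshold


# Example: a passenger enrols, then presents themselves at the e-gate.
record = enrol("XXXX-XXXX-XXXX", [0.12, 0.85, 0.51])
print(egate_check(record, [0.11, 0.86, 0.50]))  # True  -> gate opens
print(egate_check(record, [0.90, 0.10, 0.05]))  # False -> manual verification
```

The sketch also makes the article's concern visible: the enrolment record, an identifier tied to a biometric template, has to be stored somewhere, and everything that follows in this piece turns on who holds it, for how long, and under what rules.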
DigiYatra may lead to a gross violation of the fundamental right to privacy. It is based on FRT, which means it cannot work without data that can identify a person. Once a person's facial features are recorded in the system, there is no way to know who would have access to that data. It is claimed that the collected data will be deleted within 24 hours of the passenger's departure, but who will verify that the deletion actually takes place?
DigiYatra may use Aadhaar data for identity verification, and given that Aadhaar metadata could be used by the government for surveillance, the problems will only multiply with the inclusion of facial data.
It would not be surprising if the collected data were sold or used for some other purpose without the consent of the person it belongs to. Data holds significant value in the market, and a data breach or sale has obvious repercussions.
India's experience with data security has not been good, as there have been massive data breaches in the past few years, such as those involving COVID certificates and banking details, for which no one has been held accountable.
Who would be liable in the case of a data breach involving DigiYatra? Will the Union Government be accountable? The MCA has already stated that the DigiYatra project doesn't come under the purview of the Right to Information Act, since it is managed by the Digi Yatra Foundation, which is a not-for-profit company under Section 8 of the Companies Act, 2013.
DigiYatra has promising potential, but the risks to users are significant. India currently has no data protection law in force. The latest personal data protection bill floated by the Union Government, the Digital Personal Data Protection Bill, 2022 (DPDP), arbitrarily grants the Union Government sweeping powers to gather and use an individual's digital personal data without their consent in the "public interest" (Clause 18).
The DPDP also provides that data can be used for vaguely defined purposes such as "public interest" and "any fair and reasonable purpose" without the consent of the person to whom the data belongs, since people would be "deemed" to have consented (Clause 8). If the Bill is enacted, this provision would make it easy and "hassle-free" for the Digi Yatra Foundation to pass data to government entities, which could then use it for purposes other than flight check-ins.
Adding to these difficulties, under the Information Technology Act, 2000 and the rules framed under it, government agencies may collect information without obtaining the consent of the person it pertains to in order to further certain objectives.
The NITI Aayog's discussion paper on responsible Artificial Intelligence, published in November last year, states that DigiYatra was developed by a private entity (referred to as the "vendor" or "developer") and would later be deployed by another entity, public or private. The developer and the deployer would function independently, each accompanied by ethics and expert committees, with periodic independent audits intended to act as a safeguard. The paper also mentions a Grievance Redressal Committee that would handle people's complaints in case of any issue. However, without effective legislation in place, such mechanisms may not serve their stated purpose.
The question is: who would be held accountable in the event of a complaint, what would the remedy be, and how independent would the decisions be?
Problems with a new technology tend to surface only once it is in use, and DigiYatra is no different. It has been argued that the government already has everyone's photographs and details in databases relating to Aadhaar and passports, so users need not worry about privacy risks. This is a specious argument, because it amounts to letting the government take over individual autonomy entirely, which it already does in part. Aadhaar, too, began because people believed the government would use it for their good, and the rest is history.
There must be proper legislation covering all aspects of FRT, from the definition of each term relating to FRT and DigiYatra to the fixing of liability of individuals and authorities. The Digi Yatra Foundation must be constituted as an independent body, and its work must be transparent, with everything, including grievances against instances of data breach, available in the public domain and amenable to the right to information.
Another problem is that DigiYatra could still compromise passengers' privacy, even though the collected information is said to be deleted within 24 hours of the flight's departure. The framework should clearly explain how the data will be deleted and how passengers will know that their data has been removed from the central ecosystem. It should also provide for strict compensation in the event that data is retained without permission.
FRT is in itself prone to discrimination and bias, and no clarification has been released to assure users that DigiYatra would guard against this. FRT is at a nascent stage and is surrounded by controversies and operational glitches, even in the most technologically advanced countries.
More generally, DigiYatra must be deployed responsibly: all design and rights-based risks should be mitigated, and clear principles established for dealing with data breaches and with the use of data for purposes other than those stated.
As long as there are no clear rules to protect users' rights, DigiYatra could be the first step towards further strengthening the existing State surveillance apparatus. Much remains to be tested and examined before it is put into place. It can directly attack individuality, which is at the core of a democratic society.