Why India needs a robust content deletion procedure to curb revenge pornography

Indian laws for the prevention and deletion of revenge pornography fall short of the mark, despite the implementation of the new Information Technology Rules, 2021 and the amendment thereof. The author makes a case for reforming the procedure by analysing the existing framework and encouraging proactive content deletion rather than bolting the stable door after the horse has fled.

IN 2020, a 16-year-old girl in Gujarat committed suicide after her intimate video was leaked online. In another incident two years later, the police arrested a man in Tamil Nadu for posting nude images of a girl online. 

Though these incidents vary in time and place, three aspects connect them. First, the images were posted or leaked by the partners of these women; second, the images were taken with the consent of the women when they were in a relationship with these men; and third, they were leaked with the motive of seeking revenge.

Such incidents often constitute the offence of revenge pornography, a sub-type of non-consensual intimate imagery (NCII). The term is used to describe "an intimate image or video that is initially shared within the context of a private relationship but is later publicly disclosed, usually on the internet, without the consent of the individual featured in the explicit graphic."

What the government and courts have done in the last two years

In the last two years, there have been concerted efforts by the government and the judiciary to provide an efficient procedure to tackle the issue of revenge pornography. On February 25, 2021, the Ministry of Electronics and Information Technology notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021) under the powers conferred on it by the Information Technology (IT) Act, 2000.

The Rules essentially lay down the due diligence requirements to be complied with by intermediaries and the procedure to be followed by a victim of revenge pornography who wishes to get content deleted from a social media platform. The IT Rules, 2021 have further been amended by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022 (2022 Amendment Rules).

The procedure laid down by the Rules has two notable features. First, it is simple to understand and follow. The Rules clearly identify the person to be contacted in case of a crime by requiring intermediaries to appoint a 'grievance officer' in India.

Further, Rule 3(2)(a) of the IT Rules, 2021 also requires an intermediary to publish the name of the grievance officer, their contact details as well as the procedure by which the complaint can be made to such grievance officer.

Platforms like Instagram, Facebook and Google have complied with this Rule by providing a web form on their websites through which a complaint can be made. These forms usually present options in a checklist, and a victim of revenge pornography is likely to tick options such as "I want to report content showing me in nudity/partial nudity or in a sexual act."

Second, the Rules provide a timeframe to address the complaint. The timeframe performs a dual function: one, it does not leave the victim at the mercy of the intermediary in anticipation of a response; and two, it is a step towards ensuring the accountability of the grievance officer in dealing with complaints. The broad timeframe provided under the Rules is:

Acknowledgement of the complaint by the grievance officer: within 24 hours.
Removal of content, if a prima facie case has been made out: within 24 hours.
Resolution of the issue: as expeditiously as possible, and within 72 hours.

Besides the governmental efforts, courts in India have been involved in ensuring the implementation of these Rules. A recent notable example is the set of directions and recommendations issued by the Delhi High Court in Mrs. X versus Union of India and Ors. in April 2023.

In that case, the court, inter alia, ordered the sensitisation of grievance officers, a liberal interpretation of NCII to include sexual content obtained without consent and in violation of an individual's privacy, the provision of a status tracker for the complainant on the online cybercrime reporting portal, prominent display of the reporting mechanism and strict compliance with the timeframe under the Rules. The court also suggested the creation of a fully functional helpline to report NCII content.

Need for reform

It is trite to state that the implementation of any legislation rests on the foundation of a strong legal framework. A peculiar aspect of crimes like revenge pornography is that the images are often captured with the consent of the victim and within the confines of a private relationship, and later shared without consent.

This consensual capturing of images often puts victims at an increased risk of victim-blaming, secondary victimisation and societal humiliation. While the IT Rules, 2021 might be an achievement in terms of simplicity and promptness, they have failed to address the aspects of accessibility, accountability and transparency, which are equally important for an efficient mechanism. Though the 2022 Amendment Rules have partially addressed the missing aspects of accessibility and accountability, they still fall short.

Lack of independent oversight mechanism

Under Rule 4(1)(a) of the IT Rules, 2021, a 'chief compliance officer', who is a "key managerial personnel or a senior employee of the intermediary", is appointed by a significant social media intermediary itself to ensure compliance with the Act and the Rules.

Further, the 2022 Amendment Rules provide for the establishment of a ‘grievance appellate committee’ (GAC) by inserting Rule 3A. Any person aggrieved by an Order of a grievance officer may prefer an appeal to the GAC. While adopting an online dispute resolution mechanism, the GAC shall endeavour to resolve appeals within 30 days. The GAC was set up in February 2023 and has been described as, “a faceless dispute resolution mechanism that makes digital platforms accountable to the digital nagriks (citizens).”

Though the criminalisation of revenge porn has been extensively dealt with in other jurisdictions, New Zealand's Harmful Digital Communications Act, 2015 (HDCA, 2015) serves as a prototype for any law aiming to deal with the procedural aspects of content deletion. This law is not only simple and clear, but also incorporates provisions to ensure the accountability of stakeholders and the appropriateness of the procedure.

If one were to compare the oversight mechanism under the Indian law with the HDCA, some significant differences are evident. First, under the New Zealand Act, the approved agency which performs the functions of receiving, assessing and investigating complaints concerning harmful digital communications can be any person, organisation or department.

The approved agency also maintains a relationship with the online host to achieve the purposes of the Act. Second, the approved agency can lodge a complaint with an online host about specific content on behalf of the complainant and assist the complainant with its resolution.

The significance of having an independent organisation to deal with the complaints or to ensure compliance with legal norms by the intermediaries cannot be overemphasised. A more stringent accountability regime is a prerequisite for an efficient system. 

Further, the provision of an appellate authority, though significant, is not sufficient, given that in crimes like revenge pornography continuing supervision is more efficient. In case of a refusal by a grievance officer to delete the content, the 30-day period for the disposal of an appeal provided by the Rules can have a detrimental impact on the rights and interests of victims, given the speed at which data travels on the web.

This issue of the speed at which data spreads was also taken into consideration by the Orissa High Court in Subhranshu Rout alias Gugul versus State of Odisha, while dealing with a complaint of revenge pornography. The court made an interesting comparison: "[I]nformation in the public domain is like toothpaste, once it is out of the tube one can’t get it back in and once the information is in the public domain it will never go away."

Ambiguity in defining ‘accessibility’ and ‘appropriateness’ 

Rule 4(6) of the IT Rules, 2021 provides that a significant social media intermediary needs to implement an appropriate mechanism for the receipt of a complaint which shall enable the complainant to track the status of the complaint.

Similarly, the 2022 Amendment Rules provide that the "intermediary shall take all reasonable measures to ensure accessibility of its users with reasonable expectation of due diligence, privacy and transparency." However, the Rules fail to define both 'appropriateness' and 'accessibility', which introduces a notion of subjectivity in deciphering these terms.

Under the existing mechanism, for instance, Google, in its web form, requires the complainant to “provide the uniform resource locators (URLs), explain in detail why the content on these URLs is unlawful by citing specific provisions of the law, quote the exact content from each URL, describe the picture/video which is unlawful, provide instructions on how to locate content on the URL and steps to reproduce the violation.”

Upon a cursory perusal, it is evident that these requirements demand of a complainant not only technical and digital expertise but also legal knowledge.

The urban–rural divide and gender gap significantly affect digital literacy in India. For instance, only 8.5 percent of rural women and 30.1 percent of urban women were able to use the internet as of 2019. Reports suggest that India has the widest gender gap in digital usage in the Asia-Pacific region.

A crime like revenge pornography, which can affect the fundamental rights of an individual, cannot be circumscribed by technicalities. To curb this subjectivity, it is pertinent that these terms be clearly defined, as the take-up of procedural rights by victims depends on the accessibility of the mechanism and the trust that victims repose in the system.

It is important to note that it is not entirely unconventional for the government to amend the Rules to increase accessibility, as is evident from the amendment of Rule 3(1)(a) of the IT Rules, 2021 made in 2022.

This Rule requires the intermediary to publish, on its website, mobile applications or both, its rules and regulations, privacy policy and user agreement. The 2022 Amendment Rules clarified that the same is to be done in English or any other language specified in the Eighth Schedule to the Indian Constitution.

Allowing publication of the policies and agreements in vernacular languages increases the accessibility of the system as well as user awareness. Thus, amendments must be introduced to the existing procedure, laying down the bare minimum principles which ensure the accessibility of the mechanism as well as the appropriateness of the procedure, keeping in view all segments of the population.

Here, again, a stark contrast with Section 25(2) of New Zealand's HDCA, 2015 is evident. The Section provides that the protection from any criminal or civil liability for the content posted, conferred on an online content host, is inapplicable if the host does not provide an easily accessible mechanism that "enables a user to contact the host about specific content in the manner provided in that Section."

This provision essentially highlights the larger aim against which accessibility can be tested, i.e., whether the means adopted by an intermediary actually enable an individual to contact the intermediary. Incorporating such a provision allows for a liberal interpretation of what constitutes accessibility, instead of a watertight list of components.

Thus, when evaluated against these parameters, a web form which requires technical as well as legal knowledge as a prerequisite is likely to fail, since it does not allow all segments of the population to contact the intermediary on an equal footing.

Shift towards proactive content deletion required

Proactive content deletion (PCD) denotes efforts that use technology both to help when a person suspects they might become a victim of revenge pornography and to curb the multiplication of links relating to the same incident.

StopNCII.org is one such initiative, launched by the UK's revenge pornography helpline with industry partners including Facebook and Instagram. It uses an algorithm in which a unique hash value is assigned to each image or video. This hash is then shared with the participating companies, which block the content on their own.
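The sketch below illustrates, in broad strokes, how such hash-based matching could work. It is an assumption-laden simplification: a plain SHA-256 digest is used here only to keep the example self-contained, whereas systems like StopNCII are understood to generate hashes on the victim's own device (so the image never leaves it) and to rely on perceptual rather than cryptographic hashing so that resized or re-encoded copies still match. All function names and data are illustrative, not the actual StopNCII implementation.

```python
import hashlib


def hash_media(data: bytes) -> str:
    """Compute a SHA-256 digest of the raw bytes of an image or video.

    Illustrative only: a cryptographic hash matches exact copies, whereas
    real NCII-matching systems are understood to use perceptual hashes
    that tolerate resizing and re-encoding.
    """
    return hashlib.sha256(data).hexdigest()


def should_block(upload: bytes, shared_hashes: set[str]) -> bool:
    """Screen a new upload against the hash list shared across platforms."""
    return hash_media(upload) in shared_hashes


# Hypothetical flow: the image is hashed on the victim's device, only the
# hash is submitted to the clearing house, and each participating platform
# checks new uploads against the shared list before publishing them.
victim_image = b"raw bytes of the intimate image (never uploaded)"
shared_hashes = {hash_media(victim_image)}

print(should_block(b"raw bytes of the intimate image (never uploaded)", shared_hashes))  # True: block
print(should_block(b"some unrelated photo", shared_hashes))                              # False: allow
```

The design choice worth noting is that only the hash, not the image itself, crosses the network, which is what allows a victim to act pre-emptively without further disclosing the content.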

The Delhi High Court, in the Mrs. X case, has also suggested incorporating PCD. The court recommended a similar approach wherein a unique hash value is assigned to prevent the resurfacing of data, and went on to observe that "the search engine cannot insist on requiring the specific URLs from the victim for the content that has already been ordered to take down."

As a long-term measure, the court also suggested that a trusted third-party encrypted platform be developed by the government for registering links of NCII content, so that a cryptographic hash may be assigned and the content automatically identified and removed.
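A bare-bones sketch of what such a registry might look like appears below, under the assumption that it simply records the hashes of reported content and lets hosts check both new uploads and already-hosted material against them. The class and method names are hypothetical; a real system of the kind the court envisages would add encryption, authentication and perceptual matching.

```python
import hashlib


class NCIIHashRegistry:
    """Hypothetical trusted third-party registry of hashes of reported NCII content."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register(self, content: bytes) -> str:
        """Assign a cryptographic hash to reported content and record it."""
        digest = hashlib.sha256(content).hexdigest()
        self._hashes.add(digest)
        return digest

    def is_registered(self, content: bytes) -> bool:
        """Let a host check a new upload, or periodically rescan hosted content."""
        return hashlib.sha256(content).hexdigest() in self._hashes


# Hypothetical usage by a content host: resurfaced copies of registered
# content are identified automatically, without the victim supplying URLs.
registry = NCIIHashRegistry()
registry.register(b"bytes of the reported image")

hosted_items = [b"bytes of the reported image", b"bytes of an unrelated image"]
to_remove = [item for item in hosted_items if registry.is_registered(item)]
print(len(to_remove))  # 1: the resurfaced copy is flagged for removal
```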

Proactive content deletion denotes a shift towards a victim-centric approach wherein the humiliation, trauma and secondary victimisation faced by the victim due to the resurfacing of the same data are reduced to an extent. Doing so helps preserve the dignity of the victim and ensures that the victim is not repeatedly deprived of their fundamental rights.

Two overarching requirements

Two common points connect both post-facto content deletion, which is currently followed in India, and proactive content deletion. First, the need for awareness and empowerment of the victim. Awareness and empowerment ensure that the victim understands their procedural rights. They also determine a victim's ability to choose an alternative to traditional police-based recourse by approaching the intermediary, as well as the stage at which the victim decides to seek help, i.e., before or after the occurrence of the crime.

Second, the sensitivity and trustworthiness of the intermediary and other officials. These aid in the formation of a judgement in the mind of the victim regarding the attitude of officials involved in tackling the issue and whether any helpful response is likely. Most importantly, they also determine whether the victim reports the crime or decides to deal with the issue at an individual level.

Finally, a golden thread connecting these two points is the need to stop victim-blaming in cases of revenge pornography, which not only leads to under-reporting of these crimes but also reduces the accountability of the officials concerned.