IT Amendment 2023: Now government will fact-check citizens online

Stand-up comedian Kunal Kamra has challenged the new amendment, which directs social media intermediaries to censor or otherwise modify content at the government's direction, before the Bombay High Court, on the ground that the purpose of political satire would be defeated if it were to be scrutinised by the government and censored as fake, false or misleading.

What are the new IT Amendment Rules? 

On April 6, the Union Ministry of Electronics and Information Technology notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023, pursuant to which the Union government will establish a fact-checking unit for identifying ‘fake’, ‘false’ or ‘misleading’ content with respect to ‘any business’ of the Union government.

The fact-checking unit is likely to have four members: a representative from the Union Ministry of Information and Broadcasting, one from the Union Ministry of Statistics and Programme Implementation, a media expert and a legal expert. 

According to Union Minister of State for Electronics and Information Technology Rajeev Chandrasekhar, intermediaries may ignore the mandate of the fact-checking unit. If they decide to do so, a legal remedy can be pursued against them by the government.

A petition has been filed by stand-up comedian Kunal Kamra before the Bombay High Court challenging this amendment. 

On April 11, the high court’s division Bench of Justices G.S. Patel and Neela Gokhale directed the Union government to file a reply and disclose the factual background that necessitated the amendment. It also asked the government to explain why the amendment should not be stayed by the court.

The matter is now listed for hearing on April 21.

Also read: Over 29 thousand online pages blocked by the Union IT Ministry in the last five years

What does the petition contend?

Kamra’s petition challenges the amendment on the grounds of Articles 14, 19(1)(a) and 19(1)(g) of the Constitution of India. 

It claims that the amendment is ultra vires Section 79 (exemption from liability of intermediary in certain cases) of the Information Technology Act, 2000 (IT Act), as it deprives intermediaries of the safe harbour immunity under Section 79 on grounds beyond Article 19(2) of the Constitution. 

Intermediaries are online services that offer a neutral platform through which persons may interact with each other over the internet.

It has also been contended that the amendment militates against the directions of the Supreme Court in Shreya Singhal versus Union of India (2015).

Essentially, the amendment requires social media intermediaries to censor or otherwise modify content relating to the Union government, if the government-mandated fact-checking body directs them to do so. 

The petition states that the impugned amendment is manifestly arbitrary as it entails the Union government “acting as a judge and prosecutor in its own cause”, violating one of the fundamental principles of natural justice. 

The amendment may make material critical of the government particularly vulnerable to being flagged as “misleading” by the government-sanctioned fact-check unit. Moreover, giving the Union government the privilege of being the sole watchdog of fake, false or misleading information goes against the rule of law, as per the petition. 

The amendment does not afford the user an opportunity to be heard before flagging the content as “fake, false, or misleading”. No safeguard has been adopted against the exercise of purely subjective discretion by the executive. 

Moreover, the amendment is vague and constitutes an unreasonable restriction on freedom of speech and expression under Article 19(1)(a) of the Constitution by making the “State the sole arbiter of truth or falsity of speech”, claims the petition. 

The amendment uses the phrase “in respect of any business of the Central Government”, which is too broad and vague, argues the petition. The Supreme Court has already held in Shreya Singhal that restrictions on free speech that suffer from over-breadth or vagueness are unconstitutional, it underscores. 

The amendment would have a chilling effect on the freedom of speech and expression, the petition contends. It further contends that the amendment is not a reasonable restriction as it does not fall within any of the eight enumerated restrictions under Article 19(2) of the Constitution. 

The petition also contends that the amendment fails the test of proportionality, which requires that the “least restrictive” alternative be chosen. 

These unreasonable restrictions, it argues, also violate the petitioner’s fundamental right to practise a trade or profession under Article 19(1)(g). 

The petition points out that Kamra, as a political satirist, necessarily engages in commentary about the actions of the Union government. He relies on the wide reach of the internet, through social media platforms, to share his work. The amendment could lead to his content being arbitrarily blocked or taken down, or his social media accounts being suspended or deactivated, causing him irreparable professional harm and unreasonably and excessively curtailing his constitutional rights. 

Also read: ‘Political satire versus State: What content will survive the internet?’ discussed by panel comprising lawyer, independent journalist and political satirist at FoE Con

It would also compel one to self-censor or restrict one’s own engagement with political commentary out of fear that government action may be taken, the petition warns. 

What is the background of this amendment? 

The amendment modifies Rule 3(1)(b)(v) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021).

As per the amendment, intermediaries are directed to make “reasonable efforts” to cause their user, through rules, regulations, and other policies, to not “host, display, upload, modify, publish, transmit, store, update, or share any information” which is “identified as fake or false or misleading by a fact check unit of the central government” in respect of “any business of the Central government”.

The amendment thereby threatens the safe harbour immunity provided to intermediaries under Section 79 of the IT Act, as outlined in Kamra’s petition. 

The safe harbour safeguard essentially exempts intermediaries from liability for any third-party information made available or hosted by them, provided they observe “due diligence” standards while performing their obligations under the IT Act. 

According to Section 79(3)(b), the safe harbour immunity is lost if the “intermediary fails to expeditiously remove or disable access to that material on that resource without vitiating the evidence in any manner” upon receiving “actual knowledge, or on being notified by the appropriate Government or its agency that any information, data, or communication link residing in or connected to the computer resource controlled by the intermediary is being used to commit the unlawful act”.

The constitutional validity of Section 79(3)(b) was challenged before the Supreme Court in Shreya Singhal. The court read down the provision to an extent and clarified that the reference is to an “[i]ntermediary [that] upon receiving actual knowledge from a court order or on being notified by the appropriate government or its agency that unlawful acts relatable to Article 19(2) are going to be committed then fails to expeditiously remove or disable access to such material…”

It further read down Rule 3(4) of the Information Technology (Intermediary Guidelines) Rules, 2011, which required an intermediary to act within thirty-six hours of receiving “actual knowledge”, holding that such knowledge means knowledge communicated by way of a court order. 

In 2021, the Union government introduced IT Rules, 2021, which imposed a significant set of obligations upon social media intermediaries to maintain the safe harbour immunity. At least 17 petitions challenged the IT Rules, 2021 before various high courts for being unconstitutional and anti-democratic. 

On March 9, 2022, the Supreme Court stayed these proceedings in the high courts. 

Also read: Draft amendment to the IT Rules 2021 smacks of censorship

The IT Rules, 2021, were substantively and significantly amended through the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022.

In particular, Rule 3(1)(b)(v) was amended so as to prescribe that an intermediary “…shall make reasonable efforts to cause the user of its computer resource not to host, display, upload, modify, publish, transmit, store, update or share any information that…(v) deceives or misleads the addressee about the origin of the message or knowingly and intentionally communicates any misinformation or information which is patently false and untrue or misleading in nature…”

The 2022 amendment marked a significant departure from the IT Rules, 2021, which had only required intermediaries to inform users of their obligation not to upload or share “patently false or misleading information”; under the amended rule, intermediaries must make “reasonable efforts” to “cause” users not to upload or share “misinformation” or “misleading information”.

The 2023 amendment dilutes this position further by inserting another qualifier that expands the obligation upon intermediaries. It does not define the term ‘business’ in the context in which the fact-checking unit has been established.

The consequence of this amendment is that in order not to lose its safe harbour immunity, the intermediary will have to take down any information once it is identified as fake, make the content unavailable, or suspend or deactivate the account of the user whose content has been identified as “fake, false, or misleading”. 

Also read: Explained: Bombay High Court order partially stay new IT rules on plea by The Leaflet

What has been the reaction within civil society?

According to non-governmental organisation Internet Freedom Foundation’s (IFF) statement, the amendment will directly and negatively impact online freedom of speech and the right to receive information. 

It said, “Assigning any unit of the government such arbitrary, overbroad powers to determine the authenticity of online content bypasses the principles of natural justice, thus making it an unconstitutional exercise. The notification of these amended rules cement (sic) the chilling effect on the fundamental right to speech and expression, particularly of news publishers, journalists, activists, etc.” 

The IFF also pointed out that the amendment potentially bypasses the statutory process prescribed under Section 69A (power to issue directions for blocking public access to any information through any computer resource) of the IT Act, 2000. 

According to Section 69A, blocking of content can only be done by the designated officer either after complying with the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 or pursuant to an order passed by a competent court.

In Shreya Singhal, the court said that the intermediary applying its own mind to whether information should or should not be blocked is noticeably absent in Section 69A read with the 2009 Rules.

The Leaflet spoke to Prateek Waghre, Policy Director at the IFF, to find out if any consultation process took place before notifying the amendment. 

Waghre stated, “The consultation process that preceded the notification was inadequate. These amendments were proposed on January 17, which was the last day of the ongoing online gaming intermediary consultation.

“Since the changes to Rule 3(1)(b)(v) bear little relation to online gaming intermediaries, the pool of stakeholders who would be impacted by these amendments to the IT Rules has also been significantly altered.”

Waghre pointed out that the initial consultation period was till January 25. He said, “Facing backlash, at the time, the Ministry stated that separate consultations would be held with stakeholders, alongside a deadline extension on the final day. However, in its statement dated April 7, the Editors Guild of India (EGI) noted that the amendment was notified without ‘any meaningful consultation’.

“While on paper a consultation process was carried out, the initial proposal of its amendment alongside unrelated amendments, and a subsequent deadline day extension till February 20 rendered it deficient and inadequate”, Waghre revealed. 

Also read: Twitter loses immunity as an intermediary for violating the new IT rules: Union Government to Delhi HC

The EGI, in a statement, urged the Union government to withdraw the notification as it does not mention any governing mechanism, judicial oversight, a right to appeal, or adherence to the guidelines laid down by the Supreme Court in Shreya Singhal with respect to taking down of content or blocking of social media handles. 

“All this goes against the principles of natural justice, and [is] akin to censorship,” the EGI noted. 

The Indian Newspaper Society, an umbrella body representing newspapers, journals, periodicals and magazines from across the country, has also written to the Union government, expressing its concerns and asking for the withdrawal of the notification of the amendment. 

It has also suggested that a consultation process with media organisations and press bodies be held. 

Waghre also pointed out that while the initial draft specifically mentioned the Press Information Bureau (PIB) as the fact-checking unit, the notified amendment does not refer to it directly and states that a unit will be designated for fact-checking. 

On the PIB’s fact-checking mandate, Waghre clarified that the PIB’s current fact-checking role is advisory in nature.

On how the amendment affects the safe harbour immunity, Waghre told The Leaflet that adhering to the fact-checks issued by the designated unit will become a due diligence requirement. 

He said, “Thus, in effect, a fact-check serves as a de-facto takedown order minus the processes associated with one, which already lacks adequate transparency. The ambiguity of terms such as ‘fake’, ‘false’ or ‘misleading’ coupled with the conflict of interest with having a designated unit appointed by the executive deciding what about the Union government is true or false, makes these amendments highly likely to be misused.

“Further, it is probable that to avoid liability, a range of intermediaries will mechanically comply irrespective of the quality and accuracy of the fact checks”, Waghre stated.

The IFF has flagged deep concerns that the amendment goes beyond its parent legislation, the IT Act.

Waghre further clarified this point, telling The Leaflet, “The amendment impermissibly restricts the safe harbour guaranteed to intermediaries by predicating the safe harbour on the takedown of content identified by this fact check unit as fake, false or misleading. 

“The intermediary will not lose safe harbour the moment content is declared as fake, but only if the intermediary does not remove such content when ordered by the Ministry. Nevertheless, this goes against the parent Act, which does not include ‘fake or false or misleading’ as grounds under which online content may be ordered to be taken down.”