In light of Facebook's enormous reach and social impact, the law must hold it accountable for displaying infringing content, writes JITHENDRA PALEPU.
———
A Wall Street Journal report published in August this year accused Facebook of political bias in India. The report stated that Facebook India's public policy head had intervened multiple times against the banning of several Bharatiya Janata Party leaders for posting 'hate speech' on the platform.
Following these allegations, in September, Ajit Mohan, Managing Director of Facebook India, deposed before the Parliamentary Standing Committee on Information Technology.
Additionally, Mohan was summoned by the Peace and Harmony Committee of the Delhi government. The committee had expressed suspicion over Facebook's probable role in the 'orchestration' of the Delhi riots earlier this year.
Needless to say, the objective and purpose Facebook began with have been gradually altered by political and corporate forces. The online social media platform is now being weaponised to spread hate speech and circulate fake news.
These allegations are not confined to India but have surfaced in other countries as well.
For instance, in March 2018, United Nations [UN] investigators accused Facebook of involvement in inciting violence and hatred against Muslim minority groups in Myanmar.
UN Myanmar investigator Yanghee Lee said that the platform had "turned into a beast". The UN came down hard on Facebook for its inaction against the circulation of hate speech on its platform, which purportedly led to large-scale attacks on the Rohingya Muslims.
In light of repeated allegations such as these, it is essential to delve deeper into how Facebook has operated over time, how effective online hate speech laws, especially in India, have been, and how Facebook's community standards are failing to regulate hate speech content.
Since its inception, Facebook has maintained that it is an online platform. But during a court proceeding in the United States, the social media company claimed to be a publisher and suggested that it could make editorial decisions.
Facebook has conveniently gone back and forth on whether it's a platform or a publisher. It is known to be flippant on its stance.
Recent events have shown that Facebook is conveniently playing with words to evade liability in different jurisdictions.
For instance, in the United States, Facebook enjoys certain liberties when viewed as a publisher or a media company, securing greater protection under the garb of the free speech laws of the US. On the contrary, it has resisted being called a publisher or a media company elsewhere, claiming instead to be a platform in countries like Germany and Australia, owing to their stringent restrictions.
In India, Facebook is commonly understood and seen as a platform or a forum, and cannot claim to be a publisher since it does not take any editorial decisions whatsoever. The Press and Registration of Books Act 1867 defines the word 'editor' as "the person who controls the selection of the matter that is published in a newspaper." A plain reading of this definition would suggest that Facebook does not take any editorial decisions.
The social media giant has the potential to impact elections and public order. Facebook has banned users for activity on the platform that it has deemed unfit or infringing. This decision-making function of Facebook mirrors the decision-making power of an editor of a publishing house.
Therefore, whether or not we like it, Facebook has ceased to be a social media platform and is on its way to becoming an editorial activist.
Online hate speech in India is governed by the Information Technology [IT] Act 2000. Section 69A of the IT Act enables the central government to order the blocking of public access to information available through computer resources. The central government can order such blocking if it is satisfied that it is necessary to do so.
As of now, there is nothing in the Indian legal framework that uses terminology such as 'forum' or 'platform'. Currently, social media platforms fall under the definition of 'intermediaries' in the IT Act 2000. However, the Personal Data Protection Bill 2019 seeks to bring in a new definition for these platforms under the category of 'social media intermediaries'.
Currently, in India, the legal, political, and societal factors are all acting in favour of Facebook's interests. As a platform or a forum, it is not responsible for content creation; it is merely involved in the transmission and circulation of content.
Section 79 of the IT Act 2000 exempts intermediaries from liability for content posted by third parties. Section 79 was amended in 2008, and it now exempts a wide range of intermediaries from liability for any sort of third-party content, even when such content directly violates other Indian laws.
Section 79 of the IT Act provides that the intermediary shall act as a mere platform and not a speaker. Needless to say, Facebook takes shelter under this law.
If intermediaries seek protection under Section 79, they have to comply with certain obligations contained in the said section.
Thus, to procure this protection, the intermediary must merely provide access to a communication system and function as a platform, not a speaker. The intermediary must not initiate a transmission and must not select the receivers of the transmission. Additionally, it must not select or modify the information contained in the transmission, and it must 'observe due diligence'.
This 'due diligence' includes the publication of rules, policies, and user agreements; the obligation to refrain from knowingly hosting, publishing, or transmitting infringing information; and the obligation to take down infringing information upon receiving actual knowledge of it.
As part of 'due diligence' compliance, Facebook has come up with 'community standards', which set out the standards users are expected to follow while on its platform.
These community standards are self-regulatory mechanisms that Facebook designed to curtail hate speech.
Within the community standard guidelines, under the subhead of 'dangerous individuals and organisations', Facebook specifies the kind of content it can remove based on five grounds, namely:
Terrorist activity
Organised hate
Mass murder (including attempts) or multiple murder
Human trafficking
Organised violence or criminal activity
The fact that the social media platform has designed such a self-regulatory mechanism suggests that it would have actual knowledge of infringing or hateful content. This implies that Facebook is aware of such posts and has remained silent about them.
The existing legal framework in India is not enough to tackle cases of online hate speech. At the same time, there is a dire need to control online hate speech on platforms such as Facebook and WhatsApp, given their massive user bases in India.
The existing laws in India require the government to notify the intermediary of infringing content so that the intermediary can take action. But what happens when that government's leaders are themselves the perpetrators of infringing content constituting hate speech? This situation is bound to put the government in a tough spot, and in such instances the government tends to avoid notifying the intermediary. What should be the recourse in such a case?
In Shreya Singhal vs Union of India (2015), the Supreme Court of India held that an intermediary is obligated to take down content only upon receiving actual knowledge through a court order or a notification by the appropriate government or its agency, and not on the basis of user complaints.
The situation in India is very dynamic, and thus calls for an alternate mechanism wherein complaints raised by individual users are also entertained. In that case, the ultimate decision would rest with Facebook's Oversight Board, which could ensure the quick removal of hate speech content.
In the ongoing case of Sudarshan TV, Justice K.M. Joseph opined that "media can't fall foul of standards prescribed by themselves."
If the intermediary, despite having actual knowledge of the infringing or hateful content, has not taken any action, it will lose the immunity provided by Section 79 and be open to prosecution under various laws.
It is necessary to acknowledge that Facebook can no longer be bracketed as a social media platform. Therefore, a mechanism of strict laws will go a long way to help curb hate speech online.
Germany, for instance, has imposed an obligation on Facebook and other social media platforms to report criminal or infringing content directly to the federal police. In case of default, social media platforms face heavy fines as penalties.
Encouraging hate speech in the guise of free speech directly violates the ideals of the Indian Constitution.
Legislatures across the globe have been responsive to online hate speech. It is time the Indian state geared up to combat it as well. Given the massive user base that Facebook has in India, it is essential for the platform to realise its moral responsibility and act on it.
(Jithendra Palepu is a student of Symbiosis Law School, Pune. Views are personal.)