The Ministry of Electronics and Information Technology (“MeitY”) proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“Intermediary Guidelines”) on January 17, 2023. The draft amendments aim to regulate online gaming, but they also seek to have intermediaries “make reasonable efforts” to cause their users not to upload or share content identified as “fake” or “false” by the Press Information Bureau (“PIB”), any Union Government department, or an authorised agency (see the proposed amendment to Rule 3(1)(b)(v)). The draft amendments in their current form raise certain concerns that we believe merit additional scrutiny.
CCG submitted comments on the proposed amendment to Rule 3(1)(b)(v), highlighting its key feedback and concerns. The comments were authored by Archit Lohani and Vasudev Devadasan and reviewed by Sachin Dhawan and Jhalak M. Kakkar. Some of the key issues raised in our comments are summarised below.
Misinformation and “fake” or “false” information include both unlawful and lawful expression
The proposed amendment does not define the term “misinformation” or provide any guidance on how determinations that content is “fake” or “false” are to be made. Misinformation can take many forms, and experts have identified up to seven subtypes: imposter content; fabricated content; false connection; false context; manipulated content; misleading content; and satire or parody. Different subtypes of misinformation can cause different types of harm (or no harm at all) and are treated differently under the law. Misinformation or false information thus includes both lawful and unlawful speech (e.g., satire is constitutionally protected speech).
Within the broad ambit of misinformation, the draft amendment does not provide sufficient guidance to the PIB and government departments on what sort of expression is permissible and what should be restricted. The draft amendment effectively provides them with unfettered discretion to restrict both unlawful and lawful speech. When seeking to regulate misinformation, experts, platforms, and other countries have drawn up detailed definitions that take into consideration factors such as intention, form of sharing, virality, context, impact, public interest value, and public participation value. These definitions recognize the potential multiplicity of context, content, and propagation techniques. In the absence of clarity over what types of content may be restricted based on a clear definition of misinformation, the draft amendment will restrict both unlawful speech and constitutionally protected speech. It will thus constitute an overbroad restriction on free speech.
Restricting information solely on the ground that it is “false” is constitutionally impermissible
Article 19(2) of the Indian Constitution allows the government to place reasonable restrictions on free speech in the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, or in relation to contempt of court, defamation, or incitement to an offence. The Supreme Court has ruled that these grounds are exhaustive and that speech cannot be restricted for reasons beyond Article 19(2), including where the government seeks to block content online. Crucially, Article 19(2) does not permit the State to restrict speech on the ground that it is false. If the government were to restrict “false information that may imminently cause violence”, such a restriction would be permissible as it would relate to the ground of “public order” in Article 19(2). However, if enacted, the draft amendment would restrict online speech solely on the ground that it is declared “false” or “fake” by the Union Government. This amounts to a State restriction on speech for reasons beyond those outlined in Article 19(2), and would thus be unconstitutional. Restrictions on free speech must have a direct connection to the grounds outlined in Article 19(2) and must be a necessary and proportionate restriction on citizens’ rights.
The amendment does not adhere to the procedures set out in Section 69A of the IT Act
The Supreme Court upheld Section 69A of the IT Act in Shreya Singhal v. Union of India inter alia because it permitted government blocking of online content only on grounds consistent with Article 19(2) and provided important procedural safeguards, including a notice, a hearing, and a written blocking order that can be challenged in court. The constitutionality of the government’s blocking power is therefore contingent on the substantive and procedural safeguards provided by Section 69A and the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009. The proposed amendment to the Intermediary Guidelines would permit the Union Government to restrict online speech in a manner that does not adhere to these safeguards. It would permit the blocking of content on grounds beyond those specified in Article 19(2), based on a unilateral determination by the Union Government, without a specific procedure for notice, a hearing, or a written order.
Alternate methods to counter the spread of misinformation
Any response to misinformation on social media platforms should be based on empirical evidence of the prevalence and harms of misinformation on social media. Thus, as a first step, social media companies should be required to provide greater transparency and facilitate researcher access to data. There are alternative methods to regulate the spread of misinformation that may be more effective and that preserve free expression, such as labelling or flagging misinformation. We note that there does not yet exist widespread legal and industry consensus on standards for independent fact-checking, but organisations such as the International Fact-Checking Network (IFCN) have laid down principles that independent fact-checking organisations should comply with. Having platforms label content pursuant to IFCN fact-checks, and even notify users when content they have interacted with has subsequently been flagged by an IFCN fact-checker, would provide users with valuable informational context without requiring content removal.
The question of when intermediaries are liable, or conversely not liable, for content they host or transmit is often at the heart of regulating content on the internet. This is especially true in India, where the Government has relied almost exclusively on intermediary liability to regulate online content. With the advent of the Intermediary Guidelines 2021, and their subsequent amendment in October 2022, there has been a paradigm shift in the regulation of online intermediaries in India.
To help understand this new regulatory reality, the Centre for Communication Governance (CCG) is releasing its ‘Report on Intermediary Liability in India’ (December 2022).
This report aims to provide a comprehensive overview of the regulation of online intermediaries and their obligations with respect to unlawful content. It updates and expands on the Centre for Communication Governance’s 2015 report documenting the liability of online intermediaries to now cover the decisions in Shreya Singhal vs. Union of India and Myspace vs. Super Cassettes Industries Ltd, the Intermediary Guidelines 2021 (including the October 2022 Amendment), the E-Commerce Rules, and the IT Blocking Rules. It captures the over two decades of regulatory and judicial practice on the issue of intermediary liability since the adoption of the IT Act. The report aims to provide practitioners, lawmakers and regulators, judges, and academics with valuable insights as they embark on shaping the coming decades of intermediary liability in India.
Some key insights that emerge from the report are summarised below:
Limitations of Section 79 (‘Safe Harbour’) Approach: In the cases analysed in this report, there is little judicial consistency in the application of secondary liability principles to intermediaries, including the obligations set out in the Intermediary Guidelines 2021, and monetary damages for transmitting or hosting unlawful content are almost never imposed on intermediaries. This suggests that there are significant limitations to the regulatory impact of obligations imposed on intermediaries as pre-conditions to safe harbour.
Need for clarity on content moderation and curation: The text of Section 79(2) of the IT Act grants intermediaries safe harbour provided they act as mere conduits, not interfering with the transmission of content. There exists ambiguity over whether content moderation and curation activities would cause intermediaries to violate Section 79(2) and lose safe harbour. The Intermediary Guidelines 2021 have partially remedied this ambiguity by expressly stating that voluntary content moderation will not result in an intermediary ‘interfering’ with the transmission under Section 79(2). However, ultimately amendments to the IT Act are required to provide regulatory certainty.
Intermediary status and immunity on a case-by-case basis: An entity’s classification as an intermediary is not a status that applies across all its operations (like a ‘company’ or a ‘partnership’), but rather depends on the function it performs vis-à-vis the specific electronic content it is sued in connection with. Courts should determine whether an entity is an ‘intermediary’ and whether it complied with the conditions of Section 79 in relation to the content it is being sued over. Consistently making this determination at a preliminary stage of litigation would greatly further the efficacy of Section 79’s safe harbour approach.
Concerns over GACs: While the October 2022 Amendment stipulates that two members of every GAC shall be independent, no detail is provided as to how such independence shall be secured (e.g., security of tenure and salary, oath of office, minimum judicial qualifications etc.). Such independence is vital as GAC members are appointed by the Union Government but the Union Government or its functionaries or instrumentalities may also be parties before a GAC. Further, given that the GACs are authorities ‘under the control of the Government of India’, they have an obligation to abide by the principles of natural justice, due process, and comply with the Fundamental Rights set out in the Constitution. If a GAC directs the removal of content beyond the scope of Article 19(2) of the Constitution, questions of an impermissible restriction on free expression may be raised.
Actual knowledge in 2022: The October 2022 Amendment requires intermediaries to make reasonable efforts to “cause” their users not to upload certain categories of content and ‘act on’ user complaints against content within seventy-two hours. Requiring intermediaries to remove content at the risk of losing safe harbour in circumstances other than the receipt of a court or government order prima facie violates the decision of Shreya Singhal. Further, India’s approach to notice and takedown continues to lack a system for reinstatement of content.
Uncertainty over government blocking power: Section 69A of the IT Act expressly grants the Union Government power to block content, subject to a hearing by the originator (uploader) or intermediary. However, Section 79(3)(b) of the IT Act may also be utilised to require intermediaries to take down content absent some of the safeguards provided in Section 69A. The fact that the Government has relied on both provisions in the past and that it does not voluntarily disclose blocking orders makes a robust legal analysis of the blocking power challenging.
Hearing originators when blocking: The decision in Shreya Singhal and the requirements of due process support the understanding that the originator must be notified and granted a hearing under the IT Blocking Rules prior to their content being restricted under Section 69A. However, evidence suggests that the government regularly does not provide originators with hearings, even where the originator is known to the government. Instead, the government directly communicates with intermediaries away from the public eye, raising rule of law concerns.
Issues with first originators: Both the methods proposed for ‘tracing first originators’ (hashing unique messages and affixing encrypted originator information) are easily circumvented, require significant technical changes to the architecture of messaging services, offer limited investigatory or evidentiary value, and will likely undermine the privacy and security of all users to catch a few bad actors. Given these considerations, it is unlikely that such a measure would satisfy the proportionality test laid out by current Supreme Court doctrine.
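To make the circumvention concern concrete, the minimal Python sketch below (our own illustration, using hypothetical example messages, and not drawn from the report) shows why hash-based tracing is brittle: a cryptographic hash matches a forwarded message only if it is byte-for-byte identical to the original, so even a one-character edit produces an entirely different digest and escapes the match.

```python
import hashlib

def message_digest(text: str) -> str:
    # Hash the message text; tracing schemes of this kind compare digests of
    # forwarded messages against digests of previously identified "viral" messages.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

original = "Meet at the town square at 5 pm."   # hypothetical viral message
altered  = "Meet at the town square at 5 pm!"   # trivially edited copy

print(message_digest(original) == message_digest(original))  # True: an exact copy is traceable
print(message_digest(original) == message_digest(altered))   # False: a one-character edit defeats the match
```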
Broad and inconsistent injunctions: An analysis of injunctions against online content reveals that the contents of court orders are often sweeping, imposing vague compliance burdens on intermediaries. When issuing injunctions against online content, courts should limit blocking or removals to specific URLs. Further, courts should be cognisant of the fact that intermediaries have themselves not committed any wrongdoing, and the effect of an injunction should be seen as meaningfully dissuading users from accessing content rather than as an absolute prohibition.
This report was made possible by the generous support we received from National Law University Delhi. CCG would like to thank our Faculty Advisor Dr. Daniel Mathew for his continuous direction and mentorship. This report would not have been possible without the support provided by the Friedrich Naumann Foundation for Freedom, South Asia. We are grateful for comments received from the Data Governance Network and its reviewers. CCG would also like to thank Faiza Rahman and Shashank Mohan for their review and comments, and Jhalak M. Kakkar and Smitha Krishna Prasad for facilitating the report. We thank Oshika Nayak of National Law University Delhi for providing invaluable research assistance for this report. Lastly, we would also like to thank all members of CCG for the many ways in which they supported the report, in particular the ever-present and ever-patient Suman Negi and Preeti Bhandari for their unending support for all the work we do.
Clause 13(2)(d) of the Digital Personal Data Protection Bill, 2022 (“DPDP Bill”) provides for the right to erasure of personal data, i.e., “…any data about an individual who is identifiable by or in relation to such data”. The clause states that a data principal has the right to erasure of personal data in accordance with applicable laws and as may be prescribed. It further provides that such erasure shall take place after the data fiduciary receives a request for erasure. The precondition for erasure is that the personal data must no longer be necessary for the purpose for which it was processed and must not be necessary for any legal purpose either.
This is in many ways a salutary provision. Data principals should have control over their data which includes the right to correct and erase data. This is especially important since it protects individuals from the negative impacts of the widespread availability of personal data on the internet. In today’s digital age, it is easier than ever for personal data to be collected, shared, and used in ways that are harmful or damaging to individuals. The right to erasure aids in countering these negative impacts by giving individuals the power to control their own personal information, and to have it removed from the internet if they choose to do so.
However, this provision can negatively impact several other fundamental rights such as the freedom of speech and right to information, especially when it is abused by powerful figures to silence criticism. For example, if an investigative journalist were to write an article in which they bring to light a government official’s corrupt deeds, the said official would be able to request the data fiduciary to erase such data since they are identifiable by it or are related to it.
This article will seek to address such concerns in two ways. First, it will delve into the safeguards that can be included in the text of Clause 13(2)(d) to ensure that there is an appropriate balance between free speech and privacy. Second, it will recommend that the arbiter of this balance should be an independent authority and not data fiduciaries.
(1) Safeguards
Clause 13(2)(d) is heavily tilted in favor of the privacy interests of the data principal. It does not require data fiduciaries to take into account any other considerations that might have a bearing on the data principal’s erasure request. In order to prevent privacy interests from undermining other rights, the clause should be amended to include various safeguards.
In particular, the clause should require data fiduciaries to consider the free speech rights of other individuals who might be affected by an erasure request. As indicated earlier, journalists may find it difficult to publish critical commentary on powerful public figures if their work is subject to easy erasure. There are also artistic, literary and research purposes for which personal data might be used by other individuals. These are valid uses of personal data that should not be negated simply because of an erasure request.
Data fiduciaries can also be made to consider the following factors through subordinate legislation to harmonize free speech and privacy: (a) the role of the data principal in public life, (b) the sensitivity of the personal data sought to be erased, (c) purpose of processing, (d) public nature of data and (e) relevance of the personal data to the public. Incorporating such safeguards will help ensure that data fiduciaries appropriately balance the right to privacy and the right to speech when they receive erasure requests.
Further, a clearly laid out process for grievance redressal should also be codified. Currently, Clause 13(2)(d) does not provide for an appeal mechanism for erasure requests that have been rejected by data fiduciaries. The clause should explicitly provide that in case the data principal wants to contest the rejection of their erasure request, they can file a complaint with the Data Protection Board (DPB).
(2) Independent Authority
In addition to lacking sufficient safeguards, Clause 13(2)(d) puts the onus on data fiduciaries to decide the validity of erasure requests. Various jurisdictions, including the United Kingdom, Spain, and other European Union member states, use this framework. However, giving decision-making power directly to data fiduciaries will have a chilling effect on speech.
This is because they will tend to mechanically comply with erasure requests in order to escape liability for non-compliance. Data fiduciaries lack the bandwidth needed to properly assess the validity of erasure claims. They are for the most part private businesses with no obligation or commitment to uphold the rights and freedoms of citizens, especially if doing so will entail the expenditure of significant resources.
Consequently, there is a need for a different framework. Clause 13(2)(d) should be amended to provide for the creation of an independent authority which will decide the validity of erasure requests. Such a body should be staffed with free speech and privacy experts who have the incentive and the capability to balance competing privacy and speech considerations.
Conclusion
We can see from the discussion above that the right to erasure provision of the Digital Personal Data Protection Bill, 2022 fails to strike a sound balance between privacy and free speech. To achieve such a balance, Clause 13(2)(d) should be amended to incorporate various safeguards. Furthermore, an independent authority, not data fiduciaries, should decide the validity of erasure requests.
A positive obligation with respect to a human right is one that requires States to put into effect both preventive measures against violations (through appropriate legislative, judicial or administrative measures) and remedial measures (access to judicial remedies once violations have occurred). This piece examines whether ensuring media diversity can be considered a positive obligation on States under Article 19 of the International Covenant on Civil and Political Rights (“ICCPR”), and if so, what the scope and nature of this obligation is.
Positive obligation on States to create a favourable environment for sharing diverse views
The right to freedom of speech and expression enshrined under Article 19 of the ICCPR forms the cornerstone of democratic societies. It, along with its corollary, the freedom of opinion, is vital for the full development of a person and for true participation in public debate. The ECtHR, in its landmark decision in Dink v. Turkey, interpreted the right to freedom of expression to include a positive obligation on States to ensure the effective protection of free expression from wrongful interference by private/non-state actors, and for the State itself to create “an enabling environment by allowing for everyone to take part in public debate and express their thoughts and opinions” (¶137). The Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression has also acknowledged an increasing recognition that States have positive regulatory obligations to promote free speech and expression in online spaces too. The Joint Declaration on Diversity of 2007, a document prepared by several eminent jurists appointed as Representatives or Rapporteurs by the UN, OSCE, OAS, and ACHPR, has similarly identified States’ positive obligation to regulate private actors so as to promote diversity in the media and prevent the undue concentration of media ownership.
The requirement for media diversity as a positive obligation on States may also be seen as emanating from interpretations of different international instruments read together, an outcome that has also been reflected in the decisions of different human rights bodies. For instance, a conjunctive reading of Art. 19 and Art. 2 of the ICCPR (as with the parallel provisions in the UDHR and regional human rights instruments) can be interpreted to show the positive obligation on States to promote media diversity. This interpretation has been endorsed by the Inter-American Commission on Human Rights in inter alia Baruch Ivcher Bronstein v. Peru (2001), which opined that “…consequently, it is vital that [media] can gather the most diverse information and opinions” (¶149); and by the European Court in Informationsverein Lentia and Others v. Austria (1993), noting, “…Such an undertaking cannot be successfully accomplished unless it is grounded in the principle of pluralism, of which the State is the ultimate guarantor…” (¶38).
The positive obligation includes within its ambit an obligation to prevent undue concentration within media eco-systems
A positive obligation on the State to foster an environment where a diversity of ideas and opinions (media diversity) is available to the public can entail a very wide array of obligations. For instance, it raises questions regarding the extent or scope of this obligation in the regulation of social media intermediaries, which have accumulated significant control within the online media space. This sort of control could be seen as giving them the ability to behave in a near-monopolistic manner. The Centre for Law & Democracy, in February 2022, gave its submissions on the Practical Application of the Guiding Principles on Business and Human Rights to the Activities of Technology Companies, where it opined inter alia that States can be obligated to undertake measures to promote diversity in an online space that has seen high market concentration by large social media companies.
Concentration within media eco-systems is antithetical to the idea of media diversity
Given that a positive obligation to promote media diversity exists, a necessary corollary is the need to prevent undue concentration within media eco-systems. According to UNESCO, undue concentration in media refers to a situation where one corporate body (or individual) “exercises overall control over an important part of an overall media market”. Such concentration hinders the ability of people to receive information from multiple sources, which is crucial for the true exercise of the freedom of speech. This is because media monopoly can cloud the ‘marketplace of ideas’ and, according to the Special Rapporteur for Freedom of Expression, “leads to the uniformity of the content that they produce or disseminate”. Furthermore, according to UNESCO, a media monopoly poses a threat not just to the freedom of expression but, by extension, also to democracy, as it hinders the ability of the media to reflect the variety of opinions and ideas generated in society as a whole.
Obligation to monitor and restrict M&As in the media space
In 2007, the Joint Declaration on Diversity in Broadcasting (by the Special Rapporteurs of the UN, OAS and ACHPR and the OSCE Representative on freedom of the media) emphasized the requirement to put in place anti-monopoly rules (both horizontal and vertical), including ‘stringent requirements’ of transparency enforced through active monitoring. This also covered the need to prevent powerful combinations resulting from merger activity in the media space. The Committee of Ministers of the Council of Europe has emphasized the need for licensing to be made contingent on media platform owners acting in harmony with the requirement to ensure media diversity. UNESCO’s Media Development Indicators also acknowledge that States are required to prevent monopolies or oligopolies and must take this into account during the grant or renewal of licences. The Joint Declaration on the Protection of Freedom of Expression and Diversity in the Digital Terrestrial Transition described the measures States must take to promote media diversity and prevent monopoly as ‘special measures’, going beyond those already existing in commercial sectors, which indicates a recognition of the need to secure media pluralism inter alia through ensuring competitiveness in the space.
Conclusion
A State’s positive obligations under the right to free speech and expression can be viewed as emanating directly from treaty obligations and have also been widely interpreted by a multitude of judicial decisions and eminent jurists. Acknowledging these as sources of international law under Articles 38(1)(a) and 38(1)(d) of the ICJ Statute, we can argue that a State’s positive obligations under Art. 19 of the ICCPR and analogous free speech protections under international law must also include within their ambit obligations to ensure media diversity. This includes the protection of the rights of both the speaker and the audience under the right to freedom of speech and expression. Some ways in which this can be ensured are through the allocation of funds specifically for public interest content and other at-risk sectors; the establishment of holistic and functional market-concentration monitoring systems; and the delegation, through co-regulation or self-regulation, of a part of the State’s positive obligation directly to media platforms themselves to ensure diversity in their operations. The measures undertaken must be carefully designed, should fulfil the aims of promoting diversity and avoiding monopolistic behaviour, and should not put at risk the independence of the media.
On 6 June 2022, the Ministry of Electronics and Information Technology (“MeitY”) released the proposed amendments to Part I and Part II of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“2021 IT Rules”). CCG submitted its comments on the proposed amendments to the 2021 IT Rules, highlighting its key feedback and concerns. The comments were authored by Vasudev Devadasan and Bilal Mohamed and reviewed and edited by Jhalak M. Kakkar and Shashank Mohan.
The 2021 IT Rules were released in February last year, and Parts I and II of the Rules set out the conditions intermediaries must satisfy to avail of legal immunity (or ‘safe harbour’) under Section 79 of the Information Technology Act, 2000 (“IT Act”) for hosting unlawful content. The 2021 IT Rules have been challenged in several High Courts across the country, and the Supreme Court is currently hearing a transfer petition on whether these actions should be clubbed and heard collectively by the apex court. In the meantime, MeitY has released the proposed amendments to the 2021 IT Rules, which seek to make incremental but significant changes to the Rules.
CCG’s comments to the MeitY can be summarised as follows:
Dilution of safe harbour in contravention of Section 79(1) of the IT Act
The core intention behind providing intermediaries with safe harbour under Section 79(1) of the IT Act is to ensure that intermediaries do not restrict the free flow of information online due to the risk of being held liable for the third-party content uploaded by users. The proposed amendments to Rules 3(1)(a) and 3(1)(b) of the 2021 IT Rules potentially impose an obligation on intermediaries to “cause” and “ensure” their users do not upload unlawful content. These amendments may require intermediaries to make complex determinations on the legality of speech and cause online intermediaries to remove content that may carry even the slightest risk of liability. This may result in the restriction of online speech and the corporate surveillance of Indian internet users by intermediaries. In the event that the proposed amendments are interpreted as not requiring intermediaries to actively prevent users from uploading unlawful content, we note that the proposed amendments may be functionally redundant, and we suggest they be dropped to avoid legal uncertainty.
Concerns with Grievance Appellate Committee
The proposed amendments envisage one or more Grievance Appellate Committees (“GAC”) that sit in appeal of intermediary determinations with respect to content. Users may appeal to a GAC against the decision of an intermediary not to remove content despite a user complaint, or alternatively, request a GAC to reinstate content that an intermediary has voluntarily removed or to lift account restrictions that an intermediary has imposed. The creation of GAC(s) may exceed the Government’s rulemaking powers under the IT Act. Further, the GAC(s) lack the necessary safeguards in their composition and operation to ensure the independence required by law of such an adjudicatory body. Such independence and impartiality may be essential as the Union Government is responsible for appointing individuals to the GAC(s) but the Union Government or its functionaries or instrumentalities may also be a party before the GAC(s). Further, we note that the originator, the legality of whose content is in dispute before a GAC, has not expressly been granted a right to a hearing before the GAC. Finally, we note that the GAC(s) may lack the capacity to deal with the high volume of appeals against content and account restrictions. This may lead to situations where, in practice, only a small number of internet users are afforded redress by the GAC(s), leading to inequitable outcomes and discrimination amongst users.
Concerns with grievance redressal timeline
Under the proposed amendment to Rule 3(2), intermediaries must acknowledge a complaint by an internet user for the removal of content within 24 hours, and ‘act and redress’ this complaint within 72 hours. CCG’s comments note that the 72-hour timeline to address complaints proposed by the amendment to Rule 3(2) may cause online intermediaries to over-comply with content removal requests, leading to the possible take-down of legally protected speech at the behest of frivolous user complaints. Empirical studies of Indian intermediaries have demonstrated that smaller intermediaries lack the capacity and resources to make complex legal determinations of whether content complained against violates the standards set out in Rule 3(1)(b)(i)-(x), while larger intermediaries are unable to address the high volume of complaints within short timelines, leading to the mechanical takedown of content. We suggest that any requirement that online intermediaries address user complaints within short timelines could differentiate between types of content that are ex-facie (on the face of it) illegal and cause severe harm (e.g., child sexual abuse material or gratuitous violence), and other types of content where determinations of legality may require legal or judicial expertise, such as copyright or defamation.
Need for specificity in defining due diligence obligations
Rule 3(1)(m) of the proposed amendments requires intermediaries to ensure a “reasonable expectation of due diligence, privacy and transparency” to avail of safe harbour, while Rule 3(1)(n) requires intermediaries to “respect the rights accorded to the citizens under the Constitution of India.” These rules do not impose clearly ascertainable legal obligations, which may increase compliance burdens, hamper enforcement, and result in inconsistent outcomes. In the absence of specific data protection legislation, the obligation to ensure a “reasonable expectation of due diligence, privacy and transparency” is unclear. The contents of fundamental rights obligations were drafted and developed in the context of citizen-State relations and may not be suitable for, or aptly transposed to, the relations between intermediaries and users. Further, the content of ‘respecting Fundamental Rights’ under the Constitution is itself contested and open to reasonable disagreement between various State and constitutional functionaries. Requiring intermediaries to uphold such obligations will likely lead to inconsistent outcomes based on varied interpretations.
On May 11, 2022, the Supreme Court issued a consequential interim order (“Order”) in a batch of petitions challenging Section 124A of the Indian Penal Code, 1860 (“IPC”), which penalises the offence of ‘sedition’. The Order grants the Central Government time to re-examine the validity of Section 124A, and whilst this process is underway, effectively suspends the provision’s operation through these directions:
it is “hop[ed] and expect[ed]” that the Central and State Governments will refrain from registering fresh ‘first information reports’, continuing investigations, and taking coercive measures under Section 124A;
persons subsequently charged with sedition can approach courts for appropriate relief and their plea must be considered in light of the Order; and
pending legal proceedings relating to sedition are to be kept in abeyance. However, proceedings under other penal laws can continue if no prejudice is caused to the accused.
After briefly explaining the offence of sedition and summarising the facts leading up to the current constitutional challenges, this post analyses the Order along three lines – its impact on fundamental rights; relevance for persons facing prosecution and incarceration for sedition; and possible hurdles to its effective implementation.
Section 124A’s constitutionality was tested in 1962 by a five-judge bench (i.e., a constitution bench) of the Supreme Court in Kedar Nath Singh (analysed here). The provision was declared valid since it was enacted “in the interest of… public order” (¶26). Acknowledging that a broad (or arguably even plain) interpretation of the offence may constitute an unreasonable restriction on free speech under Article 19(2), the Supreme Court limited Section 124A’s applicability to activities inciting violence or having the “tendency to create public disorder…” (¶27). Constitutional law scholars have pointed out that the Supreme Court’s interpretation of Section 124A in Kedar Nath is inconsistent with the provision’s text, which remains overbroad. This interpretative gap has been suggested as a reason for the provision’s misapplication.
Material facts
Against the backdrop of a sharp increase in the registration of sedition cases, several petitioners challenged the constitutionality of Section 124A before the Supreme Court in 2021. Multiple petitioners (here and here) argued that:
Kedar Nath was incorrectly decided because the Supreme Court overlooked the legal standard to legitimately curb speech under Article 19(2). As espoused in Ram Manohar Lohia (decided two years earlier in 1960, analysed here), this standard was a “proximate relationship” (¶13) between speech and apprehended public disorder, not a mere “tendency to create public disorder” (¶27, Kedar Nath).
Section 124A is inconsistent with later Supreme Court decisions on free speech, including the seminal Shreya Singhal ruling (analysed here), wherein it was held that only incitement to public disorder, and not mere advocacy or discussion of a particular cause, can be curbed by law.
Charges under Section 124A have been frequently filed against journalists, politicians, and artists (among others), leading to a chilling effect on free speech.
Section 124A’s broadness violates India’s international law commitments, particularly under Article 19 of the ICCPR, requiring restrictions on the freedom of expression to be necessary and unambiguous.
These petitions were heard by different three-judge benches of the Supreme Court. However, as Section 124A’s constitutionality was upheld by five judges in Kedar Nath, according to the larger bench rule, only a constitution bench of at least seven judges can invalidate Section 124A after conclusively overruling Kedar Nath. Thus, the preliminary question before the bench that passed the Order, comprising Chief Justice N.V. Ramana, Justice Surya Kant, and Justice Hima Kohli, was whether to refer the present-day challenges to a larger bench. The petitions were heard over the course of about a year.
Impact on fundamental rights
The Supreme Court’s intention to prevent unjustified curtailment of fundamental rights through the frequent use of Section 124A is both laudable and clearly represented in its Order. To this end, the Court’s first direction restraining the registration of fresh sedition cases is particularly significant. However, the deferential language used here (“We hope and expect…”) leaves room for ambiguity because:
As pointed out by a commentator, legally, even softly worded requests in judicial orders must be mandatorily followed. However, the language used in the Order may create needless confusion regarding the binding nature of the Supreme Court’s direction amongst authorities required to abide by it, potentially hampering compliance. Pertinently, despite the purpose of this direction being to restrain Section 124A’s invocation, the word ‘stay’ is conspicuously absent. This contrasts with the unequivocal language previously used by the Supreme Court to stay the operation of the contentious farm laws. Given the significance of this direction, the Supreme Court could have phrased the Order in clearer terms.
Admittedly, the absence of blanket top-down directions affords the State and lower courts discretion to evaluate situations on a case-to-case basis. However, the Supreme Court’s deference towards the State in operationalising its well-intentioned directions may not be beneficial given the periodic application of Section 124A, particularly in recent years. According to the online publication Article 14, there has been a 28% increase in sedition cases between 2014 and 2020, several of which were registered simply for criticising governments and politicians.
Additionally, the Supreme Court’s failure to record detailed reasons justifying its directions dilutes the Order’s precedential value. Had the Supreme Court provided legal reasoning for arriving at its “prima facie opinion” that Section 124A was unconstitutional – such as the provision’s vagueness or incompatibility with subsequent free speech standards – the Order could have been relied upon to buttress similar arguments in cases where other ambiguous penal laws have been challenged. Absent reasons, however, the Order remains highly contextual and neither clarifies nor expands India’s free speech protections.
Admittedly, the Supreme Court’s reluctance to elaborate on its reasons in the Order may be defensible since this was only an interim pronouncement and not a final verdict that reconsidered Section 124A’s constitutionality. However, a reasoned judgment conclusively invalidating Section 124A (as opposed to a government amendment or repeal) would have had a far more enduring impact on Indian free speech jurisprudence. The benefits of such a verdict could have been considered by the Supreme Court before granting the Government’s request for time to re-evaluate the validity of this offence. In light of the Supreme Court’s observations against Section 124A in the Order, it is hoped that the Court will consider issuing a detailed judgment disposing of these challenges and conclusively striking down Section 124A, in case the provision is not repealed by the Government within a reasonable duration.
Relevance for accused persons
The Order provides a strong basis for persons facing prosecution and incarceration for sedition to seek the suspension of legal proceedings and bail. This is evidenced by the Rajasthan High Court’s order directing the state police not to investigate sedition allegations against journalist Aman Chopra because of the Order, and a similar pronouncement by the Kerala High Court staying sedition proceedings against filmmaker Aisha Sultana.
However, for those charged with other offences in addition to sedition, especially under specialised penal statutes like the Unlawful Activities (Prevention) Act, 1967 (“UAPA”), grant of bail is still subject to the stringent conditions prescribed in these statutes. For instance, under Section 43D(5) of the UAPA, bail can be denied if the presiding court is satisfied that the prosecution’s allegations are “prima facie true”. While interpreting this provision in its Watali verdict (analysed here and here), the Supreme Court prohibited a detailed examination of even the prosecution’s evidence for deciding bail pleas, making it far easier to establish a prima facie case and significantly harder for the accused to be granted bail.
Research by Article 14 suggests that in nearly 60% of sedition cases filed between 2010 and 2020, offences from other statutes, including the UAPA and the Information Technology Act, 2000 (“IT Act”), have been invoked alongside sedition. For individuals accused of more than just sedition, including journalist Siddique Kappan who is still in jail on charges under the UAPA, suspension of sedition alone might not bring much reprieve unless it is accompanied by wider reforms or stronger protections by courts.
Challenges in implementation
For effective implementation, the Order would have to be promptly communicated to grassroots-level law enforcement authorities across India’s more than 16,955 police stations. This is an uphill task considering India lacks a standardised framework for communicating judicial orders on constitutional issues to the authorities concerned. The lack of such a framework has been cited by digital rights advocates as the primary reason behind the repeated invocation of the unconstitutional Section 66A of the IT Act, despite its invalidation in Shreya Singhal.
Given this context, the Supreme Court should have considered developing an ad hoc mechanism for ensuring compliance with its Order. It could potentially have directed concerned governments to communicate the Order to law enforcement authorities and to file compliance affidavits. Previously, in Prakash Singh v. UoI (analysed here), the filing of such affidavits was ordered to ensure adherence to the Supreme Court’s suggested police reforms.
Although the Supreme Court’s Order will likely bring some reprieve to persons facing sedition charges, unfortunately, the Order has limited precedential value given the lack of detailed legal reasoning. Further, the Supreme Court’s direction restraining registration of fresh sedition cases, though binding, should have been worded clearly to avoid ambiguity regarding its mandatory nature. Lastly, in the absence of a framework for communicating the Order to law enforcement authorities, implementing it would be challenging. As a mitigant, petitioners may consider requesting the Supreme Court to direct concerned governments to expeditiously communicate the Order to law enforcement authorities across the country and subsequently file affidavits demonstrating compliance.
Part I of this two-part series examined the contours of Rule 16 of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“2021 IT Rules”), and the Bombay High Court’s rationale for refusing to stay the rule in the Leaflet case. This second part examines the legality and constitutionality of Rule 16. It argues that the rule’s constitutionality may be contested because it deprives impacted content publishers of a hearing when their content is restricted. It also argues that the MIB should provide information on blocking orders under Rule 16 so that they can be challenged, both by users whose access to information is curtailed and by publishers whose right to free expression is restricted.
Rule 16’s legality
At its core, Rule 16 is a legal provision granting discretionary authority to the government to take down content. Consistently, the Supreme Court (“SC”) has maintained that to be compliant with Article 14, discretionary authority must be backed by adequate safeguards.[1] Admittedly, Rule 16 is not entirely devoid of safeguards since it envisages an assessment of the credibility of content blocking recommendations at multiple levels (refer Part I for context). But this framework overlooks a core principle of natural justice – audi alteram partem (hear the other side) – by depriving the impacted publishers of a hearing.
In Tulsiram Patel, the SC recognised principles of natural justice as part of the guarantee under Article 14 and ruled that any law or state action abrogating these principles is susceptible to a constitutionality challenge. But the SC also found that natural justice principles are not absolute and can be curtailed under exceptional circumstances. Particularly, audi alteram partem, can be excluded in situations where the “promptitude or the urgency of taking action so demands”.
Arguably, the suspension of pre-decisional hearings under Rule 16 is justifiable considering that the rule’s very purpose is to empower the Government to act with alacrity against content capable of causing immediate real-world harm. However, this rationale does not preclude the provision of a post-decisional hearing under the framework of the 2021 IT Rules. This is because, as posited by the SC in Maneka Gandhi (analysed here and here), the “audi alteram partem rule is sufficiently flexible” to address “the exigencies of myriad kinds of situations…”. Thus, a post-decisional hearing for impacted stakeholders, after the immediacy necessitating the issuance of interim blocking directions has subsided, could have been reasonably accommodated within Rule 16. Crucially, this would create a forum for the State to justify the necessity and proportionality of its speech restriction to the individuals impacted (strengthening legitimacy) and to the public at large (strengthening the rule of law and public reasoning). Finally, in the case of ex-facie illegal content, originators are unlikely to avail of post-facto hearings, mitigating concerns of a burdensome procedure.
Rule 16’s exercise by MIB
Opacity
MIB has exercised its power under Rule 16 of the 2021 IT Rules on five occasions. Collectively, it has ordered the blocking of approximately 93 YouTube channels, 6 websites, 4 Twitter accounts, and 2 Facebook accounts. Each time, MIB has announced content blocking only through press releases issued after the orders were passed, and has not disclosed the actual blocking orders.
MIB’s reluctance to publish its blocking orders renders the manner in which it exercises power under Rule 16 opaque. Although press statements inform the public that content has been blocked, blocking orders are required (under Rule 16(2) and Rule 16(4)) to record the reasons for which the content has been blocked. As discussed above, this opacity limits the right to free expression of the originators of the content and denies them the ability to be heard.
Additionally, content recipients, whose right to view content and access information is curtailed through such orders, are not being made aware of the existence of these orders by the Ministry directly. Pertinently, the 2021 IT Rules appear to recognise the importance of informing users about the reasons for blocking digital content. This is evidenced by Rule 4(4), which requires ‘significant social media intermediaries’ to display a notice to users attempting to access proactively disabled content. However, in the absence of similar transparency obligations upon MIB under the 2021 IT Rules, content recipients aggrieved by the Ministry’s blocking orders may be compelled to rely on the cumbersome mechanism under the Right to Information Act, 2005 to seek the disclosure of these orders to challenge them.
Although the 2021 IT Rules do not specifically mandate the publication of blocking orders by MIB, this obligation can be derived from the Anuradha Bhasin verdict. Here, in the context of the Telecom Suspension Rules, the SC held that any order affecting the “lives, liberty and property of people” must be published by the government, “regardless of whether the parent statute or rule prescribes the same”. The SC also held that the State should ensure the availability of governmental orders curtailing fundamental rights unless it claims specific privilege or public interest for refusing disclosure. Even then, courts will finally decide whether the State’s claims override the aggrieved litigants’ interests.
Considering the SC’s clear reasoning, MIB ought to make its blocking orders readily available in the interest of transparency, especially since a confidentiality provision restricting disclosure, akin to Rule 16 of the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 (“2009 Blocking Rules”), is absent in the 2021 IT Rules.
Overuse
Another concerning trend is MIB’s invocation of its emergency content-blocking power as the norm rather than the exception it was meant to be. For context, the 2021 IT Rules provide a non-emergency blocking process under Rules 14 and 15, whereunder impacted publishers are provided a pre-decisional hearing before an Inter-Departmental Committee required to be constituted under Rule 13(1)(b). However, thus far, MIB has relied exclusively on its emergency power to block ostensibly problematic digital content, including fake news.
While the Bombay High Court in the Leaflet case declined to expressly stay Rule 14 (noting that the Inter-Departmental Committee was yet to be set up) (¶19), the High Court’s stay on Rule 9(3) creates a measure of ambiguity as to whether Rules 14 and 15 are currently in effect. This is because Rule 9(3) states that there shall be a government oversight mechanism to “ensure adherence to the Code of Ethics”. A key part of this mechanism is the Inter-Departmental Committee, whose role is to decide “violation[s] or contravention[s] of the Code of Ethics” (Rule 14(2)). The High Court even notes that it is “incomprehensible” how content may be taken down under Rule 14(5) for violating the Code of Ethics (¶27). Thus, despite the Bombay High Court’s refusal to stay Rule 14, it is arguable that the High Court’s stay on the operation of Rule 9(3), which prevents the ‘Code of Ethics’ from being applied against online news and curated content publishers, may logically extend to Rules 14(2) and 15. However, even if the Union were to proceed on a plain reading of the Leaflet order and infer that the Bombay High Court did not stay Rules 14 and 15, it is unclear whether the MIB has constituted the Inter-Departmental Committee to facilitate non-emergency blocking.
MeitY has also liberally invoked its emergency blocking power under Rule 9 of the 2009 Blocking Rules to disable access to content. Illustratively, in early 2021 Twitter received multiple blocking orders from MeitY, at least two of which were emergency orders, directing it to disable over 250 URLs and a thousand accounts for circulating content relating to farmers’ agitation against contentious farm laws. Commentators have also pointed out that there are almost no recorded instances of MeitY providing pre-decisional hearings to publishers under the 2009 Blocking Rules, indicating that in practice this crucial safeguard has been rendered illusory.
Conclusion
Evidently, there is a need for the MIB to be more transparent when invoking its emergency content-blocking powers. A significant step forward in this direction would be ensuring that at least final blocking orders, which ratify emergency blocking directions, are made readily available, or at least provided to publishers/originators. Similarly, notices to any users trying to access blocked content would also enhance transparency. Crucially, these measures would reduce information asymmetry regarding the existence of blocking orders and allow a larger section of stakeholders, including the oft-neglected content recipients, the opportunity to challenge such orders before constitutional courts.
Additionally, the absence of hearings for impacted stakeholders at any stage of the emergency blocking process under Rule 16 of the 2021 IT Rules limits their right to be heard and to defend the legality of ‘at-issue’ content. Whilst the justification of urgency may be sufficient to deny a pre-decisional hearing, the procedural safeguard of a post-decisional hearing should be incorporated by MIB.
The aforesaid legal infirmities plague Rule 9 of the 2009 Blocking Rules as well, given its similarity with Rule 16 of the 2021 IT Rules. The Tanul Thakur case presents an ideal opportunity for the Delhi High Court to examine and address the limitations of these rules. Civil society organisations have for years advocated (here and here) for incorporation of a post-decisional hearing within the emergency blocking framework under the 2009 Blocking Rules too. Its adoption and diligent implementation could go a long way in upholding natural justice and mitigating the risk of arbitrary content blocking.
[1] State of Punjab v. Khan Chand, (1974) 1 SCC 549; Virendra v. The State of Punjab & Ors., AIR 1957 SC 896; State of West Bengal v. Anwar Ali, AIR 1952 SC 75.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“2021 IT Rules”) were challenged before several High Courts (refer here and here) almost immediately after their promulgation. In one such challenge, initiated by the publishers of the online news portal ‘The Leaflet’, the Bombay High Court, by an order dated August 14, 2021, imposed an interim stay on the operation of Rules 9(1) and (3) of the 2021 IT Rules. Chiefly, this was done because these provisions subject online news and curated content publishers to a vaguely worded ‘code of ethics’, adherence to which would have had a ‘chilling effect’ on their freedom of speech. However, the Bombay High Court refused to stay Rule 16 of these rules, which empowers the Ministry of Information and Broadcasting (“MIB”) to direct blocking of digital content during an “emergency” where “no delay is acceptable”.
Part I of this two-part series examines the contours of Rule 16 and argues that the Bombay High Court overlooked the procedural inadequacy of this rule when refusing to stay the provision in the Leaflet case. Part II assesses the legality and constitutionality of the rule.
Overview of Rule 16
Part III of the 2021 IT Rules authorises the MIB to direct the blocking of digital content in the case of an ‘emergency’.
The MIB has correctly noted that Rule 16 is modelled after Rule 9 of the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 (“2009 Blocking Rules”) (analysed here), and confers upon the MIB similar emergency blocking powers which the Ministry of Electronics and Information Technology (“MeitY”) has possessed since 2009. Both provisions confer discretion upon authorised officers to determine what constitutes an emergency but fail to provide a hearing to impacted publishers or intermediaries at any stage.
Judicial findings on Rule 16
The Bombay High Court’s order in the Leaflet case is significant since it is the first time a constitutional court has recorded its preliminary findings on the rule’s legitimacy. Here, the Bombay High Court refused to stay Rule 16 primarily for two reasons. First, the High Court held that Rule 16 of the 2021 IT Rules is substantially similar to Rule 9 of the 2009 Blocking Rules, which is still in force. Second, the grounds upon which Rule 16 permits content blocking are coextensive with the grounds on which speech may be ‘reasonably restricted’ under Article 19(2) of the Indian Constitution. Respectfully, the plausibility of this reasoning is contestable:
Equivalence with the 2009 Blocking Rules: Section 69A of the IT Act and the 2009 Blocking Rules were previously challenged in Shreya Singhal, where both were upheld by the Supreme Court (“SC”). However, establishing an equivalence between Rule 16 of the 2021 IT Rules and Rule 9 of the 2009 Blocking Rules to understand the constitutionality of the former would have been useful only if Shreya Singhal contained a meaningful analysis of Rule 9. The SC did not examine this rule; rather, it broadly upheld the constitutionality of the 2009 Blocking Rules as a whole due to the presence of certain safeguards, including that: (a) the non-emergency process for content blocking under the 2009 Blocking Rules includes a pre-decisional hearing for identified intermediaries/originators before content is blocked; and (b) the 2009 Blocking Rules mandate the recording of reasons in blocking orders so that they may be challenged under Article 226 of the Constitution.
However, the SC did not consider that the emergency blocking framework under Rule 9 of the 2009 Blocking Rules not only allows MeitY to bypass the essential safeguard of a pre-decisional hearing for impacted stakeholders but also fails to provide them with either a written order or a post-decisional hearing. It also did not address that Rule 16 of the 2009 Blocking Rules, which mandates confidentiality of blocking requests and subsequent actions, empowers MeitY to refuse disclosure of blocking orders to impacted stakeholders, thus depriving them of the opportunity to challenge such orders.
Thus, the Bombay High Court’s attempt in the Leaflet case to claim equivalence with Rule 9 of the 2009 Blocking Rules as a basis to defend the constitutionality of Rule 16 of the 2021 IT Rules was inapposite since Rule 9 itself was not substantively reviewed in Shreya Singhal, and its operation has since been challenged on constitutional grounds.
Procedural safeguards: Merely because Rule 16 of the 2021 IT Rules permits content blocking only under the circumstances enumerated under Article 19(2) does not automatically render it procedurally reasonable. In People’s Union of Civil Liberties (“PUCL”), the SC examined the procedural propriety of Section 5(2) of the Telegraph Act, 1885, which permits phone-tapping. Even though this provision restricts fundamental rights only on constitutionally permissible grounds, the SC found that the substantive law had to be backed by adequate procedural safeguards to rule out arbitrariness. Although the SC declined to strike down Section 5(2) in PUCL, it framed interim guidelines to govern the provision’s exercise and compensate for the lack of adequate safeguards.
Since Rule 16 restricts the freedom of speech, its proportionality should be tested as part of any meaningful constitutionality analysis. To be proportionate, restrictions on fundamental rights must satisfy four prongs[1]: (a) legality – the requirement of a law having a legitimate aim; (b) suitability – a rational nexus between the means adopted to restrict rights and the end of achieving this aim; (c) necessity – proposed restrictions must be the ‘least restrictive measures’ for achieving the aim; and (d) balancing – a balance between the extent to which rights are restricted and the need to achieve the aim. Justice Kaul’s opinion in Puttaswamy (9JB) also highlights the need for procedural safeguards against the abuse of measures interfering with fundamental rights (para 70, Kaul J).
Arguably, by demonstrating the connection between Rule 16 and Article 19(2), the Bombay High Court has shown that Rule 16 potentially satisfies the ‘legality’ prong. However, even at an interim stage, before finally ascertaining Rule 16’s constitutionality by testing it against the other proportionality parameters identified above, the Bombay High Court should have considered whether the absence of procedural safeguards under this rule merited staying its operation.
For these reasons, the Bombay High Court could have ruled differently in deciding whether to stay the operation of Rule 16 in the Leaflet case. While these are important considerations at the interim stage, ultimately the larger question of constitutionality must be addressed. The second post in this series will critically examine the legality and constitutionality of Rule 16.
[1] Modern Dental College and Research Centre and Ors. v. State of Madhya Pradesh and Ors., (2016) 7 SCC 353; Justice K.S. Puttaswamy & Ors. v. Union of India (UOI) & Ors., (2019) 1 SCC 1; Anuradha Bhasin and Ors. v. Union of India (UOI) & Ors., (2020) 3 SCC 637.
On 25 February 2021, the Central Government notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (‘2021 Rules’). These Rules have been the subject of much controversy as social media intermediaries and media houses have challenged them in various High Courts across the country. The Bombay High Court in AGIJ Promotion of Nineteenonea Media v Union of India stayed the operation of Rule 9(1) and Rule 9(3), the former provision mandating adherence to the ‘Code of Ethics’ and the latter creating a three-tiered structure to regulate online curated content. The High Court held that these rules contravened Article 19(1)(a) of the Constitution and transgressed the rule-making power delegated by the Information Technology Act, 2000 (‘IT Act’). This was affirmed by the Madras High Court in Digital News Publishers Association v Union of India, which noted that the order passed by the Bombay High Court had a pan-India effect.
While the Information Technology (Intermediaries Guidelines) Rules, 2011 applied solely to intermediaries, the 2021 Rules cover both intermediaries and publishers of digital content, including OTT platforms (which fall under ‘publishers of online curated content’). At the outset, the departure from utilising existing legislations such as the Cinematograph Act, 1952, or the Cable Television Networks (Regulation) Act, 1995, and invoking the IT Act to regulate publishers of film and television content is curious. The aforementioned Bombay High Court judgement addressed this, observing that fields which stood occupied by independent legislations could not possibly be brought within the purview of the 2021 Rules.
The regulation of OTT platforms assumes particular significance given the recent controversies concerning web series that allegedly contain objectionable content or offend religious beliefs. For instance, FIRs were lodged against the makers of the web series Tandav, which led to Amazon Prime Video’s India head moving the Supreme Court for protection against arrest. Similarly, Netflix’s A Suitable Boy also triggered a police case after a political leader found objectionable a scene wherein the protagonist kissed a Muslim boy at a Hindu temple. FIRs have also been registered against the makers and producers of Mirzapur for offending religious beliefs, and a petition has been filed before the Supreme Court for allegedly portraying the Uttar Pradesh district of the same name in a negative manner.
This blog will first set out how the 2021 Rules are applicable to OTT platforms. Second, it will examine whether the regulatory mechanisms conceived by the 2021 Rules provide unduly broad censorial powers to the Central Government, potentially threatening free speech and expression guaranteed by the Indian Constitution.
The 2021 Rules and OTT Platforms
In February 2019, the Ministry of Electronics and Information Technology (‘MeitY’) told the Delhi High Court that the IT Act already provided stringent provisions for website blocking (under Section 69A) in case of illegal content on OTT platforms and that, therefore, no mandamus could be issued to the Centre for framing general guidelines or separate provisions for OTT content. However, in February 2021, amidst rising controversies revolving around various shows, the Centre notified the 2021 Rules, Part III of which is titled “Code of Ethics and Procedure and Safeguard in Relation to Digital/Online Media”.
Rule 2(u) of the 2021 Rules defines “publisher of online curated content” as any publisher who makes available to users, on demand, audio-visual content (that is owned or licensed by the publisher) via a computer resource over the internet. OTT platforms such as Netflix, Amazon Prime Video, and Disney+Hotstar squarely fall within the ambit of such ‘publishers of online curated content’. Under Rule 8(2) of the 2021 Rules, such publishers are bound by Part III of the 2021 Rules, while Rule 9 requires such publishers to adhere to the ‘Code of Ethics’ found in the Appendix to the 2021 Rules. This Code lays down five broad principles, ranging from age classification of content to exercising due caution and discretion while depicting India’s multi-cultural background.
Perhaps the most salient feature of Part III is its three-tier structure for the redressal of grievances against content, which is applicable to both publishers of news and current affairs and publishers of online curated content. Any complaint that a publisher’s content violates the Code of Ethics or that the publisher is in breach of any rule in Part III of the 2021 Rules is addressed through the following structure: Level I – self-regulation by the publisher, which must appoint a grievance officer to address complaints; Level II – a self-regulating body constituted by publishers or their associations; and Level III – an oversight mechanism operated by the Central Government through the Ministry of Information and Broadcasting.
Beyond the 2021 Rules, the Ministry of Information & Broadcasting (‘MIB’) will also establish an “Online Grievance Portal”, where any person who objects to the content of a publisher can register their grievance. Such a grievance will be electronically directed to the publisher, the Ministry, as well as the self-regulating body.
The impact of the 2021 Rules
Films released in theatres in India are subject to pre-certification by the Central Board of Film Certification (‘CBFC’) as per the Cinematograph Act, 1952, and television programmes are governed by the Cable Television Networks (Regulation) Act, 1995. However, OTT platforms had, till now, escaped the scrutiny of the law due to an absence of clarity as to which Ministry would regulate them, i.e., MeitY or the MIB. The matter was resolved in November 2020 when the Government of India (Allocation of Business) Rules, 1961 were amended to include “Films and Audio-Visual programmes made available by online content providers” within the ambit of the MIB.
Overregulation and independent regulatory bodies
The 2021 Rules pose a danger of overregulation vis-a-vis OTT platforms; they promote self-censorship and potentially increase government oversight over digital content. Beginning with the second tier of the mechanism established by the 2021 Rules, Level II requires a self-regulatory body to be set up, headed by a Supreme Court or High Court Judge, or an independent eminent person from the field of media, broadcasting, entertainment, child rights, human rights or such other field; the members of this body, not exceeding six, are experts from various fields. Rule 12(3) dictates that the self-regulating body, after its constitution, needs to register itself with the MIB. However, this registration is predicated upon the subjective satisfaction of the MIB that the body has been constituted according to Rule 12(2) and has agreed to perform the functions laid down in sub-rules (4) and (5). This effectively hinders the independence of the body, as the Rules fail to circumscribe the discretion that the MIB may exercise in refusing registration.
This self-regulating body can sit in appeal as well as issue guidance or advisories to the publishers, including requiring the issuance of apologies or inclusion of warning cards by publishers. However, decisions pertaining to the need to take action to delete or modify content, or instances where the publisher fails to comply with guidance or advisories of the body, are to be referred to the Oversight Mechanism under Rule 13 [Rules 12(5)(e) and 12(7)].
Additional concerns arise at Level III – the Oversight Mechanism under Rule 13. This Oversight Mechanism requires the MIB to form an Inter-Departmental Committee (‘IDC’), which shall consist of representatives from various other Ministries; the Chairperson of this Committee is an Authorised Officer appointed by the MIB. Rule 14(2) stipulates that the Committee shall meet periodically to hear complaints arising out of grievances with respect to decisions taken at Level I or II, or complaints referred to it directly by the MIB. This may pose certain challenges: the IDC, which is constituted and chaired by the MIB and consists of individuals from other Ministries, will effectively also preside over complaints referred to it by the MIB. Furthermore, the recommendations of the IDC are made to the MIB itself for the issuance of appropriate orders and directions for compliance. This has the potential to create a conflict of interest, and it offends the principle of natural justice that no one should be a judge in their own cause.
A bare perusal of the functions of Level II and Level III reveals that the powers bestowed upon the self-regulating body and the IDC overlap to a great extent. The self-regulating body may be rendered irrelevant, as decisions regarding the modification or removal of content, or punishment of the publisher for failure to comply, rest with the IDC. As the IDC is constituted by the MIB and its recommendations are referred to the MIB for the issuance of orders to publishers, for all intents and purposes, the Central Government has the final say over the online content that can be published by OTT platforms. This may make publishers wary and could have a chilling effect on freedom of speech and expression, as content unfavourable to or critical of the government in power may be referred to the IDC/MIB and blocked.
The IDC has considerable discretion when it comes to its position as an Appellate Authority. More importantly, Rule 16, which allows the Authorised Officer to block content under Section 69A of the IT Act in any case of emergency, has potential for misuse. To confer upon one individual appointed by the MIB the power to block content, without providing an opportunity for a hearing to the publisher, is excessive and does not provide sufficient procedural safeguards; an issue that was glossed over by the Supreme Court while upholding the constitutionality of Section 69A and the Information Technology (Blocking Rules), 2009, in Shreya Singhal v Union of India.
In Hiralal M. Shah v The Central Board of Film Certification, Bombay, an order of the Joint Secretary to the Government of India directing that a Marathi feature film not be certified for public exhibition was challenged, and the Bombay High Court held that the Joint Secretary was not qualified to judge the effects of the film on the public, nor did he have experience in the examination of films. The High Court observed that allowing a bureaucrat to sit in judgement over the same would make “a mockery of the substantive right of appeal conferred on the producer”. According to the Court, it was difficult to comprehend why an informed decision by an expert body, i.e. the Film Certification Appellate Tribunal constituted under the Cinematograph Act, 1952, was to be replaced with the moral standards of a bureaucrat. A similar mechanism for regulation is being constructed by way of the 2021 Rules.
The three-tier mechanism stipulated by the 2021 Rules also raises the question of why OTT platforms need to be regulated under the IT Act in the first place. If regulation is required, then instead of adverting to the IT Act or to the Cinematograph Act, 1952, which regulates traditional media, the regulatory system envisaged under the Cinematograph Act could be emulated to some extent in an alternate legislation solely governing OTT platforms. While the Cinematograph Act may be inadequate in terms of regulating new media, the current IT Rules stretch the boundaries of the rule-making power delegated under the IT Act by delving into an area of regulation that is not permissible under that Act.
The 2021 Rules are subordinate legislation, and it remains contested whether Part III of the Rules could have been promulgated using the rule-making power conferred on the Central Government under the IT Act. In State of Tamil Nadu v P. Krishnamoorthy, the Supreme Court held that delegated legislation could be challenged if there was a failure to conform to the statute under which it was made, if it exceeded the limits of authority conferred by the enabling Act, or if there was manifest arbitrariness or unreasonableness (to an extent where the Court may say that the legislature never intended to give authority to make such rules). With respect to the 2021 Rules, the conferral of such broad and arbitrary powers on entities that could restrict fundamental rights under Articles 19(1)(a) and 19(1)(g) should stem from a parent Act that lays down the objective and purpose driving such regulation. The IT Act only regulates content to the extent of specific offences under Sections 66F, 67, 67A, 67B, etc., which are to be judicially assessed, while Section 79 lays down guidelines that intermediaries must follow to avail of safe harbour. However, by introducing a distinct class of entities that must adhere to “digital media ethics” and constitute their own regulatory bodies, there is prima facie overreach by the 2021 Rules.
Are the IT Rules Violative of the Constitutional Rights of Free Speech and Expression?
The three-tier mechanism under the 2021 Rules may have a chilling effect on creators and producers, who may be disincentivised from publishing and distributing content that could potentially be considered offensive to even a small section of society. For example, even in the absence of the 2021 Rules, the makers of Tandav agreed to make voluntary cuts and tendered an apology. Similarly, despite the partial stay of the 2021 Rules by the High Courts of Bombay and Madras, OTT platforms have stated that they will play it safe and exercise restraint over potentially controversial content. Under the 2021 Rules, criticism that offends the sensibilities of an individual could potentially result in a grievance under Part III, ultimately leading to content being restricted.
In addition to this, the Code of Ethics appended to Part III states that a publisher shall “exercise due caution and discretion” in relation to content featuring the activities, beliefs, practices, or views of any racial or religious group. This ambiguously framed, higher degree of responsibility may restrict the artistic expression of OTT platforms. In Shreya Singhal v Union of India, the Supreme Court struck down Section 66A of the IT Act, holding that “where no reasonable standards are laid down to define guilt in a section which creates an offence and where no clear guidance is given to either law abiding citizens or to authorities and courts, a section which creates an offence and which is vague must be struck down as being arbitrary and unreasonable”. By stating that the Constitution did not permit the legislature “to set a net large enough to catch all possible offenders and leave it to the Court to step in and decide who could be held guilty”, the Supreme Court decisively ruled that a law which is vague would be void. Although a breach of the 2021 Rules does not have penal consequences, the Code of Ethics utilises open-ended, broad language whose interpretation could confer excessive discretion on the IDC in deciding what content to remove.
Under India’s constitutional structure, free expression can only be limited to the extent prescribed by Article 19(2), and courts scrutinise any restrictions of expression stringently due to the centrality of free speech and expression to the continued maintenance of constitutional democracy. In S. Rangarajan v P. Jagjivan Ram, the Supreme Court observed that the medium of a movie was a legitimate mode of addressing issues of general concern. Further, the producer had the right to ‘think out’ and project his own message despite the disapproval of others; “it is a part of democratic give-and-take to which no one could complain. The State cannot prevent open discussion and open expression, however hateful to its policies”. The Apex Court further stated that it was the duty of the State to protect the freedom of expression. In K.A. Abbas v Union of India, the Supreme Court upheld the constitutionality of censorship under the Cinematograph Act, but cautioned that censorship could only be in the interest of society, and that if it ventured beyond this arena, it could be questioned on the ground that a legitimate power was being misused.
In the aforementioned cases, the courts, while upholding censorship guidelines, acknowledged that censorship had to be grounded within the four corners of Article 19(2), and that the standard for censorship had to be that of an ordinary individual of common sense and prudence, not that of a hypersensitive individual. However, in recent times, there have been regular outcries against films and web series which may offend the sensitivities of certain sections of the public. It must be noted that the Government also has a duty to protect the speakers of unpopular opinions, and restrictions on the freedom of speech must only be a last resort when the situations provided for in Article 19(2) (e.g., public order or security of the State) are at stake. Such an approach would help allay the concerns of publishers who may otherwise either refrain from creating potentially controversial content or remove or modify scenes.
Conclusion
A mechanism that risks the overregulation of content on OTT platforms, and that grants significant discretion to the Ministry by way of the formation of the IDC, has the potential to dilute constitutional rights. Further, with India’s burgeoning influence as a producer of cultural content, such a rigid and subjective manner of regulation inhibits artistic expression and may have a chilling effect on the exercise of free speech and expression. The publishing of content on OTT platforms differs from traditional broadcasting in the way it is made available to the public. Streaming on OTT platforms is based on an ‘on-demand’ principle, where viewers actively choose the content they wish to consume, and it may therefore require specialised regulation. A balanced approach should be adopted for the regulation of OTT platforms, one that adheres to the values embedded in the Constitution as well as the guidelines envisioned by the Supreme Court in the judgements discussed above.
This blog was written with the support of the Friedrich Naumann Foundation for Freedom.
About the Author: The author is a 2021 graduate of National Law University, Delhi. She is currently working as a Research Associate with the Digital Media Content Regulatory Council.
Editor’s Note: This post is part of the Reflection Series showcasing exceptional student essays from CCG-NLUD’s Seminar Course on Technology & National Security Law. Along with a companion piece by Tejaswita Kharel, the two essays bring to life a fascinating debate by offering competing responses to the following question:
Do you agree with the Supreme Court’s pronouncement in Anuradha Bhasin that access to the internet is an enabler of other rights, but not a fundamental right in and of itself? Why/why not? Assuming, for the sake of argument, that access to the internet is a fundamental right (as held by the Kerala High Court in Faheema Shirin), would the test of reasonableness of restrictions be applied differently, i.e. would this reasoning lead to a different outcome on the constitutionality (or legality) of internet shutdowns?
Both pieces were developed in the spring semester of 2020 and do not reflect subsequent factual developments vis-a-vis COVID-19 or the ensuing pandemic.
INTRODUCTION
Although it did little to hold the government accountable for its actions in Kashmir, it would be incorrect to say that the judgment in Anuradha Bhasin v. Union of India is a complete failure. This reflection paper evaluates the lessons learnt from Anuradha Bhasin and argues in favour of access to the internet as a fundamental right, especially in light of the COVID-19 pandemic.
EXAMINING INDIA’S LEGAL POSITION ON RIGHT TO INTERNET
Perhaps the greatest achievement of the Anuradha Bhasin judgement is that the Government is no longer allowed to pass confidential orders to shut down the internet for a region. Moreover, the reasons behind internet shutdown orders must not only be available for public scrutiny but must also be reviewed by a Committee. The Committee will need to scrutinise the reasons for the shutdown and benchmark them against the proportionality test. This includes evaluating the pursuit of a legitimate aim, the exploration of suitable alternatives, and the adoption of the least restrictive measure, while also making the order available for judicial review. The nature of the restriction and its territorial and temporal scope will be relevant factors in determining whether it is proportionate to the aim sought to be achieved. The Court also held that fundamental rights extend to the virtual space with the same protections. In this regard, the Court made certain important pronouncements on the right to freedom of speech and expression. These elements will not be discussed here as they fall outside the scope of this paper.
A few months prior, in 2019, the Kerala High Court recognised access to the internet as a fundamental right. In its judgement in Faheema Shirin v. State of Kerala, the High Court addressed a host of possible issues that arise with a life online. Specifically, the High Court recognised how the internet extends individual liberty by giving people a choice to access the content of their choice, free from control of the government. The High Court relied on a United Nations General Assembly Resolution to note that the internet “… facilitates vast opportunities for affordable and inclusive education globally, thereby being an important tool to facilitate the promotion of the right to education…” – a fact that has only strengthened in value during the pandemic. The Kerala High Court held that since the Right to Education is an integral part of the right to life and liberty enshrined under Article 21 of the Constitution, access to the internet becomes an inalienable right in and of itself. The High Court also recognised the value of the internet to the freedom of speech and expression, holding that access to the internet is protected under Art. 19(1)(a) of the Constitution and can be restricted on grounds consistent with Art. 19(2).
ARGUING IN FAVOUR OF RIGHT TO INTERNET
During the pandemic, a major reason some of us have any semblance of freedom and normalcy in our lives is the internet. At a time when many aspects of our day-to-day lives have moved online, including education, healthcare, and shopping for essential services, the fundamental importance of the internet should not even be up for debate. The Government also uses the internet to disseminate essential information. In 2020 it used a contact-tracing app (Aarogya Setu) which relied on the internet for its functioning. There also exists a WhatsApp chatbot to give accurate information about the pandemic. The E-Vidya Programme was launched by the Government to allow schools to become digital. In times like these, the internet is not merely one of the means to access constitutionally guaranteed services; it is the only way.
In this context, the right of access to the internet should be read as part of the Right to Life and Liberty under Art. 21. Therefore, internet access should be subject to restrictions only on the basis of procedures established by law. To better understand what shape such restrictions could take, lawmakers and practitioners can seek guidance from another recent addition to the list of rights promised under Art. 21: the right to privacy. The proportionality test was laid down in the Puttaswamy I judgment and reiterated in Puttaswamy II (“Aadhaar Judgement”). In the Aadhaar Judgement, while describing the proportionality standard for reasonable restrictions, the Supreme Court stated:
“…a measure restricting a right must, first, serve a legitimate goal (legitimate goal stage); it must, secondly, be a suitable means of furthering this goal (suitability or rational connection stage); thirdly, there must not be any less restrictive but equally effective alternative (necessity stage); and fourthly, the measure must not have a disproportionate impact on the right-holder (balancing stage).”
This excerpt from Puttaswamy II provides a defined view of the proportionality test upheld by the Court in Anuradha Bhasin. It means that, before passing an order to shut down the internet, the appropriate authority must assess whether the order aims to meet a goal of sufficient importance to override a constitutionally protected right; more specifically, whether the goal falls under the category of reasonable restrictions as provided for in the Constitution. Next, there must be a rational connection between this goal and the means of achieving it. The appropriate authority must ensure that no alternative method can achieve this goal with just as much effectiveness, and that the method being employed is the least restrictive one. Lastly, the internet shutdown must not have a disproportionate impact on the right-holder, i.e. the citizen whose right to freedom of expression or right to health is being affected by the shutdown. These reasons must be put down in writing and be subject to judicial review.
Based on the judgment in Faheema Shirin, an argument can be made that the pandemic has further highlighted the importance of access to the internet, not created it. The Government’s reliance on becoming digital, with e-governance and digital payment platforms, shows an intention to usher the country into a world with more of an online presence than ever before.
CONCLUSION
People who are without access to the internet right now* – people in Kashmir, who have access to only 2G internet on mobile phones, or those who do not have the socio-economic and educational means to access the internet – are suffering. Not only are they being denied access to education, but the lack of access to updated information about a disease we are still learning about could prove fatal. Given the importance of the internet at this time of crisis, and for the approaching future, where people would want to avoid being in crowded classrooms, marketplaces, or hospitals, access to the internet should be regarded as a fundamental right.
This is not to say that the Court’s recognition of this right alone can herald India into a new world. The recognition of the right to access the internet will only be a welcome first step towards bringing the country into the digital era. The right to access the internet should also be made a socio-economic right, which, if implemented robustly, would have far-reaching consequences such as greater ease of social mobility, increased innovation, and the fostering of greater creativity.
*Views expressed in the blog are personal and should not be attributed to the institution.