Here we will provide you with information about everything that has happened and is happening with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. We provide a quick rundown of their contentious history, explain the need to start afresh on this vital conversation about platform accountability, and analyse the present draft of the rules. Given that the draft now seeks to regulate even digital news platforms and OTT content providers, we recommend a fresh consultation and greater transparency. Several proposals in their present form suffer from unconstitutionality and will undermine free expression and privacy for internet users in India.
A controversial past
Just to be clear, we believe that social media is broken. The large platforms owe us accountability and user rights. But a recent draft government proposal we have accessed makes this problem worse. It will increase censorship and reduce user privacy. Before we get into the specifics of the proposal, let us just do a quick recap of the history of this issue.
On December 24, 2018, the Indian Express first broke the news about a “confidential” meeting in which proposed amendments to the rules under Section 79 of the Information Technology Act, 2000 (IT Act) were discussed. Section 79 of the IT Act provides what is called a ‘safe harbour’ to intermediaries who host user generated content and exempts them from liability for the actions of users on their platform if they adhere to guidelines prescribed by the government. The existing guidelines that intermediaries are required to comply with are present under the Information Technology (Intermediaries Guidelines) Rules, 2011 but the government has been contemplating changes to these guidelines since December 2018.
After we learnt about the private meeting between government officials and company representatives regarding amendments to the guidelines for intermediary liability, we made available a complete copy of the draft rules because there was substantial public interest at stake. While, at first, the Ministry of Electronics and Information Technology (MeitY) denied any knowledge of any such meeting and the proposed changes, they subsequently acknowledged them and initiated a public consultation.
Shortly after this, MeitY announced a public consultation on the proposed changes. These were proposals to make changes to the liability exemptions for online platforms or service providers. An intermediary is any service provider which transmits, hosts and publishes user content without exercising editorial control over the content like traditional publishers do. So it can be your ISP, email provider, social media platform, or any web service which allows you to post, upload and publish. Because of their technical ability to connect us to others, host and publish our data, and help us realise our freedom of speech, these platforms mediate our online experience, impact our privacy and hold great power.
We feared this exercise was expending energy and resources on the basis of draft legal rules that lack any clear rationale that has been articulated by MeitY. The proposals themselves were substantively harmful to our fundamental rights to freedom of speech and privacy as we explained in our submissions. We instead advocated for a fresh start and urged the government to first publish a white paper which clearly identified the problems it seeks to solve through the proposed amendments.
One of the contentious provisions which had been proposed by MeitY in the December 2018 version, and which also features in the latest 2021 version of the draft Rules, relates to traceability of originators of information on the internet. The issue of traceability of originators of information on messaging platforms is also the subject of litigation before the Supreme Court in Antony Clement Rubin v. Union of India (T.C. Civil No.189 of 2020). This case originated as a PIL before the Madras High Court seeking linking of Aadhaar with social media accounts. However, during the course of hearings before the Madras High Court, the focus shifted to traceability of originators of information on end-to-end encrypted platforms such as WhatsApp, and the case was later transferred to the Supreme Court.
IFF is an intervenor in this case and we have filed an independent expert report authored by Dr. Manoj Prabhakaran, Professor of Computer Science and Engineering at IIT Bombay, which explains the long-term ineffectiveness of proposals to facilitate traceability on end-to-end encrypted platforms. Prof. Prabhakaran’s report is in addition to our written submissions, which highlight that any attempts to weaken encryption through backdoors or key escrow systems would undermine the privacy and security of all users because these vulnerabilities could be exploited by criminals as well.
Draft Information Technology (Guidelines For Intermediaries And Digital Media Ethics Code) Rules, 2021
Recently, there were multiple disquieting media reports (see here and here) about possible legislation to regulate OTT platforms which stream curated content, such as Netflix, Amazon Prime and Hotstar. This is especially concerning after a recent controversy over the web series Tandav (see here and here), which is the latest in a long line of incidents in which there was public outcry over a perceived slight to religious beliefs. Acting on this, Union Minister for Information and Broadcasting Prakash Javadekar recently stated that guidelines for OTT platforms are being considered by the Government.
It seems that from December, 2018 to February, 2021 the Draft Intermediaries Rules, 2018 have now transformed into the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
Acting in public interest, just like we had released a prior December 2018 version of these Rules and a proposal to amend Section 66A, we are today releasing a copy of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 to the public and calling for a public consultation since these amendments will impact the fundamental rights of all Indian internet users.
‘Regulation’ of Video Streaming Platforms
The first of the three big changes we are highlighting is the regulation of OTT and news media platforms by proxy. We find this grossly unconstitutional. The draft IT Rules, 2021 have, in the Appendix to the Rules, defined a “Code Of Ethics And Procedure And Safeguards In Relation To Digital/Online Media” which shall apply to “applicable entities”. “Applicable entities” as contained in Rule 7 include, “publishers of news and current affairs content”; and “intermediaries which primarily enable the transmission of news and current affairs content”; and “publishers of online curated content”; and, “intermediaries which primarily enable the transmission of online curated content.”
Further, under sub-rule (3) of Rule 8, the Rules lay down a three-tier structure for observance and adherence to the Code which is as follows, “Level I – Self-regulation by the applicable entity”; “Level II – Self-regulation by the self-regulating bodies of the applicable entities”; “Level III – Oversight mechanism by the Central Government.” What does this mean? In all likelihood, government oversight and more censorship.
This oversight mechanism is being created without any clear legislative backing and will increasingly perform functions similar to those performed by the Ministry of Information and Broadcasting for TV regulation. For instance, under Rule 13(4) it now includes powers of censorship such as ordering apology scrolls, and even the blocking of content. All of this is planned without a clear law made by Parliament.
Any potential legislation regarding the OTT video streaming sector in India may cause immense harm to individual rights as well as the national interest. Today, India is no longer merely a consumer but a producer of original, high-quality video content that provides employment and entertainment to audiences locally and globally. It competes actively with other countries such as South Korea, and needs an environment that recognises that traditional cinema or television based regulation may irreparably harm the sector. Any such model of regulation will likely have a substantial impact on citizens’ digital rights, result in economic harm, and also negatively impact India’s growing cultural influence through the production of modern and contemporary video entertainment formats. We have written about more potential harms here.
On February 12, 2021, the Mint also published a report titled, “17 OTT platforms release self-regulation toolkit”, which stated that 17 of India’s largest streaming services have launched a self-regulation toolkit, under the aegis of the Internet and Mobile Association of India (IAMAI), pre-empting government plans to bring out a code for all over-the-top (OTT) services. This toolkit builds upon the Self-Regulation Code of Online Curated Content Providers adopted by 15 Indian streaming services and digital companies. The Code has several issues itself, including providing a significant impetus towards self-censorship. More importantly, the provisions of the code relating to self-regulation may harmonise with the mechanisms laid down in clauses 11, 12, and 13, resulting in both a push towards self-censorship as well as a sizeable amount of government oversight.
Digital News Media
With respect to regulation of news media, several concerns abound. The purview of the Information Technology Act, 2000 does not extend to news media, and so the guidelines do not have the legislative backing to regulate news media. Thus, these Rules are exercising powers far beyond the parent legislation. Additionally, these guidelines are ultra vires section 79 of the IT Act, which provides a ‘safe harbour’ through an exemption from liabilities in certain cases. This makes these guidelines a camouflaged way to indirectly regulate online news media by bringing these platforms under the aegis of the Information Technology Act, 2000 instead of following the due process of parliamentary scrutiny and subsequent legislation.
The vague definition of “publisher of news and current affairs content” may also lead to further arbitrariness. The definition excludes replica e-papers of newspapers from its ambit. Now, will this mean that media publications such as the Caravan magazine, which publishes a physical monthly magazine and was recently at the centre of a social media storm involving the blocking of its Twitter accounts, fall under the category of ‘publishers of news and current affairs content’? This vagueness in definition would allow arbitrariness by the government in the exercise of significant discretionary powers and would allow it to censor media at its pleasure. Such a definition also privileges established media houses, which may have a print newspaper as a significant component of their operations and could thus claim to be exempt from these guidelines. Smaller and independent media houses, on the other hand, may not have the luxury to do so, and instead rely on the internet to disseminate news and information. This discriminatory approach between online news media and traditional print newspapers and legacy media will place additional responsibilities on the former on the basis of an ambiguous definition that lends itself to confusion and uncertainty, and thus potentially stifle independent online news media.
Moreover, it is unclear what purpose is served by requiring significant publishers of news and current affairs content to notify the Broadcast Seva established by the government. For instance, the Press and Registration of Books Act, 1867 required registration because it was aimed at the preservation of copies of books and periodicals containing news printed in India. While this may have been necessary in 1867, such requirements have outlived their archival utility in the modern digital age, with the internet maintaining a record of news and current affairs. Therefore, there is a need to fundamentally re-examine the purpose and scope of the notification requirement present under Rule 16 of the draft IT Rules, 2021. It should also be noted that while the government earlier planned to bring about these changes through legislation which would have to be approved by Parliament, it may now go ahead and implement them through mere executive fiat and bypass legislative debate on this issue.
‘Traceability’ and breaking encryption?
The draft IT Rules, 2021 have, under sub-rule (2) of Rule 5, made it mandatory for a significant social media intermediary providing services primarily in the nature of messaging, such as WhatsApp, Signal or Telegram, to enable the identification of the first originator of the information. This introduces a requirement of traceability which would break end-to-end encryption. It should be noted that previous proposals which sought to implement traceability in a manner compatible with end-to-end encryption have been shown to be vulnerable to spoofing, where bad actors can falsely modify the originator information to frame an innocent person. Further, the originator of a message has no control over who forwards the content, how many times it is forwarded, or in which fora.
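A minimal sketch of the spoofing concern described above: if the “first originator” travels as unauthenticated metadata inside an end-to-end encrypted message, the server cannot inspect or verify it, so any modified client can rewrite it before forwarding. The field and function names here are hypothetical, used purely for illustration.

```python
# If originator identity is carried as a plain metadata field inside the
# encrypted payload, the relaying server cannot verify it -- and a
# malicious client is free to rewrite it.

def forward(message: dict) -> dict:
    """An honest client forwards the message, preserving its metadata."""
    out = dict(message)
    out["hops"] = out.get("hops", 0) + 1
    return out

def forge_originator(message: dict, victim_id: str) -> dict:
    """A malicious client rewrites the unauthenticated originator field,
    framing `victim_id` as the first originator."""
    forged = dict(message)
    forged["originator"] = victim_id
    return forged

msg = {"text": "viral rumour", "originator": "alice", "hops": 0}

relayed = forward(msg)                        # originator preserved
framed = forge_originator(msg, "innocent_user")  # originator forged

assert relayed["originator"] == "alice"
assert framed["originator"] == "innocent_user"
```

Nothing in the forged message distinguishes it from an honest one, which is why an innocent person could end up identified as the “first originator”.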
Many platforms (WhatsApp, Signal and Telegram, but others too) retain minimal user data for electronic information exchange and also deploy end-to-end encryption to provide reliability, security and privacy to users. These protections are used by millions of Indians to prevent identity theft, code injection attacks and many other threats. Encryption becomes even more important now, as more of our lives involve our personal data being aggregated and analysed at a scale that was never possible before. In the past, the Report of the Justice Srikrishna Committee on Data Protection has also criticised the government for mandating low encryption standards in license agreements with telecom service providers because “this poses a threat to safety and security of the personal data of data principals.”
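A toy sketch of why end-to-end encryption keeps the service provider out of the conversation: the two clients derive a shared key via a Diffie–Hellman exchange, and the relaying server, which only sees the public values, cannot recover it. The parameters below are illustrative toys only; real messengers use vetted elliptic curves (such as X25519) and full protocols like the Signal protocol.

```python
import hashlib
import secrets

# Toy modular Diffie-Hellman parameters, for illustration only.
P = 2**255 - 19   # a large prime
G = 5

a = secrets.randbelow(P - 2) + 2   # Alice's secret, never leaves her device
b = secrets.randbelow(P - 2) + 2   # Bob's secret, never leaves his device

A = pow(G, a, P)   # Alice's public value, relayed by the server
B = pow(G, b, P)   # Bob's public value, relayed by the server

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, P)
shared_bob = pow(A, b, P)
assert shared_alice == shared_bob   # both arrive at the same secret

# The shared secret is hashed into a symmetric message key.
key = hashlib.sha256(str(shared_alice).encode()).digest()

# The server saw only (G, P, A, B); recovering `key` from those values
# requires solving the discrete-logarithm problem.
```

Traceability mandates that require the provider to identify originators sit in tension with exactly this property: the provider, by design, holds neither the key nor the plaintext.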
Without involving technical experts in an open consultative process, and without any data protection law or surveillance reform, the government is tinkering with these protections by introducing a requirement of “traceability”. To understand how encryption technologies defend and expand human freedom, we urge you to watch this Independence Day lecture by Dr. Debayan Gupta, Assistant Professor of Computer Science at Ashoka University.
This introduction of a traceability requirement has important consequences for everyday users of online services and should also be seen in the context of the MHA notification which activates the IT (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009 that hold the power to direct, “decryption”. We do not have any proper parliamentary oversight or judicial check on surveillance, and the latest draft rules, if they go through, would be a tremendous expansion in the power of the government over ordinary citizens eerily reminiscent of China’s blocking and breaking of user encryption to surveil its citizens.
In so far as Rule 5(2) requires modifications to the design of encrypted platforms to enable traceability, it is beyond the scope of the parent provision that is Section 79 of the IT Act. Here, it is important to note that the power to prescribe encryption standards and methods originates from Section 84A of the IT Act, and not Section 79, which is a safe harbour provision. Even Section 84A empowers the Central Government to prescribe modes and methods of encryption for the purpose of “secure use of the electronic medium and for promotion of e-governance and e-commerce.” Any attempts to weaken encryption to enable traceability would not satisfy the objective of ensuring secure use of the electronic medium.
Development of AI to automate censorship?
Further, under sub-rule (4) of Rule 5, the draft IT Rules made it mandatory for a significant social media intermediary (such as WhatsApp, Signal, Twitter, Instagram or Facebook etc.) to deploy technology-based measures, including automated tools or other mechanisms to proactively identify information that depicts any act or simulation in any form depicting rape, child sexual abuse or conduct, whether explicit or implicit.
While these automated tools are presently proposed to be deployed in respect of highly objectionable content, such as depictions of rape or child sexual abuse, such a measure is nevertheless alarming. Once technical changes are implemented, systems quickly suffer from function creep. This will be a slippery slope where the use of automated tools is expanded beyond instances of sexual violence and child sexual abuse material. For instance, in 2019, the Central Bureau of Investigation asked social media companies to use PhotoDNA, a technological tool meant to identify child sexual abuse material, to track suspects in regular criminal cases. Moreover, if one reads Rule 5(4) closely, it also extends to a re-upload requirement, when it states that, “or any information which is exactly identical in content to information that has previously been removed or access to which has been disabled on the computer resource of such intermediary under clause (d) of sub-rule (1) of rule 4.”
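The “exactly identical” re-upload clause can be read as simple cryptographic hash matching, sketched below with hypothetical function names. Tools like PhotoDNA instead compute perceptual fingerprints that also match resized or re-encoded copies, which is precisely the capability that invites the broader uses described above.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of a file's bytes. Real deployments
    (e.g. PhotoDNA) use perceptual hashes that survive re-encoding;
    SHA-256 here only catches byte-for-byte identical copies."""
    return hashlib.sha256(data).hexdigest()

# Fingerprints of content previously removed under Rule 4(1)(d).
blocklist: set[str] = set()

def remove_and_block(data: bytes) -> None:
    """Record a removed file so identical re-uploads can be refused."""
    blocklist.add(fingerprint(data))

def is_reupload(data: bytes) -> bool:
    """Check an incoming upload against the blocklist."""
    return fingerprint(data) in blocklist

remove_and_block(b"previously removed file bytes")
assert is_reupload(b"previously removed file bytes")
# A single changed byte defeats exact matching -- one reason platforms
# drift toward fuzzier (and more error-prone) perceptual matching.
assert not is_reupload(b"previously removed file bytes!")
```

The gap between exact matching and perceptual matching is where function creep begins: once fuzzier matching infrastructure exists, expanding the categories of content it scans for is a configuration change, not an engineering project.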
Development of artificial intelligence (AI) tools of censorship is replete with a host of risks, including the underdeveloped and imperfect nature of AI in the current state-of-the-art. AI “learns” by examining vast amounts of data, and the development of a censorship AI is likely to require social media intermediaries to store and examine large amounts of user-generated content that does not in any way relate to the kind of content sought to be censored.
Additionally, coding biases in the development of AI often lead to discrimination, overbreadth and a lack of accountability and transparency. This is of particular concern since the AI seeks to control and monitor the exercise of a user’s fundamental right to freedom of speech and expression. It is necessary to carefully consider whether AI ought to be allowed to regulate the fundamental rights of citizens.
SOS! We urgently need a public consultation!
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 have the potential to fundamentally change how the internet is accessed and used by millions of users across India. Many of the proposed changes continue, with greater finesse, what we had in 2018 first termed a “Chinese model of online surveillance and censorship”. We fear that despite material changes made to the draft rules, and some noticeable improvements in precision, the core and substance of these concerns remain the same. Further, some new areas of regulation have been added without public deliberation.
In some respects it has even become worse, with an illegal and unconstitutional proposed extension of executive power to regulate online news media and video streaming platforms. This makes a fresh consultation, commencing with a white paper, an urgent necessity. The Internet Freedom Foundation believes in a clear need for regulation that is centred within our constitutionality and advances fundamental rights. This would help bring much needed oversight and accountability to big tech. However, the present approach not only fails to adopt this in letter and spirit, but has the potential to chill our voices online and also hurt our privacy.
- Draft Information Technology (Guidelines For Intermediaries And Digital Media Ethics Code) Rules, 2021 (link).
- Submission to the Ministry of Electronics and Information Technology on the Draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018 dated January 30, 2019 (link)
- IFF files independent expert’s submission before Madras HC on PIL relating to encryption and traceability dated August 23, 2019 (link)
- Government censorship and the dire need for transparency dated February 8, 2021 (link)
- Say no to overregulation of digital news media dated December 23, 2019 (link)
- #LetUsChill: Confusion and Opacity over OTT guidelines dated February 4, 2021 (link)
This article first appeared on the Internet Freedom Foundation’s official website. Read the original here.