On Saturday, 11 April, SFLC.in organised a stakeholder consultation on the second draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“Draft Amendments”). The consultation brought together legal experts, journalists, civil society members, and platform representatives to examine the proposed changes.
Impacts on Intermediary Liability –
- Regulatory Trajectory Since the IT Rules, 2021: It was argued that the most significant shift in power had already taken place with the IT Rules, 2021. Those rules moved away from the standard set in Shreya Singhal v. Union of India by allowing a wider range of takedown requests, including from users, which increased regulatory pressure on platforms. The present Draft Amendments were seen as continuing this trajectory, further strengthening an already expanding system of control over online speech rather than introducing a new framework for platform governance.
- Expansion of Executive Authority Through Rule-Making: A key concern raised was that the Draft Amendments significantly expand executive power. Requiring platforms to comply with advisories and guidelines effectively turns non-binding executive directions into binding legal obligations. This enables the Executive to bypass Parliament, since such instruments neither undergo the parliamentary scrutiny required under Section 87(3) of the IT Act, 2000, nor any form of public consultation. It was also observed that the expanded role of the Inter-Departmental Committee (IDC), particularly its ability to take up broad “matters,” concentrates decision-making authority within the Executive, raising concerns about the erosion of institutional checks and balances.
- Safe Harbour Being Made Conditional: It was noted that safe harbour has, in some cases, been used by platforms to avoid accountability and by bad actors to spread harms such as misinformation, hate speech, and scams. At the same time, it was emphasised that weakening safe harbour is not an appropriate response. The principle of safe harbour is essential for protecting free speech online, and it should be seen as a statutory protection, not as a concession granted to platforms by the government. In the Indian context, the loss of safe harbour carries the additional risk of criminal liability, owing to the many provisions that penalise certain types of speech. This creates strong incentives for platforms to over-comply with takedown requests – through mechanisms like automated content filters and preventive takedowns.
- Disproportionate Compliance Burdens: Concerns were raised about the disproportionate impact on smaller and non-profit platforms, which may lack the resources to comply with short timelines for takedown requests. It was noted that such platforms often carefully review both the content and the takedown notices they receive to verify their credibility, which may not be feasible within compressed timeframes. Knowledge platforms that rely on user contributions were identified as particularly vulnerable, as contributors may withdraw participation due to increased legal risks, affecting the availability of reliable information.
Impacts on Content Takedown –
- Incentives for Over-Removal: It was observed that platforms are already under considerable pressure to comply with takedown requests, often without adequate time for review. The Draft Amendments are likely to intensify this pressure, leading to “collateral censorship,” where more speech is removed than necessary in order to minimise legal risk and ensure compliance with the legal mandate. It was also noted that users may become more cautious about what they post, particularly on political or sensitive issues. Concerns were also raised about the possibility of selective or targeted censorship in such an opaque system, a pattern already observed under the existing rules.
- Centralised Content Takedown System: The discussion highlighted the emergence of an expanding “censorship infrastructure” built through successive amendments like the Sahyog Rules and the SGI Rules. It was observed that these changes reduce regulatory friction, making it easier and faster for the government to enforce takedowns at scale. At the same time, concerns were raised about opacity, with users often unaware of what content has been removed, who ordered it, under what law, or how such decisions can be challenged. For almost all recent takedowns, the only justification offered has been a vague “legal demand”.
Impacts on Free Speech, Journalism and Users –
- Code of Ethics on Ordinary Users: A major concern was the blurring of the distinction between users and publishers. By extending the Code of Ethics to non-publisher users, the Draft Amendments potentially bring ordinary online activity within regulatory scope. Given the broad definition of “news and current affairs,” even sharing or commenting on news could attract scrutiny. It was suggested that this may discourage participation in public discourse, particularly given the lack of clarity on what actions constitute sharing of “news and current affairs content”: questions arose as to whether reposting, citing, or merely relying on a news report could attract this Rule and lead to liability.
- Implications for Journalism: It was observed that the news ecosystem is already subject to significant regulation, and extending similar obligations to users may restrict the circulation of information. At the same time, platforms themselves influence public discourse through algorithmic curation and amplification. When combined with increased regulatory pressure, this may further shape what information is visible to users. While acknowledging that harms such as misinformation and hate speech are real, it was argued that the IT Rules are not the appropriate tool to address these issues, and that regulation should instead be undertaken through proper legislation. In fact, with public interest tools like the Community Notes feature on X (formerly Twitter) possibly being affected by the Draft Amendments, it is likely that misinformation on the platform will increase in scale.
Wider Policy Implications –
- Use of Delegated Legislation for Substantive Policy Changes: It was argued that changes of this scale should not be introduced through delegated legislation or executive directions, but through laws enacted by Parliament. The use of advisories and guidelines, which do not require public consultation, was seen as undermining democratic processes, especially when compliance with such instruments is effectively made mandatory.
- Limits of Judicial Oversight: Concerns were expressed about whether judicial oversight can keep pace with frequent amendments. Legal challenges take time, and new rules may be introduced before a verdict is reached. For example, the Fact Check Unit Amendments, 2023, which attempted to do very similar things as the Draft Amendments, were struck down by the Bombay High Court, and the matter is currently pending adjudication before the Supreme Court. This raises the possibility that if these Draft Amendments come into force, they may become entrenched before they are meaningfully reviewed by the courts.
- Potential Influence on other Jurisdictions: It was observed that India’s regulatory approach may influence other jurisdictions, particularly in the Global South. There is a risk that similar frameworks could be adopted elsewhere, leading to broader restrictions on online speech globally.
