The new IT rules in their current form do not conform with international human rights norms, according to UN experts.

A group of United Nations special rapporteurs, in a recent communication to the Government of India seen by BusinessLine, urged the Centre to review specific provisions in the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 that “do not appear to meet the requirements of international law and standards related to the rights to privacy and to freedom of opinion and expression.”

The provisions may hinder the right to privacy and freedom of opinion and expression as protected by Articles 17 and 19 of the International Covenant on Civil and Political Rights, acceded to by India on 10 April 1979, the experts have said.

“We are concerned that the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, in their current form, do not conform with international human rights norms,” the experts said in their report.

“As a global leader in technology innovation, India has the potential to develop legislation that can place it at the forefront of efforts to protect digital rights. However, the substantially broadened scope of the Rules is likely to do just the opposite,” they said.

“We would therefore encourage the Government to take all necessary steps to carry out a detailed review of the Rules and to consult with all relevant stakeholders, including civil society dealing with human rights, freedom of expression, privacy rights and digital rights,” they added.

Grounds of restrictions too broad

In their report, the experts argued that the grounds of restrictions on users and content as provided in the rules are too broad.

The report cited examples from Section 3(1)(b) of the Rules, which requires social media intermediaries to perform due diligence for content that is “racially or ethnically objectionable”, “harmful to child”, “impersonates another person”, “threatens the unity... of India”, “is patently false and untrue”, or “is written or published with the intent to mislead or harass a person [...] to cause any injury to any person.”

According to the experts, these terms are “overly broad” and lack sufficiently clear definitions, which may lead to arbitrary application.

“As social media intermediaries deal with a huge amount of content, a rigorous definition of the restriction of freedom of expression is critical for them to protect speeches that are legitimate under international law, such as the expression of dissenting views,” they said.

The experts further expressed concerns about the impact that such “expansive definitions” could have on the independence of India’s digital news media. The interpretation of these terms could be used to “unduly restrict the exchange of ideas and information online,” as per the experts.

Role of companies

The report also expressed concerns over the impact that the obligations on companies to monitor and rapidly remove user content could have on the right to freedom of expression. To meet these obligations and limit liability, companies may “over comply” with takedown requests, as per the report. Companies may also develop digital or automated content moderation and removal systems, which may lead to illegitimate censorship.

Such systems, as witnessed in the past, may be inaccurate at evaluating cultural contexts and identifying illegitimate content.

“We are worried that the short deadlines, coupled with the aforementioned criminal penalties, could lead service providers to remove legitimate expression as a precaution to avoid sanctions,” the experts said.

“Private companies have a responsibility to respect human rights, per the UN Guiding Principles on Business and Human Rights. However, the outsourcing of content moderation by requiring private companies to remove broad categories of content, without an order of a court or independent administrative authority, may run contrary to international standards on freedom of expression,” they added.

The report also highlighted a recent incident in which the Ministry of Electronics and Information Technology directed Twitter to shut down over 1,000 accounts under Section 69A of the Information Technology Act, on 31 January 2021, on the grounds that these accounts were spreading misinformation.

The special rapporteurs also expressed “serious concerns” over individual employees being subjected to criminal liability under the new Rules when less punitive measures are available.

“The severity of the envisaged penalties incentivizes the restriction of content and is likely to have a chilling effect on freedom of expression,” the report said.

Right to privacy

The report further highlighted provisions under Section 4 of the Rules which require social media intermediaries to identify the first originator of a particular message “in the context of the prevention, detection, investigation, prosecution or punishment of an offence.”

“We are seriously concerned that Section 4 may compromise the right to privacy of every Internet user,” the experts said, noting that it may allow authorities access to user data and the restriction of content without any judicial oversight mechanism, which in turn could lead to further human rights violations.

Furthermore, it may lead to a climate of self-censorship.

The experts also expressed concerns that these provisions could, in effect, be applied to potentially any intermediary.

The report has been signed by Irene Khan, Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression; Clement Nyaletsossi Voule, Special Rapporteur on the rights to freedom of peaceful assembly and of association; and Joseph Cannataci, Special Rapporteur on the right to privacy.
