2024 will be a crucial year, as India is in the process of reforming its content regulation. Europe's Digital Services Act is in its implementation stage, and as we proceed in India, two factors are of importance.

First, intermediary liability exemptions must be maintained. They prevent private companies from having to interpret legal provisions themselves, and thereby from defining the limits of freedom of expression. They also encourage tech companies to adopt more precise internal content moderation policies to target undesirable content.

The DSA, adopted in 2022, is the new horizontal legislation establishing a general framework for providing online services within the EU. Despite all the novelties in the obligations it places on platforms vis-à-vis illegal and harmful content, the DSA does not repeal the principle that intermediaries are immune from liability for content they do not know about.

Illegal content

The DSA also holds online platforms responsible for fulfilling important obligations vis-à-vis the presence of illegal content online, via mechanisms such as notice-and-action, orders to provide information to competent administrative and judicial authorities, and orders to act against illegal content.

It does not define specific categories of unlawful speech online since the existing rules in the offline environment already cover this. It incorporates new rights for users and obligations for service providers in terms and conditions, transparency requirements, statements of reasons in cases of content removals, complaint handling systems, and out-of-court dispute settlements, among others. Safe harbour, a foundational principle of internet regulation, must be seen as one of the guardrails that allow platforms to properly moderate content while at the same time protecting fundamental rights and avoiding excessive or arbitrary decisions.

Secondly, the DSA also gives platforms a significant role (and legal accountability) in adjudicating user harms such as revenge porn, cyber-flashing, dark web activity, and the dissemination of disinformation. Caution is needed when laws impose due diligence obligations on platforms regarding content that causes potential harm but is not necessarily illegal. In such cases, the risk of overbroad restrictions on freedom of expression is particularly high.

Very large online platforms (entities with more than 45 million monthly active users) have special and particularly relevant obligations under the DSA, which will be monitored not by national regulators but by the European Commission itself.

Apart from a controversial crisis response mechanism for extraordinary circumstances posing a serious threat to public security or public health, these platforms are obliged to identify, analyse, assess, and adopt the necessary measures to mitigate any systemic risks stemming from the design, functioning, and use made of their services.

These systemic risks are associated not only with the dissemination of illegal content or adverse effects on human rights, but also with much broader categories such as civic discourse, public security, gender-based violence, public health, the protection of minors, and users' physical and mental well-being.

However, the DSA also incorporates a series of criteria and guardrails to ensure effectiveness in risk assessment:

Definition by law of possible measures.

Due consideration of fundamental rights.

Independent audits.

Public-private collaboration via co-regulatory mechanisms and regulatory cooperation between private companies and regulatory authorities.

Therefore, any approach emulating these ‘systemic’ obligations should carefully consider their novelty to avoid unintended harmful outcomes.

Rizvi is Founding Editor, The Dialogue. Barata is Senior Legal Fellow, Justicia, and Intermediary Liability Fellow, Stanford University
