We often come across messages flashing “limited” and “exclusive” on our e-commerce or online streaming applications. Many of us may have signed up for that trial streaming service or subscription, only to be automatically charged when the trial expires.

There are ads where the exit option “X” is so small that we often end up accidentally clicking the ad instead of closing it. Regulators the world over are taking cognisance of these manipulative design practices, commonly referred to as “dark patterns.” By exploiting consumer biases, such designs seek to mislead consumers into making unintended choices. But how harmful are these design choices, and to what extent can legislative interventions protect consumers from such harms?

Recently, the Department of Consumer Affairs released the draft “Guidelines for Prevention and Regulation of Dark Patterns, 2023” (“DP Guidelines”), issued under the Consumer Protection Act, 2019 (“CP Act”), prohibiting platforms from engaging in any “dark patterns.” These guidelines identify specific practices as “dark patterns” liable to be prosecuted under the CP Act. These include creating a false sense of urgency or scarcity to mislead consumers into making an immediate purchase, complicating the process for cancelling subscription services, and masking advertisements as user-generated content.

Global regulation

In the European Union (EU), the Digital Services Act also prohibits online interface design choices that distort or impair a consumer’s ability to make free and informed decisions. Guidelines have also been issued under the EU General Data Protection Regulation on dark patterns in social media platforms.

In the USA, some state privacy laws prohibit dark patterns. Further, the Federal Trade Commission has been actively pursuing actions against online platforms that engage in such practices.

Online choice architecture can be used to hide important information that would inform consumer choices, set default options without accounting for consumer preferences, and redirect attention to specific products or services. Besides distorting consumer behaviour, such choice architectures are especially concerning where an entity holds market power, because they can be used to entrench and leverage that power. As India seeks to prohibit dark patterns under the CP Act, there are important issues that must be considered.

First, when encountering a dark pattern, many consumers may be unaware of the practice, or, even if aware, may assume it is safe. Therefore, regulation of dark patterns must be complemented not only with consumer awareness initiatives but also with tools that help consumers detect such practices.

Second, the Central Consumer Protection Authority must consider engaging with experts in behavioural and data sciences to understand how such online choice architectures work, identify patterns, investigate harms, and design potential remedies. This will be particularly relevant for pursuing any action under the DP Guidelines.

Third, to the extent that the Digital Personal Data Protection Act, 2023 (“DPDP Act”) regulates “consent” and the rights of data principals, one can argue that the DPDP Act also vests consumers with information-based rights. It would therefore be useful to engage in inter-Ministerial consultations to leverage these frameworks for protecting consumers while avoiding regulatory overlaps.

Fourth, while the DP Guidelines expressly identify “Specific Dark Patterns” that will be considered violations, they also leave room for other practices to qualify as “dark patterns”. Businesses will have to re-examine their interface designs to ensure that their UI/UX complies with these guidelines.

Dark pattern regulation will inevitably overlap with consumer protection, data protection, and digital economy laws, so inter-regulatory coordination will be critical.

The writer is the Fintech Lead at the Vidhi Centre for Legal Policy. Views expressed are personal