Dark Patterns – Beyond Consumer Protection Law

by Srija Naskar

Introduction

On 30 November 2023, the Central Consumer Protection Authority (“CCPA”), set up under the Consumer Protection Act, 2019 (“Consumer Protection Act”), notified the Guidelines for Prevention and Regulation of Dark Patterns, 2023 (“Guidelines”). The Guidelines seek to prevent all platforms systematically offering goods or services in India, as well as advertisers and sellers, from engaging in any “dark pattern practices.”

Dark patterns can be understood as user interface designs that benefit an online service provider by tricking, coercing, manipulating, or deceiving users into making unintended and potentially harmful decisions.

Online service providers have become increasingly sophisticated in deceiving users through a bundle of privacy dark strategies. These include excessive data collection and storage, denying data subjects control over their data, making it hard or even impossible for data subjects to learn how their personal data is collected, stored, and processed, and manipulating consent, thereby causing grave privacy harm to users.

Dark patterns and their consequent harms can take various other forms. For example, consumers can sign up for a service such as Amazon Prime with a single click. However, to cancel or unsubscribe from it, consumers face many confusing steps. Sometimes, they are redirected to multiple pages that attempt to persuade them to continue their subscriptions by presenting several offers of discounted pricing. Only after clicking through such pages are consumers finally able to cancel the service. In essence, the consumer here has run into a kind of dark pattern known as misdirection. This dark pattern can lead to economic harm if the consumer ends up paying for a subscription they originally intended to cancel.

For the purposes of this blog, I will focus on dark patterns which primarily impact the privacy of an individual.

This blog will discuss the impact of privacy dark strategies and the design tricks used by platforms to manipulate users, arguing that dark patterns need to be regulated more holistically. Consumer protection law may offer a promising way forward for targeting the general dark patterns that find their way into everyday online transactions. However, I argue that it must work in tandem with legislation such as the Digital Personal Data Protection Act, 2023 (“DPDP Act”) and the upcoming Digital India Act, 2023 (“DIA”) to target privacy dark patterns effectively. Synergies between different areas of law, such as consumer protection and data protection, can help ensure adequate protection from the various kinds of harm posed by dark patterns. The blog concludes with recommendations on the aspects that should feature within these laws.

Privacy Dark Patterns

  1. Maximisation of Data 

The starting page of the Tripadvisor mobile app, a review platform for travel-related content, asks the user to log in with a personal Google+, Facebook, or email account. A fourth option offers the creation of a Tripadvisor account. Interestingly, there is a “Skip” button as well, which skips the login process entirely but is hidden in the upper right corner of the page. When signing in with Facebook, for example, Tripadvisor seeks access to the user’s friend list, photos, likes, and other information. In essence, the consumer here has run into a privacy dark strategy commonly adopted by online service providers that focuses on maximising data collection and storage, where consumers are coerced into disclosing personal information that is not needed for the functionality of the service.

  2. Coercive Consent

Online service providers increasingly attempt to hide privacy dark patterns within the terms and conditions of using the service. The terms and conditions are notoriously long and written in complicated legal jargon, ensuring that they are not user friendly. The user’s inability to grasp the legal jargon puts them in a vulnerable position, since the policy is legally binding. Research shows that, as a result of this complexity, individuals consent to such terms and conditions without reading the privacy policies, making it difficult for users to learn what happens to their personal data. For example, the British firm GameStation revealed that it legally owned the souls of thousands of customers, owing to an “immortal soul clause” that had been secretly added to the online terms and conditions of the official GameStation website as an April Fool’s gag. The clause was added to highlight how few customers read terms and conditions before consenting online. The gag reveals the effectiveness of this dark pattern and shows that companies can hide almost anything in their online terms and conditions.

  3. Cancellation Trickery

Several service providers have unnecessarily complicated the process of deleting accounts, either by not providing any account deletion option at all or by making the user interface deliberately inconvenient. If users are required to call customer support, for instance, the process becomes cumbersome, raising the barrier to deleting the account. Such deliberately inconvenient user experiences push users towards reconsidering the deletion decision itself.

Privacy dark patterns work well primarily because of (1) the advantage they take of the psychological tendencies of human beings; and (2) the design tricks adopted by online platforms. Studies have shown that when humans have little motivation or opportunity to think and reason, because they lack the required knowledge, ability, or time, they fail to read terms and conditions carefully. Consequently, users agree to them quickly without weighing the pros and cons. This is supplemented by design tricks that rely on minimum transparency and maximum complexity. Studies on the power of design have long recognised that the design of built environments constrains human behaviour, and the same is true online. In simple terms, users can only click on the buttons or select the options presented to them, and can only opt out of the options from which a website allows them to opt out. Essentially, these hidden design choices give people the illusion of free choice.
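To make the point concrete, the sketch below shows how a consent dialog can be engineered so that accepting everything takes one effortless click while refusing is buried behind a multi-step flow. This is a minimal, hypothetical illustration in TypeScript; the element styles, labels, and helper functions are my own assumptions and are not taken from any real platform.

```typescript
// Hypothetical sketch: a consent dialog whose design constrains choice.
// Accepting is one prominent click; refusing requires navigating a
// multi-screen settings flow. All names and styles are illustrative.

function buildConsentDialog(): HTMLElement {
  const dialog = document.createElement("div");

  const message = document.createElement("p");
  message.textContent = "We value your privacy.";

  // The effortless path: large, brightly coloured, one click.
  const acceptAll = document.createElement("button");
  acceptAll.textContent = "Accept all";
  acceptAll.style.cssText =
    "font-size:18px;padding:12px 32px;background:#2e7d32;color:#fff;";
  acceptAll.onclick = () =>
    grantConsent(["analytics", "advertising", "profiling"]);

  // The only route to refusal: a small grey link that does not refuse
  // anything directly, but opens several further screens instead.
  const manageOptions = document.createElement("a");
  manageOptions.textContent = "Manage options";
  manageOptions.style.cssText = "font-size:11px;color:#999;cursor:pointer;";
  manageOptions.onclick = () => openMultiStepSettingsFlow();

  dialog.append(message, acceptAll, manageOptions);
  return dialog;
}

// Stand-ins for the platform's consent plumbing (assumed, not real APIs).
function grantConsent(purposes: string[]): void {
  console.log("Consent granted for:", purposes.join(", "));
}

function openMultiStepSettingsFlow(): void {
  console.log("User routed into a multi-screen settings flow...");
}
```

The user is never shown a symmetrical “Reject all” button, so the apparently free choice is, in practice, steered towards full disclosure.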

Dark Patterns Should Be Regulated Beyond a Consumer Protection Perspective

While the enumeration of specified dark patterns in the Guidelines provides helpful guidance, the scope of the Guidelines remains limited to deceptive and unfair practices and excludes manipulation. The Guidelines are also straitjacketed, lacking a graded approach towards the varied effects that dark patterns can have on an individual. Privacy dark patterns will therefore be better regulated in tandem with the DPDP Act, which already contains provisions on data protection, data retention, consent, and erasure of data. The experience of the United States (“USA”) is instructive: although the Federal Trade Commission (“FTC”) has heavily regulated dark patterns there, the California Privacy Rights Act has now come into place as the first legislation in the USA to explicitly regulate dark patterns. The Act aims to take a leadership role in regulating dark patterns generally and privacy dark patterns specifically.

Recommendations

1. The DPDP Act requires data fiduciaries to give notice to data principals at the time of requesting consent. The notice must inform the data principal about the personal data to be collected and the purpose for which it will be processed. While this mechanism might ensure that user consent is informed, free, and capable of being withdrawn, it does not specifically tackle dark patterns. The Act still leaves scope for data fiduciaries to build numerous design tricks into the notice mechanism: platforms could meet the notice requirement by simply informing the user, no matter how convoluted and obscure the design may be. As a result, an individual may give consent without completely understanding the policies. There should therefore be a separate set of rules under the DPDP Act specifically dedicated to tackling emergent dark patterns. Section 40 of the DPDP Act is a residuary clause that gives the government the power to make rules consistent with the provisions of the Act; it is a notable provision the government can use to frame rules on dark patterns.

2. The primary objective of the rules should be to encourage online platforms to establish ethical and responsible design practices. The rules could act as indicative guidance to platforms on how they should design their user interfaces. This would include giving complete and correct information, such as disclosing in-app purchases for a product or service (e.g., a consumer downloads a mobile application for playing Candy Crush that was advertised as a free game, yet after seven days the app demands payment to continue playing; the fact that the free version of the game is available only for a limited time was not disclosed at the time of download). The guidance could also insist on clear menus, fonts, icons, and click sequences for easier understanding of the product or service, and on default settings that are favourable to consumers (e.g., a consumer orders an airline ticket and, in the booking process, finds a box saying “Yes, I would like to add travel insurance” pre-ticked by default). Such default selection without user involvement should not be allowed; the consumer must consciously agree to an extra product such as travel insurance (a sketch contrasting the two defaults follows these recommendations). Such rules could also blacklist certain practices and impose dissuasive sanctions.

3. The upcoming rules under the DPDP Act must keep in mind the increasing impact of dark patterns on privacy. Consent alone is not enough to protect data; greater accountability is needed. This could be achieved through the fiduciary approach, which holds information fiduciaries to reasonable and ethical standards of behaviour based on the expectations of users. It would require technology companies to take reasonable steps to secure our data (duties of care); to collect only as much data as is necessary to achieve a particular purpose and to limit the use of collected data to the specific purposes to which users consent (duties of confidentiality); and to refrain from profiting by harming users (duties of loyalty). Thus, against the background of how disclosure can be manipulated by cognitive biases and coercive design, this approach, based on the connection between trust and sharing, would hold online platforms to a higher standard of loyalty, confidentiality, and care.
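To illustrate recommendation 2, the following sketch contrasts the pre-ticked travel insurance box with a consumer-favourable default. It is a minimal, hypothetical example; the form field names and labels are my own assumptions.

```typescript
// Hypothetical sketch of the pre-ticked default criticised in
// recommendation 2, contrasted with a consumer-favourable default.
// Field names and labels are illustrative assumptions.

function addInsuranceOption(form: HTMLFormElement, preTicked: boolean): void {
  const checkbox = document.createElement("input");
  checkbox.type = "checkbox";
  checkbox.name = "travelInsurance";

  // Dark pattern: preTicked = true. The box arrives already checked,
  // so the consumer buys insurance unless they notice and untick it.
  // Fair default: preTicked = false. The extra product is added only
  // on a conscious, affirmative click by the consumer.
  checkbox.checked = preTicked;

  const label = document.createElement("label");
  label.append(checkbox, " Yes, I would like to add travel insurance");
  form.append(label);
}
```

Rules of the kind proposed above could blacklist the pre-ticked variant outright while treating the unticked, opt-in variant as the required baseline.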

(The opinions expressed in the Blog are personal to the author/writer. The University does not subscribe to the views expressed in the article / blog and does not take any responsibility for the same.)