The Right to be Forgotten – Examining Approaches in Europe and India

This is a guest post authored by Aishwarya Giridhar.

How far does the right to control personal information about oneself extend online? Would it extend, for example, to having a person’s name erased from a court order that surfaces in online searches, or to removing pictures or videos shared online without consent, as in cases of revenge pornography or sexual violence? These are some questions that have come up in Indian courts, and some of the issues that jurisprudence on the ‘right to be forgotten’ seeks to address. This right is derived from the concepts of personal autonomy and informational self-determination, which are core aspects of the right to privacy. They were integral to the Indian Supreme Court’s conception of privacy in Puttaswamy vs. Union of India, which held that privacy is a fundamental right guaranteed by the Indian Constitution. However, privacy is not an absolute right and needs to be balanced against other rights such as freedom of expression and access to information, and the right to be forgotten tests the limits of the right to privacy.

On a general level, the right to be forgotten enables individuals to have personal information about themselves removed from publicly available sources under certain circumstances. This post examines the right to be forgotten under the General Data Protection Regulation (GDPR) in Europe, and the draft Personal Data Protection Bill, 2019 (PDP Bill) in India.

What is the right to be forgotten?

The right to be forgotten was brought into prominence in 2014 when the European Court of Justice (ECJ) held that users can require search engines to remove personal data from search results, where the linked websites contain information that is “inadequate, irrelevant or no longer relevant, or excessive.” The Court recognised that search engines can significantly affect a person’s right to privacy, since they allow any Internet user to obtain a wide range of information on a person’s life that would have been much harder, or even impossible, to find without the search engine.

The GDPR provides statutory recognition to the right to be forgotten in the form of a ‘right to erasure’ (Article 17). It provides data subjects the right to request controllers to erase personal data in some circumstances, such as when the data is no longer needed for their original processing purpose, or when the data subject has withdrawn her consent or objected to data processing. In this context, the data subject is the person to whom the relevant personal data relates, and the controller is the entity which determines how and why the data would be processed. Under this provision, the controller would be required to assess whether to keep or remove information when it receives a request from data subjects.

In comparison, clause 20 of India’s Personal Data Protection Bill (PDP Bill), which proposes a right to be forgotten, allows data principals (similar to data subjects) to require data fiduciaries (similar to data controllers) to restrict or prevent the disclosure of personal information. This is possible where such disclosure is no longer necessary, was made on the basis of consent which has since been withdrawn, or was made contrary to law. Unlike the GDPR, the PDP Bill requires data principals to approach Adjudicating Officers appointed under the legislation to request restricted disclosure of personal information. The rights provided under both the GDPR and the PDP Bill are not absolute, and are limited by the freedom of speech and information and other specified exceptions. In the PDP Bill, for example, some of the factors the Adjudicating Officer is required to account for are the sensitivity of the data, the scale of disclosure and the extent to which it is sought to be restricted, the role of the data principal in public life, and the relevance of the data to the public.

Although the PDP Bill, if passed, would be the first legislation to recognise this right in India, courts have provided remedies that allow for removing personal information in some circumstances. Petitioners have approached courts for removing information in cases ranging from matrimonial disputes to defamation and information affecting employment opportunities, and courts have sometimes granted the requested reliefs. Courts have also acknowledged the right to be forgotten in some cases, although there have been conflicting orders on whether a person can have personal information redacted from judicial decisions available on online repositories and other sources. In November last year, the Orissa High Court also highlighted the importance of the right to be forgotten for persons whose photos and videos have been uploaded online without their consent, especially in cases of sexual violence. These cases also highlight why it is essential that this right is provided by statute, so that the extent of protections offered under this right, as well as the relevant safeguards, can be clearly defined.

Intersections with access to information and free speech

The most significant criticisms of the right to be forgotten stem from its potential to restrict speech and access to information. Critics are concerned that this right will lead to widespread censorship, the whitewashing of personal histories of past crimes and of information on public figures, and a less free and open Internet. There are also concerns that global takedowns of information, if required by national laws, can severely restrict speech and serve as a tool of censorship. Operationalising this right can also lead to other issues in practice.

For instance, the right framed under the GDPR requires private entities to balance the right to privacy with the larger public interest and the right to information. Two cases decided by the ECJ in 2019 provided some clarity on the obligations of search engines in this context. In the first, the Court clarified that controllers are not under an obligation to apply the right globally and that removing search results for domains in the EU would suffice. However, it left the option open for countries to enact laws that would require global delisting. In the second case, among other issues, the Court identified some factors that controllers would need to account for in considering requests for delisting. These included the nature of the information, the public’s interest in having that information, and the role the data subject plays in public life, among others. Guidelines framed by the Article 29 Working Party, set up under the GDPR’s precursor, also provide limited, non-binding guidance for controllers in assessing which requests for delisting are valid.

Nevertheless, the balance between the right to be forgotten and competing considerations can still be difficult to assess on a case-by-case basis. This issue is compounded by concerns that data controllers would be incentivised to over-remove content to shield themselves from liability, especially where they have limited resources. While larger entities like Google may have the resources to invest in assessing claims under the right to be forgotten, this will not be possible for smaller platforms. There are also concerns that requiring private parties to make such assessments amounts to the ‘privatisation of regulation’, and the limited potential for transparency on erasures removes an important check against over-removal of information.

As a result of some of this criticism, the right to be forgotten is framed differently under the PDP Bill in India. Unlike the GDPR, the PDP Bill requires Adjudicating Officers, and not data fiduciaries, to assess whether the rights and interests of the data principal in restricting disclosure override others’ rights to information and free speech. Adjudicating Officers are required to have special knowledge of, or professional experience in, areas relating to law and policy, and the terms of their appointment would have to ensure their independence. While they seem better suited to make this assessment than data fiduciaries, much of how this right is implemented will depend on whether the Adjudicating Officers are able to function truly independently and are adequately qualified. Additionally, this system is likely to lead to long delays in assessment, especially if the quantum of requests is similar to that in the EU. It will also not address the issues with transparency highlighted above. Moreover, the PDP Bill is not finalised and may change significantly, since the Joint Parliamentary Committee that is reviewing it is reportedly considering substantial changes to its scope.

What is clear is that there are no easy answers when it comes to providing the right to be forgotten. It can provide a remedy in some situations where people do not currently have recourse, such as with revenge pornography or other non-consensual use of data. However, when improperly implemented, it can significantly hamper access to information. Drawing lessons from how this right is evolving in the EU can prove instructive for India. Although the assessment of whether or not to delist information will always be subjective to some extent, there are some steps that can be taken to provide clarity on how such determinations are made. Clearly outlining the scope of the right in the relevant legislation, and developing substantive standards for assessing removal requests that are aimed at protecting access to information, are measures that can help strike a better balance between privacy and competing considerations.

The General Data Protection Regulation and You

By Aditya Singh Chawla

A cursory look at your email inbox this past month presents an intriguing trend. Multiple online services seem to have taken it upon themselves to notify users of changes to their privacy policies at the same time. The reason, simply, is that the European Union’s General Data Protection Regulation (GDPR) comes into force on May 25, 2018.

The GDPR marks a substantial overhaul of the existing data protection regime in the EU, as it replaces the earlier ‘Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data.’ The Regulation was adopted by the European Parliament in 2016, with a period of almost two years to allow entities sufficient time to comply with their increased obligations.

The GDPR is an attempt to harmonize and strengthen data protection across Member States of the European Union. CCG has previously written about the Regulation and what it entails here. For one, the instrument is a ‘Regulation’, as opposed to a ‘Directive’. A Regulation is directly binding across all Member States in its entirety. A Directive simply sets out a goal that all EU countries must achieve, but allows them discretion as to how. Member States must enact national measures to transpose a Directive, and this can sometimes lead to a lack of uniformity across Member States.

The GDPR introduces, among other things, additional rights and protections for data subjects. This includes, for instance, the introduction of the right to data portability, and the codification of the controversial right to be forgotten. Our writing on these concepts can be found here, and here. Another noteworthy change is the substantial sanctions that can be imposed for violations. Entities that fall foul of the Regulation may have to pay fines up to 20 million Euros, or 4% of global annual turnover, whichever is higher.
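The “whichever is higher” rule for the upper fine tier is a simple calculation. The sketch below is purely illustrative (the function name and example turnover figures are hypothetical, not drawn from the Regulation), but it shows how the two caps interact:

```python
def gdpr_max_fine_eur(annual_turnover_eur: float) -> float:
    """Maximum fine under the GDPR's higher tier: the greater of
    EUR 20 million or 4% of global annual turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a firm with EUR 1 billion in turnover, 4% (EUR 40 million)
# exceeds the flat EUR 20 million figure, so the higher cap applies.
print(gdpr_max_fine_eur(1_000_000_000))  # 40000000.0

# For a smaller firm, 4% of turnover falls below EUR 20 million,
# so the flat figure is the operative maximum.
print(gdpr_max_fine_eur(50_000_000))     # 20000000.0
```

In other words, the flat figure acts as a floor on the maximum exposure: large multinationals cannot shelter behind it, while smaller entities still face a cap far above 4% of their turnover.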

The Regulation also has consequences for entities and users outside the EU. First, the Regulation has expansive territorial scope, and applies to non-EU entities if they offer goods and services to the EU, or monitor the behavior of individuals in the EU. The EU is also a significant digital market, which allows it to nudge other jurisdictions towards the standards it adopts. The Regulation (like the earlier Directive) restricts the transfer of personal data to entities outside the EU to cases where an adequate level of data protection can be ensured. This has resulted in many countries adopting regulations that comply with EU standards. In addition, with the implementation of the GDPR, companies that operate in multiple jurisdictions might prefer to maintain parity between their data protection policies. For instance, Microsoft has announced that it will extend core GDPR protections to its users worldwide. As a consequence, many of the protections offered by the GDPR may in effect become available to users in other jurisdictions as well.

The implementation of the GDPR is also of particular significance to India, which is currently in the process of formulating its own data protection framework. The Regulation represents a recent attempt by a jurisdiction (that typically places a high premium on privacy) to address the harms caused by practices surrounding personal data. The lead-up to its adoption and implementation has generated much discourse on data protection and privacy. This can offer useful lessons as we debate the scope and ambit of our own data protection regulation.

Aditya is an Analyst at the Centre for Communication Governance at National Law University Delhi

The Right to be Forgotten: Balancing Personal Privacy with the Public’s Right to Access Information

By Kasturika Kaumudi 

Evolution of the right and Global framework

In the Internet age, when access to information is quick and easy, procuring personal information or past records about an individual is no longer a herculean task. The relevance of such information, and the duration for which it should remain available for public access, have until recently received little debate.

There is growing global debate on a new right called the “right to be forgotten” or the “right to erasure”. This right allows people to request the removal of their personal information or data online after a period of time, or if such information is no longer relevant. The origin of this right can be traced back to the French jurisprudence on the ‘right to oblivion’ or droit à l’oubli. The rationale behind this right was to allow criminal offenders who have already served their sentence to object to the publication of information regarding their crime and conviction. This was done to ease their process of social integration.

It was along these lines that the 1995 EU Data Protection Directive acknowledged the right to be forgotten. The Directive stipulated that member states should guarantee people the right to obtain from the ‘controller’ the rectification, erasure or blocking of data relating to them, where the processing of that data does not comply with the provisions of the Directive. The term ‘controller’ here refers to a natural or legal person, public authority, agency or any other body which alone or jointly determines the purposes and means of processing personal data.

In May 2014, the Court of Justice of the European Union (‘Court’) recognized the right to be forgotten as a part of the fundamental right to privacy in the Google case. The plaintiff in this case had requested the delinking of search results appearing on Google and the deletion of newspaper articles appearing online with respect to bankruptcy proceedings against him. The Court held that individuals have a right to request search engines to delink information which causes prejudice to them. However, the Court was careful to state that this right is not absolute and can be applied only when the data becomes ‘inadequate, irrelevant, excessive, not kept up to date, or kept for longer than necessary’ with respect to the purpose for which it was collected or processed. Accordingly, the Court directed Google to delink the search results in the instant case. It was further held that the publication of accurate data may be lawful at a given point in time, but in due course, it might become inconsistent with the law.

While the judgment in the Google case is a step in the right direction, it leaves much to be desired. The Court did not set out any guidelines or parameters to filter out information as ‘inadequate’ or ‘irrelevant’ or ‘excessive’. It has thrust the onerous task of balancing the right to privacy of an individual and the public’s right to access information on private search engines like Google. This raises critical questions regarding the suitability of private entities taking decisions which are of constitutional import. Pursuant to this judgment, the EU adopted data protection reforms which include the right to be forgotten as an essential right under Article 17 of the General Data Protection Regulation. Article 17 lays down the conditions for the application of the right to be forgotten, and requires entities processing personal data to inform third parties regarding requests for erasure of links to any personal data. A detailed discussion of these regulations and their impact on India can be found here.

Challenges in enforcement

There are many legal and technical challenges in the enforcement of the right to be forgotten. The success rate of governments across the world in banning or removing pornographic websites or torrent sites from the Internet has not been great, since there are various ways of circumventing such bans. Further, the blocking or delinking of URLs by search engines does not guarantee that such information has been blocked or deleted from the Internet. There is also no way to ensure that such information is not uploaded again.

To enforce the ruling of the case discussed above, Google has created a mechanism through which an individual can request the takedown or delinking of a specific search result bearing their name. Google evaluates such requests on various parameters, such as whether the results infringe the individual’s right to privacy or whether the information is of public interest. In the former case, the individual’s right to be forgotten trumps the public’s right to access information; however, if the information is of public interest, the public’s right to information prevails over privacy rights. This squarely makes Google the decision maker on whether data is relevant, adequate, and needed online for public access.

With the growing recognition of the right to be forgotten, the number of requests that search engines receive for taking down or delinking content is only likely to increase, making it extremely difficult and cumbersome to scrutinize such requests manually. According to Google’s Transparency Report, as on 9th October 2016, Google had received 565,412 requests for the removal of URLs. The Report further states that it had evaluated 1,717,714 URLs since May 2014, and that it had removed 43.2% of the URLs it evaluated. With a substantial increase in the number of requests, search engines may even consider using algorithms to deal with such requests instead of manually evaluating privacy rights vis-à-vis the public interest.

Further, search engines are also likely to err on the side of caution and accept such requests rather than face expensive legal challenges across jurisdictions for non-compliance. The right may also be misused by individuals, leading to artificial alteration of the content available online and the delinking of pertinent information.

Recent developments in India

The data protection regime and data privacy laws of India are not comprehensive and dynamic enough to respond to technological advances in the modes of collection, transfer and use of personal information. The Information Technology Act, 2000 and the rules framed under the Act make up the primary legal framework that governs this subject. The Delhi High Court is currently hearing a matter (Laksh Vir Singh Yadav vs. Union of India, WP(C) 1021/2016) where the petitioner has requested the removal of a judgment involving his mother and wife from an online case database. The petitioner claims that the appearance of his name in the judgment is causing prejudice to him and affecting his employment opportunities. It will be interesting to see the outcome of this case and how the larger debate of an individual’s right to privacy versus the public’s right to access information unfolds in it.

It is pertinent to note that the Delhi High Court is dealing with the request for removal of a court order which is a public document. This request is unusual and distinct from a request for delinking of search results appearing in search engines like Google since such delinking does not result in the removal of the information itself. Allowing the removal of such judgments from online case databases could result in the expunging of public records. Furthermore, the removal of judgments from online public databases will obstruct public access to case materials shedding light on critical questions of law.

While implementing the right to be forgotten, a very fine balance has to be struck between the right to freedom of speech and expression, public interest and personal privacy. To balance these conflicting rights, the judiciary may consider implementing a system where personal information such as the names and addresses of litigants is redacted from reportable judgments/orders, especially in personal disputes. Courts have, in the past, refrained from divulging the identities of parties in order to respect their privacy in many rape and medico-legal cases.

With many unanswered questions surrounding this right, India requires a comprehensive data protection regime to regulate the entities collecting and processing personal data and to define the terms of use, storage and deletion of such personal data. This will ensure that such entities are obliged to take due care of the personal data in their possession and will also provide a framework for dealing with requests for removal or erasure of such personal data.

Kasturika Kaumudi is a Programme Officer with the Centre for Communication Governance at National Law University Delhi

Digital Memory & Informational Privacy: Reflecting on the EU’s ‘Right To Be Forgotten’- Working Paper by Ujwala Uppaluri

As part of a complete overhaul of European Union regulations concerning personal information stored electronically, a proposal for a ‘General Data Protection Regulation’ was passed by the European Parliament. The Regulation is intended to be read with the existing law as to data protection in the European Union, specifically the Data Protection Directive and the E-Privacy Directive. Inter alia, this legislative attempt made reference at its Article 17 to a data subject’s right to be forgotten. The proposal sparked a staggering amount of debate around the consequences of the grant of such a right, with particular resistance arising out of the potential burden that such a right could impose on intermediaries online.

Since that proposal was made, a ‘right to be forgotten’ has been articulated by the Court of Justice of the European Union (CJEU) in Google Spain v. Costeja. The Court used existing data protection law, including portions of the Data Protection Directive of 1995, to read in a right to be forgotten for data subjects, and a corresponding takedown obligation for intermediaries, and search engines in particular. As with Article 17, Costeja has been the subject of a great deal of criticism.

This paper will cursorily consider the history and nature of machine memory, make the case for digital forgetting, describe the legal and conceptual sources of the right to be forgotten, and evaluate Article 17 and the CJEU’s iteration of the right, with the intention of contributing to this debate. Particular emphasis will be placed, in the process, on informational privacy, on the fundamentals of data protection, and on the many concerns that the present iteration of the right raises, not only for Europe but for data protection law generally.

The Complete Paper can be found here: https://drive.google.com/file/d/0BwY1OLu_H1ICRTRaWEtTOVFFVlU/view?usp=sharing

(Ujwala Uppaluri was a Fellow at CCG from June 2014 to April 2015 and will be joining Harvard Law School to pursue her LL.M. from August 2015.)