Aarogya Setu app lacks clear legal backing and limits, tends towards surveillance

This post first appeared in Deccan Herald on May 9, 2020

In April, the Indian government’s National Informatics Centre (NIC) launched the ‘Aarogya Setu’ mobile application as a part of the government’s efforts to contain the spread of COVID-19. On May 12, the Kerala High Court will hear arguments in a petition challenging the central government’s notification making the use of Aarogya Setu mandatory for every person in a containment zone, and all public and private sector employees.

Aarogya Setu is supposed to be a tool that enables better ‘contact tracing’ – one of the key elements in fighting infectious diseases. Apps and other technological solutions with similar purposes have sprung up in many countries over the past two months, as we all come to terms with the impact of the COVID-19 pandemic.

These apps have been watched warily by academics, researchers and advocates who work on healthcare, technology and privacy. However, with India being the only democratic country to make a contact tracing app mandatory for a large section of its population, it is more important than ever to examine whether the use of this app is actually in India’s interests.

There are two key pieces of legislation under which much of India’s response to the COVID-19 pandemic has been housed – the Disaster Management Act, 2005, and the Epidemic Diseases Act, 1897. The root of the problem lies in the fact that neither anticipates, in any level of detail, the circumstances we are in today. Frameworks for contact tracing, or for the use of Aarogya Setu, where they exist, are based largely on executive orders. They also do not always adhere to the rights-based standards set out by the Constitution.

The process of contact tracing requires that if an individual is found to be infected with the disease, their contacts are identified and traced, in order to follow the spread of the disease, and contain it. The collection of personal information of the individual who is infected, as well as of the people they have been in contact with, are essential elements – leading to an intrusion of the privacy of all these individuals.

While the Constitution provides for a fundamental right to privacy, as most notably articulated by the Supreme Court in the famous Puttaswamy judgement, we do not have a data protection law yet. It is the duty of the State to ensure that privacy-intrusive actions meet the constitutional tests that define the circumstances in which, and the extent to which, an intrusion of individual privacy is acceptable. A public health emergency seems an obvious justification. However, the requirements under law are more specific.

The first test is legality – whether there is a law that enables such intrusive action. Such a law should ideally be passed by Parliament, be clear and not arbitrary in nature. This allows individuals to know what to expect in terms of any violations of rights such as privacy. 

Other tests include those of necessity and proportionality. The State must show that the intrusion of privacy is necessary to meet a legitimate aim (public health in this case), that the kind of intrusion that is taking place is the least privacy restrictive measure available, and that the extent of intrusion is proportional to the problem at hand. State actions should also be subject to judicial, or at the very least institutional, oversight mechanisms that are transparent and accessible.

In the context of COVID-19, India does have a broader ‘Integrated Disease Surveillance Programme’ meant to guide contact tracing and other measures to contain the spread of infectious diseases. This provides for a largely individualised and targeted contact tracing process that requires trained workers to collect and analyse information. The framework has been set up by the Ministry of Health and Family Welfare (MoHFW) over several years. However, it does not appear to account for privacy and data protection concerns.

Aarogya Setu supposedly amplifies this tracing process. It requires every user of the app to provide information about their location and proximity history (collected via GPS and Bluetooth), along with personal information that can identify them. This information is maintained either on your phone or on servers managed by the government. If it is found that you may have been in close proximity with a person infected with the virus, then the personal information you provide may be used to contact you, and for further steps to be taken.

While this does seem like an efficient way to speed up the contact tracing process, it is not conventional contact tracing in itself. It also demands a much higher threshold in terms of privacy protections, by virtue of being a mass information collection exercise as opposed to a targeted exercise.
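
To make the above more concrete, the following is a minimal, illustrative sketch of how a centralised Bluetooth-proximity match of this kind could work in principle. All names, fields and thresholds here are hypothetical – this is not Aarogya Setu’s actual code or data schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ProximityEvent:
    """One Bluetooth 'handshake' between two app users (illustrative fields only)."""
    own_id: str           # pseudonymous device ID assigned at registration
    other_id: str         # pseudonymous ID of the nearby device
    timestamp: datetime
    signal_strength: int  # rough proxy for physical distance (dBm)

def at_risk_users(events, infected_ids, window_days=14, min_signal=-70):
    """Return IDs of users recently in close proximity to an infected user.

    In a centralised design like the one described above, this matching runs on
    a government server that also holds the mapping from pseudonymous IDs to
    identifying details -- which is what allows flagged users to be contacted.
    """
    cutoff = datetime.now() - timedelta(days=window_days)
    flagged = set()
    for event in events:
        recent_enough = event.timestamp >= cutoff
        close_enough = event.signal_strength >= min_signal  # stronger signal ~ closer
        if recent_enough and close_enough and event.other_id in infected_ids:
            flagged.add(event.own_id)
    return flagged
```

The privacy concern flagged above is precisely that the server doing this matching also holds the identifying details of every user, not just of those who test positive.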

The Aarogya Setu app is governed only by its own privacy policy, and the orders mandating its use. There is no legal framework that regulates the use of the app and the data collected. The privacy policy states that personal data collected using the app is limited, is used only for the purpose of contact tracing and COVID-19 response efforts, and will be deleted in a timely manner.

However, the app and its privacy policy leave many questions unanswered – the first being, which government agencies manage the app, and have access to the data to begin with. The MoHFW is central to the COVID-19 response, and ostensibly should have access. However, the app is run by the NIC, and the orders mandating the app are issued by the Ministry of Home Affairs.

There is no clear process for deletion of an account on the app, or timeline for winding up the use of the app. Transparency and accountability measures are limited to official statements about the app’s security and privacy policies, and there are no clear means for oversight of any kind.

At a time when many other countries are moving towards limiting the data collected and processed by contact tracing apps, Aarogya Setu’s reach is expanding beyond contact tracing support. The app is being used to self-certify health status, issue health advisories, and provide telemedicine services. Officials have suggested that it could be used to issue curfew passes, and become a one-stop shop for all COVID-19 related government services. Some have gone so far as to suggest that the app can act as a base for a national health stack – a centralised registry of electronic health records proposed by the Niti Aayog in 2018.

While each of these services may independently be useful, the mixing of objectives muddles the purposes for which personal information on the app is collected, the limitations on storage of such data, as well as the parties that need access to such information. Without clear legal frameworks, such extensive and mandatory use of Aarogya Setu may set us down the path of weakened privacy protections for health and location data for years to come. This kind of function creep also signals potential for pervasive extra-legal surveillance.

Surveillance Without Safeguards In The Pandemic

South Korea, Canada and the EU changed laws or clarified how existing ones apply in order to address privacy concerns while using health data to combat Covid-19. In India, overbearing surveillance without safeguards is becoming the new normal, ignoring privacy incursions and the pitfalls of drones, intrusive apps and cellphone data

By Smitha Krishna Prasad & Kritika Bhardwaj

This post first appeared on Article 14 on April 30, 2020

Restaurant aggregator Zomato has made it mandatory for delivery partners to install the Aarogya Setu app. (Twitter photo)

With 30,000 known cases of Covid-19 and deaths approaching 1,000 despite a nationwide lockdown of more than a month, the government needs correct and adequate data to decide the right course of action on testing and quarantining, and on how and when restrictions can be lifted.

Evidence-based decision making also helps build public trust, which ensures better compliance with the government’s measures. But to achieve these ends, the Centre and states have resorted to emergency-like powers to collect and share personal information, drones and cell-phone location data to enforce quarantines and published lists with granular details of those under quarantine. While the need for extensive information is undoubtedly important, the means adopted merit closer scrutiny.

The Global Use Of Data To Combat Public-health Crises

Globally, lawmakers recognise that states and healthcare providers may find it difficult to comply with the rigours of data protection, or that the need for data may outweigh privacy concerns in times of public health emergencies.

Data protection and privacy laws, therefore, almost always provide for exceptions to the use of personal information in the interest of public health. However, these are not blanket exemptions; increased surveillance on the ground of securing public health must be necessary to prevent or contain the spread of the disease, and it must be the least restrictive means available to the state.

A necessary corollary to these principles is that such enhanced surveillance must be rolled back when the crisis subsides, with rigorous controls over how the data collected during this time is to be archived or used.

Several governments have ensured that the use of data and broader disease-surveillance activities are compliant with these principles. Examples of such measures can be found in pre-existing plans that some jurisdictions have had in place in anticipation of epidemics. In other cases, such examples are seen in the way privacy laws have been adapted in the face of the current situation.

In South Korea, for instance, the law guiding prevention and control of disease outbreaks was amended after the outbreak of MERS (Middle East Respiratory Syndrome) in 2015, to enable government authorities to collect data from a number of sources. The law provides that each person whose data is collected must be notified of the collection and processing of that data. The data must also be destroyed after it has been processed.

Similarly, in Canada, the government explained in March 2020 how data could be collected and processed legally, using exceptions already available. In the European Union, EU and national authorities have done the same, applying public-health exemptions under EU data-protection law.

These laws and statements emphasise the right of citizens to know, and the need for government authorities to be transparent in their use of data.

India’s Response: Surveillance Without Safeguards

In India, none of the extreme measures that central and state authorities have used appear to consider personal privacy and the need for safeguards. 

For example, reports indicate that immigration data of passengers returning to India was shared with local police authorities and health workers. To enforce quarantines, this data was then publicly shared or shared with private companies to create applications for surveillance. 

Both instances are problematic. Given the stigma associated with the disease, publicly sharing such personal information may result in discrimination or harassment of those under quarantine. Similarly, sharing this information with private companies in the absence of a legal basis, or clear terms of service and privacy policies, creates the risk that companies will hold on to this data and use it for unrelated purposes later, without the person’s knowledge or consent.

The use of drones to enforce quarantine also appears to be without any legal basis. Individuals are at risk of being surveilled remotely, without their knowledge or consent, and with little information about who is operating these drones and how they will use the footage collected. Where the police deploy drones, there is a danger of such data being retained in perpetuity for purposes entirely unrelated to disease surveillance.

Similarly, Aarogya Setu, the application being promoted by the central government to carry out contact tracing, relies on cell-phone location data to trace an infected person’s contacts. 

Cell-phone location data is inherently revealing: it doesn’t just reveal a person’s movement but also helps determine their employment details, financial status, personal preferences, and even information about other members of a person’s family. The app’s policy states that information will be stored on a centralised government server, and is extremely vague about how long this data can be retained, raising concerns about its usage once the pandemic is over. 

Nearly three weeks into a nationwide lockdown, the Centre—for the first time—issued an advisory, warning people against sharing personal details of those under quarantine. It remains unclear whether this advisory applies to state agencies too and how it will be enforced.

In several states, governments have framed regulations under the Epidemic Diseases Act, 1897—a colonial-era law that gives state governments wide discretionary powers to prevent the outbreak or spread of an epidemic. While these regulations confer wide powers on government officers, none of those analysed by the authors contained any restrictions on the use of data or any other procedural safeguards oriented towards privacy.

Even once the epidemic blows over and strict enforcement is no longer required, governments and researchers may continue to legitimately require data to understand the disease better, to address infrastructural gaps or for development of vaccines. 

On 27 March, the Ministry of Health and Family Welfare announced a four-year-long ‘India COVID-19 Emergency Response and Health Systems Preparedness Project’ to study its response to the COVID-19 threat and strengthen its national systems for better preparedness. The project will necessarily depend on data from testing facilities and hospitals; this will inevitably include personal data of those tested, quarantined, recovered and hospitalised. However, the preliminary plan released by the government makes no mention of the practices to be followed for the collection and use of data, how long it can be stored, and whether safeguards, such as anonymisation or pseudonymisation, will be applied to the datasets before use.

Reality Vs Privacy: What Could, Should Be

The right to privacy in the context of healthcare services has long been recognised in India.

Early in 2020, the Personal Data Protection Bill, 2019, was in the news, in the midst of a broader push towards a legal framework around the right to privacy in India. This proposed law deals with the collection, storage and processing of personal data. This Bill also recognises the need for appropriate exceptions to the data collection process in the case of public health or medical emergencies. 

Similarly, discussions around the use of digital technology and data to improve healthcare systems, and particularly disease surveillance and control are not new in India. The National Centre for Disease Control, and its Integrated Disease Surveillance Programme have been looking into this since at least 2007. An Integrated Health Information Platform was established in 2018, after a review of existing systems in 2015. 

However, much of the discussion around the use of technology for surveillance to respond to the Covid-19 crisis does not seem to account for the disease-surveillance plans that have been in place prior to the crisis. 

On the one hand, there is talk of the nationwide deployment of tracing technology, without information about the privacy impact on large sections of the population or indeed their ability to access this technology. On the other, the limited procedural requirements we do have in place to facilitate public-private partnerships appear to have been abandoned in the rush to find solutions. Across the board, the call to develop and use new technology and data to respond to this crisis appears to be divorced from the ongoing efforts discussed above.

A preliminary look at the documentation around existing disease surveillance programs suggests that privacy and data-protection concerns have not been prioritised in these discussions. In addition, it appears that disease-surveillance measures are no longer limited to those provided for under existing legal and policy frameworks.

Personal information is being indiscriminately collected, shared and used by multiple agencies, including state police and private players. Collectively, these entities not only have access to people’s sensitive health information, but also to information about their location, whether they broke quarantine, and in some cases, a database of their photographs.

There is little clarity on what happens to this data once the pandemic ends. The absence of an overarching data-protection legislation does complicate things but does not grant states a free license to adopt any and all measures in the garb of public health. 

It does not mean that extraordinary surveillance measures taken to combat the epidemic can be normalised either. As we move forward in dealing with this crisis and preparing for the future, we must remember that human rights, including the right to privacy, go hand in hand with public interest.   

Kritika Bhardwaj is a lawyer and a Fellow at the Centre for Communication Governance at National Law University Delhi. Smitha Krishna Prasad is Associate Director at the Centre for Communication Governance at National Law University Delhi

Personal Data Protection Bill, 2019: Protecting Children’s Data Online

This post originally appeared on Medianama on January 16, 2020

Decades after the first data protection laws were implemented, we continue to struggle with some of the basics of personal data protection — how should personal data be defined? Corollary to this is the question of whether some personal data matters more than others.

Many privacy scholars now agree that the relevance and sensitivity of personal data should be determined by context. The idea is to protect individuals or groups of individuals in contexts where certain personal information makes them more vulnerable. Implementing this kind of protection under the law, however, has not been easy, given how often ‘context’, and therefore sensitivity, changes. The compromise that many lawmakers have found seems to be the categorisation of personal data and sensitive personal data – all data is to be protected, but certain types of data are sensitive irrespective of context and require increased protection. The new Indian Personal Data Protection Bill, 2019 (PDP Bill) also uses this approach, and lists out a number of categories of data as ‘sensitive personal data’.

While it is debatable whether sensitive personal data should be listed out or whether a more dynamic approach is better, one of the few areas where there is consensus that additional protections are required irrespective of context is the personal information of children.

In this article, we look at what the PDP Bill says about the collection and processing of children’s personal data, what this actually means in practice, and how it stacks up against global best practices.

Chapter IV of the PDP Bill deals with the personal data and sensitive personal data of children. It provides for a broad requirement that data fiduciaries must process personal data of children in a manner that protects the child’s rights and is in the best interests of the child [Section 16(1)]. The rest of the chapter can be divided into two parts – the first deals with when and how consent should be provided for the processing of such data, and who can provide such consent; the second focuses on specific types of processing that will not be permitted in relation to children’s personal data.

The PDP Bill considers any person below the age of 18 a child, in line with Indian laws on the age of majority, which require a person to be 18 years of age to enter into contracts, vote, etc. In this context, it is not surprising that one of the areas of focus in the PDP Bill’s discussion on children and children’s rights is the issue of consent – which, in effect, operates as a contract between the data fiduciary and the person providing the consent.

What happens when children want to access online services?

First, the data fiduciary must verify the age of the child, and ensure that consent is obtained from the parent or guardian. The PDP Bill itself does not go into detail on how this is to be done — except to say that there will be regulations on how age verification is to be undertaken. In prescribing these age verification mechanisms, a number of criteria will be taken into consideration, including: the volume of personal data processed; how much of that personal data is likely to be that of a child; and whether there is a possibility of harm to the child from the processing of such personal data. Regulations will also classify data fiduciaries that operate commercial websites or online services directed at children, or that process large volumes of personal data of children, as ‘guardian data fiduciaries’.

How will the PDP Bill affect apps like PUBG?

At this stage, while we will have to wait to see what the regulations look like, it is likely that services popular with younger age groups will at least need to implement age verification mechanisms, if not act as guardian data fiduciaries. A good example here would be gaming apps — these are sometimes targeted specifically at children, but many of the more popular apps are widely used by children and adults alike. A risk of harm to children using these services may also exist, leading to the classification of such service providers as guardian data fiduciaries.

While in the past we have seen some extreme reactions to such services, such as the recent ban on PUBG, under the new PDP Bill a gaming app that is popular globally, and specifically among children, would instead need to tweak some of its services. First, age verification and consent mechanisms will need to be modified. The age of consent specifically for online services is higher in India than in many other jurisdictions – the US and the UK, for instance, allow children between the ages of 13 and 18 to provide consent for some services, often depending on the competence of the child to consent. However, India does not seem to be adopting this system of differentiated ages of consent, except in the case of data fiduciaries providing counselling or child protection services to a child, in which case no consent is required from the parent or guardian.

The PDP Bill calls for a consent manager system to be implemented, in order to battle some of the common problems we see with the informed consent system. It may be worthwhile to look at whether this can be used as a specific solution in the case of children’s consent as well. However, given that there is little information on what the role of consent managers will be, or how age verification systems will be implemented, we may need to wait until the data protection authority starts discussing regulations to engage more on this subject.

What can be effective age-gating mechanisms?

The kind of age verification system that needs to be adopted will be discussed in more detail in the regulations. Implementing effective age verification is an ongoing problem, given that simple age verification systems are often easy to circumvent. The same can be said of the systems in place to obtain the consent of the parent or guardian once it is established that the personal data of a child is being collected.

A popular solution for this purpose is to implement knowledge-based tests. For instance, an age verification system that relies on arithmetic tests could, in theory, verify the age of the person consenting, if there is an expectation that children will not be able to make such calculations – though this may not hold for older children. However, more extensive knowledge-based tests often end up collecting more personal data than required – it is important to keep in mind that data protection principles need to be applied to the testing system itself.
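
To illustrate the trade-off just described, here is a toy sketch of two approaches to age-gating; the questions, thresholds and field names are invented for illustration, not drawn from any regulation.

```python
import random
from datetime import date

def make_arithmetic_challenge():
    """A knowledge-based age check: a multiplication question.

    It collects no personal data, but older children can often pass it and
    some adults may fail it, so it is at best a weak signal of age.
    """
    a, b = random.randint(11, 19), random.randint(11, 19)
    return f"What is {a} x {b}?", a * b  # (question shown to user, expected answer)

def passes_document_check(year_of_birth: int, cutoff_age: int = 18) -> bool:
    """A more reliable check, but one that itself collects personal data
    (a date of birth, perhaps an official identifier), so data protection
    principles such as minimisation must apply to the verification system too."""
    return (date.today().year - year_of_birth) >= cutoff_age
```

The point of the contrast is that the stronger the verification, the more personal data the verification mechanism itself tends to collect.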

What will happen if a guardian data fiduciary defaults on its responsibilities?

The second requirement under Chapter IV of the PDP Bill is applicable specifically to ‘guardian data fiduciaries’ – these data fiduciaries are not permitted to engage in profiling, tracking or behavioural monitoring of children, or to direct targeted advertising at children. They are also barred from undertaking any other activities that may cause significant harm to a child. This provision is likely meant to address situations like the recent case in the US, where Google/YouTube was fined USD 170 million for knowingly profiting from the use of personal data of children by directing targeted advertising at them. Similar to the consent requirements, an exception may be made (in the regulations) for data fiduciaries that offer counselling or child protection services.

Under the PDP Bill, any violation of the provisions of Chapter IV could result in a penalty up to INR 15 crore or 4% of the worldwide turnover of the data fiduciary for the preceding financial year (whichever is higher).
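
As a quick worked illustration of how that penalty cap operates (the figures below are entirely hypothetical):

```python
def chapter_iv_penalty_cap(worldwide_turnover_inr: float) -> float:
    """Higher of INR 15 crore or 4% of the preceding year's worldwide turnover."""
    fifteen_crore = 15 * 10_000_000  # 1 crore = 10 million rupees
    return max(fifteen_crore, 0.04 * worldwide_turnover_inr)

# A hypothetical fiduciary with INR 10,000 crore in global turnover faces a cap
# of INR 400 crore (4%), well above the INR 15 crore floor.
print(chapter_iv_penalty_cap(10_000 * 10_000_000))  # 4000000000.0
```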

Is Chapter IV of the PDP Bill enough?

As discussed above, it is important that children have additional protections against the processing of their personal data, especially where such processing goes against their interest. To this end, the two broad purposes of Chapter IV of the PDP Bill — (i) ensuring that age verification mechanisms are in place, and (ii) barring the profiling and tracking of children, the monitoring of children’s behaviour, and targeting of advertisement to children — are commendable.

However, a few issues jump out immediately on the reading of these provisions. Some may be fixed by restructuring the provisions of the PDP Bill, but others possibly require a more thorough understanding of children’s rights.

  1. The first issue is the idea of guardian data fiduciaries — the purpose of this classification appears to be to identify those data fiduciaries that are more likely to be processing children’s personal data in ways that could result in harm to such children. However, the concept of guardian data fiduciaries is not really put to use beyond such classification. They have no additional obligations, other than the bar on profiling, tracking and monitoring of children, targeting of advertising towards children, and other processing that may cause significant harm to children. It is not clear why this bar is limited to guardian data fiduciaries — are data fiduciaries that don’t meet this threshold then allowed to engage in these activities that may result in significant harm to children?
    • It may be more useful to equate guardian data fiduciaries to significant data fiduciaries under the PDP Bill — undertaking some of the additional compliances applicable to significant data fiduciaries, such as data protection impact assessments may help identify harms to children, and then ways to protect them, better.
  2. Second, the issue of differentiated ages of consent and the definition of what causes harm to children. With children of all ages using technology, and particularly online services, it is important to acknowledge that there is a sliding scale of sorts when it comes to harms to different age groups of children. This has been recognised by the National Commission for Protection of Child Rights, which provides separate guidelines for ‘older children’ in its guide to online safety of children. It is important in this context to look at children’s data protection under the PDP Bill from the larger lens of children’s safety, as well as the agency and rights of children. For instance, it is important that the consent of a guardian or parent is obtained before a 10-year-old is able to access an online service and provide personal data to the data fiduciary. However, there may be different concerns at play in protecting a 17-year-old, who may require privacy protections not only from the data fiduciary, but from parents and guardians as well. While the exception for counselling and child protection services is useful in this context, practical concerns regarding the accessibility of such services and the relevance of harm in these contexts need to be accounted for.
  3. Third, the absence of sensitive personal data in the discussion of children’s data. Although the title of Chapter IV indicates that it addresses sensitive personal data of children, Clause 16 itself makes no mention of sensitive personal data. There will undoubtedly be situations where the sensitive personal data of children will need to be processed by a data fiduciary. It is important that this is accounted for and addressed under the law.

How Press Note 2 will govern Ecommerce marketplaces in India

This post originally appeared on Medianama on February 04, 2019

On December 26, 2018, the Department of Industrial Policy and Promotion, Ministry of Commerce and Industry (DIPP) published Press Note 2 of 2018 (PN 2) which amends and tightens existing regulations applicable to e-commerce entities that have foreign direct investment. PN 2 will amend India’s existing Foreign Direct Investment (FDI) Policy, 2017, with effect from February 1, 2019.

While trying to ready themselves for compliance with the conditions under PN 2, some of the big marketplaces, such as Amazon and Flipkart, asked for an extension of time before it came into effect, since their business models would have to change substantially. Others, including the Confederation of All India Traders, welcomed the changes to the FDI Policy and opposed any extension of timelines for its implementation. However, with no extension of the implementation date despite multiple requests, the conditions prescribed in PN 2 are now in effect.

FDI Policy

The FDI Policy regulates the nature and amount of foreign direct investment permissible in Indian industry. It identifies how much foreign investment is permitted in companies in different sectors (in terms of percentage of equity), whether government approval is needed for such investment, and the conditions to be followed by companies that have such foreign investment.

In the case of e-commerce (defined as the buying and selling of goods and services, including digital products, over digital and electronic networks), the FDI Policy identifies two types of e-commerce undertakings. The first is inventory based, akin to a B2C model, and the second is marketplace based – also considered to be a B2B model.

An inventory-based model is one where the e-commerce company owns the goods and services, and sells them directly to the customer – no FDI is permitted at all in such companies.

A marketplace-based model is one where the e-commerce entity only provides the digital platform (on a digital / electronic network), and acts as a facilitator between the customer and the seller. Up to 100% foreign direct investment is permitted in such companies. However, certain conditions have been imposed on the way these entities can operate, in order to level the playing field between traditional brick and mortar stores and India’s growing tech / e-commerce industry, which has attracted billions of dollars in foreign investment over the past few years.

Under the existing FDI Policy, any e-commerce entities that have foreign direct investment are only allowed to undertake B2B transactions with the sellers registered on the marketplace. This includes provision of services such as warehousing, logistics, order fulfilment, payment collection etc. However, the marketplace entity is not allowed to exercise ownership over the inventory sold on the marketplace, or influence the sale price of the goods / services directly or indirectly. The responsibility for the sale, delivery of goods / services, and any warranty will remain solely with the seller.

The definition of e-commerce under the FDI Policy would cover most online services that we use today – including food delivery services, video and music streaming services, ride sharing and other ‘over the top’ services, in addition to the retail marketplaces that one immediately associates with the concept of ecommerce.

However, the FDI Policy provides a broad exemption from the conditions applicable to e-commerce entities, in the case of sale of services through e-commerce, a category under which most non-marketplace actors would fall. These service providers are required to comply with any conditions in the FDI Policy applicable to the services sector. For instance, entities that engage in broadcasting news / current affairs TV content are subject to stricter conditions under the FDI Policy. Therefore, if content streaming services (which can be considered to provide a service and not sell products to their customers) wish to stream content from news channels, certain conditions may be applicable in order to make them eligible to have FDI.

Press Note 2: Amendments to the FDI Policy

PN 2 now seeks to strengthen provisions applicable to the e-commerce sector. The DIPP (in a response issued in relation to media reports on PN 2) clarified that, despite the provisions existing in the FDI Policy, several complaints had been made that e-commerce entities were influencing sale prices or controlling inventory, and were indirectly violating the FDI Policy. In order to address this, PN 2 introduces a few important changes:

First, it looks at the different means of exercising control over the inventory. In addition to the bar on direct ownership of inventory by the marketplace e-commerce entity, PN 2 provides some additional conditions:
(a) if the marketplace e-commerce company or its group companies hold equity in a vendor, or control the inventory of a vendor, such vendor will not be allowed to register as a seller on the marketplace platform run by the e-commerce company; and
(b) a somewhat vaguely drafted condition – which many initially reported as a re-hash of existing conditions under the FDI Policy – providing that if more than 25% of the sales on a marketplace are made by one seller, that seller’s inventory will be deemed to be controlled by the marketplace e-commerce company, and such sales will therefore be barred. However, a more literal reading of the condition is that if a seller purchases more than 25% of its inventory from the marketplace e-commerce company or its group companies, its inventory will be deemed to be controlled by the e-commerce company.

It is likely that the DIPP intended the latter condition, in order to address market practices where, for instance, an affiliate / group company of marketplace entity X would sell significant quantities of retail products to certain sellers (on a B2B basis), and these products were then sold by those sellers on the marketplace platform run by X itself. In this case, the existing condition that an e-commerce company should not permit more than 25% of sales on the marketplace to be made by one vendor / its group companies will no longer apply.
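
The difference between the two readings of the 25% condition can be made concrete with a small sketch; the figures and function names below are hypothetical, used only to show how the same seller can fall on different sides of the line depending on the interpretation.

```python
def controlled_under_sales_reading(seller_sales: float, total_marketplace_sales: float) -> bool:
    """Reading 1 (as initially reported): a seller making more than 25% of all
    sales on the marketplace is deemed to have its inventory controlled by it."""
    return seller_sales > 0.25 * total_marketplace_sales

def controlled_under_purchase_reading(purchases_from_marketplace_group: float,
                                      seller_total_purchases: float) -> bool:
    """Reading 2 (the more literal one): a seller sourcing more than 25% of its
    inventory from the marketplace entity or its group companies is deemed
    to have its inventory controlled by it."""
    return purchases_from_marketplace_group > 0.25 * seller_total_purchases

# A hypothetical seller with 10% of marketplace sales, but 60% of its stock
# bought from the marketplace's group companies: compliant under the first
# reading, barred under the second.
print(controlled_under_sales_reading(10, 100))      # False
print(controlled_under_purchase_reading(60, 100))   # True
```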

Second, PN 2 additionally looks at the relationship between the marketplace e-commerce entity and the sellers registered on its platform. It provides that:
(a) the services that the e-commerce company provides to sellers should be fair and not discriminatory, i.e. the same terms of service should be offered to all vendors in similar circumstances;
(b) the services should be provided on an arm’s length basis;
(c) where any cashbacks are provided to buyers, such cashbacks should also be fair and non-discriminatory (presumably in terms of which sellers’ products these cashbacks are offered on);
(d) an e-commerce company cannot mandate that any seller sell its products exclusively on the company’s platform.

Third, in terms of compliance, PN 2 places an additional requirement of showing compliance with the FDI Policy on the marketplace e-commerce companies that have foreign investment. This must be done by providing a certificate and report of a statutory auditor to the RBI every year.

PN 2 was published in December 2018 with no notice / consultation, and has now come into effect as of February 1, 2019. Several questions about the new conditions were raised immediately on its publication. The DIPP then issued a response to these questions soon after, noting that PN 2 is only meant to reiterate existing policies on FDI in e-commerce companies, in a manner that allows better implementation. The response also indicates that these steps have been undertaken to ensure fair, competitive and transparent business practices in the interest of consumers. At the same time, it is clear that the DIPP is reacting to specific instances of (sometimes creative) business practices undertaken by marketplaces in order to grow business while complying with the existing FDI Policy (of 2017).

Over the last few years, we’ve seen this scenario repeat itself, and most new conditions in the FDI Policy can be traced back to specific practices by big e-commerce companies that are in the retail / marketplace business, and invariably have significant amounts of FDI.

The new conditions introduced in PN 2 are sometimes vague – for instance, the idea of ‘25% of inventory being purchased from the marketplace’ has been interpreted in different ways, as discussed above. We will also have to see how an auditor determines whether the services offered by a marketplace to its sellers are fair and non-discriminatory in similar circumstances – concepts that usually have a legal import and are left to the courts to determine. Establishing what qualifies as an arm’s length transaction will also be difficult in the absence of more direction from the regulator. Given that the larger marketplaces have close to 4 lakh sellers registered, it is not clear whether the goal of better implementation will be met through this audit exercise.

Apart from the practical difficulties of implementing the conditions of PN 2 and of compliance by the retail e-commerce industry, the larger issue this exercise raises is whether a DIPP-led FDI Policy is the right place to address the problems it is trying to solve.

If PN 2 has been issued to ensure fair, competitive and transparent business practices in the interest of consumers, then perhaps other avenues of regulation, such as competition law, would be more effective, given that consumer interests can be affected irrespective of foreign investment in a marketplace / retailer. This is a particularly interesting question, given that most well-known e-commerce marketplace entities (or groups) today have some FDI.

Data bill: Safe from prying govt?

This post first appeared in Deccan Herald on August 5, 2018

A committee of experts chaired by retired Justice BN Srikrishna was set up in July 2017 to recommend a legal framework for the protection of Indian citizens’ personal data. The committee was set up on the last day of court hearings on privacy, at a time when the Government of India was arguing that the right to privacy is not a fundamental right. This was seen by many as a last-ditch attempt by the government to convince the Supreme Court that statutory provisions are enough to protect individual privacy. The Supreme Court went ahead and reaffirmed that the right to privacy is indeed a fundamental right, and observed that in the context of personal information, this right needs to be protected by way of a statutory framework.

A year later, the committee has published an ambitious report, along with a draft bill, aiming not only to put in place a data protection law in India but also to set an example for other developing countries. The bill, even if enacted as is, would ensure much better protection of personal data than we have under current laws.

However, our current laws do not provide much by way of high standards to meet, and simply improving upon them is not enough. In the week since the publication of the report and bill, many problems have been picked up on already – some glaring and obvious, some that lie in the details. This has given cause for concern, in the absence of any clarity on whether there will be more public consultation on the bill or not.

Many of the issues in the committee’s report and the bill can be traced back to the terms under which it was set up – keeping in mind “the need to ensure the growth of the digital economy while keeping personal data of citizens secure and protected”.

A white paper published in November 2017, with the committee’s provisional views, clearly gave precedence to the “digital economy” part of it in several ways. The data protection principles that were recommended were internationally accepted but basic. The digital economy issues were seen as unique to India. Although this position was backed by several members of the technology and other industries during the consultations, others criticised the heavy focus on the digital economy and ease of doing business.

In the report published along with the bill, the committee seems to have made an attempt to correct that, by insisting that the concepts of ‘protection of personal data’ and a ‘free and fair digital economy’ are integral to each other – almost interchangeable. As pointed out by many over the last week, the State should aim to protect personal data because it is a fundamental right, inherently available to each of us.

This exercise should be undertaken irrespective of any cause or effect relationship such actions may have with the (digital) economy – fair and free or not. However, amidst all this, something more dangerous has emerged in the report – a deference to the State when it comes to data protection obligations.

The report seems to assume a landscape where the State is a promoter of a ‘free and fair digital economy’, and the function of the law is to regulate (mostly private) use of personal data in furtherance of such an economy. While this is definitely an important aspect, the consequence of this somewhat narrow focus is that the bill excludes from its purview many uses of personal data by the State that do not have a direct focus/impact on the economy.

Most important among these are the State’s surveillance efforts – that is, the collection and use of personal data in the interests of the security of the State and law enforcement (collectively referred to here as law enforcement). The bill provides an almost blanket exemption for such activities, as long as they are undertaken under the provisions of constitutionally valid laws – laws that do not currently exist or function in India.

It is true, as some have suggested, that government functions such as law enforcement and national security could, and maybe should, be governed by other, targeted laws. It is an approach taken by many other countries as well. However, this doesn’t have to be the case. A few issues to think of in this context are discussed below.

First, the government’s law enforcement function, as it relates to personal data (and much of it does), could be regulated to a large extent under the data protection law. The context and circumstances in which these law enforcement functions become imperative are larger issues, addressed under constitutional and other jurisprudence. However, the actual activity of collecting and using personal data could very well be addressed under a data protection law, with provisions suitably adapted for this purpose. As I have argued before, even with the bill as it stands, many procedural obligations could be made applicable to the government to safeguard individual privacy. The focus on promoting a ‘free and fair digital economy’ gives the committee a way out of taking this step.

Second, the need for high standards of data protection, especially in relation to surveillance activities, needs to be read in the context of historic and contemporary actions of the State.

At all levels of government, ranging from the local police to the central government, extensive surveillance activities have been undertaken over the years. In several instances, the Supreme Court has taken the government to task and put in place some minimal safeguards. However, with the rapid development of technology, and the implementation of mass surveillance programmes, these safeguards are nearly obsolete.

Edward Snowden’s revelations about US mass surveillance efforts shocked the world and have been discussed extensively in the Indian media. However, multiple research efforts suggest that India’s surveillance programmes are just as bad, if not worse. These research efforts also suggest that there is almost no oversight, transparency or accountability for surveillance activities in India.

A look at contemporary policy and law-making processes shows an equally heavy focus on mandating data localisation, that is, retaining at least a copy of personal data, if not all of it, in India. The Reserve Bank of India has already taken this step. The draft national e-commerce policy is said to do the same. National security, and the need for law enforcement to easily access personal data held in India, is often considered justification enough for these exercises.

Even the committee’s report, in its discussion on data localisation, barely references any actual benefits of such an exercise in the context of personal data protection. It relies almost entirely on arguments that further the cause of domestic industry and law enforcement.

The DNA Profiling Bill, which aims to maintain a registry of DNA profiles to be used to identify individuals in criminal investigations, is set to be introduced in parliament and is controversial. The government also intended to create a social media communication hub to monitor all social media communications in India – a move that caused much concern. This plan has only now been withdrawn after a petition was filed before the Supreme Court challenging it.

In the context of all these issues, the data protection bill could have gone a few steps ahead, even if it was to just provide us more information about the way personal data could be used for surveillance.

If the committee does want to set an example for the global South, it could do more under the data protection law, without falling back on the data protection practices of the West, where established legal processes for surveillance activities exist, even if their enforcement is questionable.

Data Protection Bill: How Can Individuals Exercise Their Rights

This post first appeared on News18 on July 30, 2018

On 27 July, a committee of experts led by Justice B N Srikrishna submitted its recommendations on a data protection law for India, along with a draft bill.

This committee was set up nearly a year ago, when the Supreme Court was hearing arguments about whether the right to privacy is a fundamental right. In August 2017, the Supreme Court issued a landmark judgment which upheld this right. The Court also recommended that the government put in place adequate laws to ensure that people can exercise this right.

In November 2017, the committee published a detailed white paper explaining its views on what the data protection law should look like – a comprehensive law, that applies across industries, to both the private sector and the government.

The committee called for public comments and held four public consultations to discuss issues highlighted in the white paper. Several organisations and individuals have submitted comments; however, these comments have not been published by the committee. The committee has now submitted and published its report and the bill drafted as a part of this process.

What kinds of data are protected?

The committee has recommended that the law have a wide scope of application. The bill applies in relation to the use of two primary types of information – personal data and sensitive personal data.

Personal data is any data about or relating to a natural person who can be directly or indirectly identified by such data. This ‘identifiability’ could be with respect to any characteristic, trait, attribute or other feature of the identity of a person, or any combination of such features and other information. The law doesn’t apply to anonymised data i.e. data that has been modified such that it no longer identifies a person, so long as the anonymisation is irreversible.

The second category of data is ‘sensitive personal data’, a subset of personal data which calls for a higher standard of processing. This includes information like passwords, financial data, health data, biometric data, sexual orientation, official identifiers, sex life, genetic data, transgender status, intersex status, caste or tribe, and religious and political affiliations/beliefs.
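
To show how this two-tier approach plays out, a minimal classification sketch follows; the category labels are paraphrased from the bill and the function itself is purely illustrative, not part of any proposed regulation.

```python
# Categories the bill lists as 'sensitive personal data' (paraphrased labels).
SENSITIVE_CATEGORIES = {
    "password", "financial_data", "health_data", "biometric_data",
    "official_identifier", "sex_life", "sexual_orientation", "genetic_data",
    "transgender_status", "intersex_status", "caste_or_tribe",
    "religious_belief", "political_belief",
}

def protection_tier(category: str, irreversibly_anonymised: bool) -> str:
    """Illustrative tiering: irreversibly anonymised data falls outside the law;
    listed categories call for a higher standard of processing; everything
    else is ordinary personal data with the standard obligations."""
    if irreversibly_anonymised:
        return "outside the law"
    if category in SENSITIVE_CATEGORIES:
        return "sensitive personal data: higher standard (e.g. explicit consent)"
    return "personal data: standard obligations"
```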

If your personal data is collected, disclosed, shared or processed in India, you – a data principal – should be protected under the law. If you are an Indian company, citizen, or person/body of persons incorporated or created under Indian law, and process personal data, the law will be applicable to you – a data fiduciary. This also extends to the Indian government. If you are in India, and your personal data is processed by any person outside India, the law should be applicable to such processing as well.

How is this data protected?

As a rule, personal data can only be processed if your consent has been obtained. You must be given notice of details such as the kind of data collected, what it will be used for, how long it will be stored, and whether the data will be shared or transferred to others. Your consent must be given freely, and you should be able to withdraw this consent. In cases where sensitive personal data, or personal data of children, is being processed, the requirements for consent are higher.

This data can only be used for a lawful, specific purpose. Only limited data that is necessary for this purpose must be collected, and this data must only be stored for as much time as is necessary for such purpose. The data fiduciary must also ensure that the personal data they process is accurate and complete – this is particularly important in situations where the personal data is used to make decisions about an individual.

This means, for example, that companies cannot collect data that is not required for the service they are providing. They also cannot deny service if you refuse to give consent for the collection of data that is not required for the service.
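
A hedged sketch of the collection-limitation idea in the two paragraphs above; the service and field names are invented for illustration and do not come from the bill itself.

```python
# Hypothetical declaration of the data a food-delivery service actually needs.
REQUIRED_FOR_SERVICE = {"name", "delivery_address", "phone_number"}

def fields_beyond_purpose(requested_fields: set) -> set:
    """Data minimisation: anything requested beyond the declared purpose
    should not be collected at all."""
    return requested_fields - REQUIRED_FOR_SERVICE

def may_deny_service(refused_field: str) -> bool:
    """Service may be conditioned only on data genuinely required for the
    declared purpose; refusing consent for anything else is not a valid
    ground for denying the service."""
    return refused_field in REQUIRED_FOR_SERVICE

# e.g. a delivery app requesting access to the user's contact list exceeds the
# purpose, and refusing that request cannot justify denying delivery.
print(fields_beyond_purpose({"name", "delivery_address", "phone_number", "contact_list"}))
print(may_deny_service("contact_list"))  # False
```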

In addition, data fiduciaries have certain general obligations – for instance, they must maintain minimum security standards, incorporate privacy measures into the design of their technology and practices, be transparent about their data practices, and notify the authorities of any data breach.

How can individuals exercise their rights?

The bill provides three primary means by which you can exercise your rights. First, there is a set of ‘data principal rights’; second, a grievance redressal mechanism; and third, like many other laws, an enforcement mechanism, which provides for penalties, compensation and criminal offences.

The data principal rights include the right to confirm that a data fiduciary has your personal data, and to access and correct or update your data. They also include the right to data portability – allowing you to transfer data between service providers – and to restrict continuing disclosure of your personal data in certain cases, i.e. the right to be forgotten.

Data fiduciaries must maintain grievance redressal mechanisms for complaints about violations. Where this doesn’t work, a complaint can be made to the adjudicating officer appointed under the data protection authority to be established under the bill.

What’s next?

The committee’s report and draft bill are a significant step towards an effective data protection regime in India. The report, in fact, is more elaborate than the bill and provides insight into what the data protection law could/should be.

However, even with the expansive individual rights discussed here, many issues have already been raised in the two days since the bill has been published.

For instance, consider the situations in which personal data can be processed, either by private actors or the government, without obtaining an individual’s consent.

These involve legitimate considerations such as the need for data in a medical emergency. But some other exceptions appear to be too broad on first impression. For example, consent is not required in some cases where an employer is processing employee data, and in many situations where the government is processing personal data. The data protection authority may also identify ‘reasonable purposes’ for which data can be processed without consent.

There are also broad exemptions from most of the law for government processing of data for security of the state and law enforcement.

It appears that between the exceptions granted to the government, and the power given to the data protection authority, there is a possibility that protection granted under the bill will be diluted, or in some cases difficult for individuals to access.

In addition, the bill also proposes some amendments to the Right to Information Act, 2005 that may provide cause for concern.

The report and the bill will now need to be submitted by the Ministry to the Cabinet, and then to Parliament. There have already been several calls for public consultations to be held on the bill through this process, including by a member of the committee itself. A strong data protection law is necessary to protect the right to privacy as recognised by the Supreme Court. It will be important to ensure that the concerns already expressed, as well as any issues that emerge from more detailed examination of the report and the bill, are taken into consideration at each of these stages.

The Draft Data Protection Bill is a Step Forward – but Nowhere Close to What India Needs

This post originally appeared on Scroll.in on July 29, 2018

The Justice BN Srikrishna-led Committee of Experts on Data Protection on Friday submitted its report and a draft bill to the Ministry of Electronics and Information Technology. The Committee’s exercise, although criticised for its opacity, is the most comprehensive undertaking on data protection yet, by any government agency in the country.

The committee was set up last year to “study various issues regarding data protection in India”, in the wake of the Supreme Court’s August 2017 ruling upholding the right to privacy as a fundamental right.

For months, there has been constant speculation on what these recommendations would mean for consumer data, the government’s Aadhaar programme, and the way law enforcement agencies undertake surveillance in India. With the final report and the draft bill in hand, we take a look at how the citizen-state relationship could be affected, should this Bill pass.

Notable exceptions

The draft Bill says that anyone – be it the government, a private company, a citizen, or a person or body of persons – who seeks to process personal data needs to do so in a “fair and reasonable” manner that safeguards the individual’s privacy. Among other things, the Bill states that only limited personal data should be collected, for a clear, specific and lawful purpose, and that individuals should be notified of the kind of data that has been collected. Barring certain situations, personal data should be collected and processed only with the explicit consent of the individual.

The Bill exempts the State from some of these obligations in two broad contexts. First are the situations where personal data needs to be processed (collected, stored, used, disclosed or shared) by the government for one of the following reasons:

  1. For the functioning of Parliament or State Legislatures
  2. For providing individuals with any service or benefit
  3. For issuing any certification, license or permit.

In the aforementioned situations, the other safeguards listed in the data protection law are largely applicable, but the government does not need to take the consent of individuals before collecting or processing their personal data.

The second involves situations where personal data needs to be processed in the interests of the security of the country, or for the prevention, detection, investigation and prosecution of any violation of law. Here, the government is given a wide exemption from the data protection law. However, data processing in these situations must be authorised by a separate law passed either by Parliament or a State Legislature, and the processing itself should be necessary for, and proportionate to, the purpose for which the data is processed.

While this may sound reasonable, the problem is that most surveillance activities currently undertaken in India, whether for national security or law enforcement, are not authorised by way of a law.

State of surveillance

The Telegraph Act, 1885 and the Information Technology Act, 2000 are the two laws that deal with surveillance. Both contain some provisions that allow for the monitoring and interception of phone calls or communications subject to certain rules. These rules only allow for targeted interception of communication in specific situations. The Information Technology Act also allows for monitoring and collection of network traffic or information in the interest of cyber security – however, this does not include surveillance of actual communication.

The Indian government and its intelligence agencies, however, run several surveillance programmes that are not undertaken under these laws and, in many cases, permit mass surveillance. There is very little information about these programmes, and it is not clear when, how or why any personal data is collected for these purposes. In addition to these traditional surveillance programmes, we have seen several reports about the government’s controversial plans to monitor social media communications.

These activities have naturally led to several concerns regarding the privacy of individuals. Given that these surveillance and monitoring activities rely on personal information, it was hoped by many that a law that deals with protection of personal data would address some of these concerns.

The Committee’s report recognises these concerns and acknowledges that the current framework in India lacks sufficient legal and procedural safeguards for individual civil liberties. It admits that the majority of intelligence-gathering takes place outside the remit of the law and there is a lack of meaningful oversight. It discusses the need for narrow, watertight exemptions that are subject to adequate safeguards. The report also notes the need for transparency and accountability in the context of surveillance, and discusses different oversight mechanisms adopted in other countries – judicial, parliamentary, or even a combination of both.

It then goes on to recommend that the State enact a suitable law applicable to intelligence and surveillance activities, while providing that once such a law is in force, all data processing for the purposes of state security and law enforcement will be exempt from the data protection law. This is where concerns arise.

The only provisions under the data protection law that are applicable to such actions are those requiring that the data is processed in a “fair and reasonable” manner, and that minimum security safeguards are adopted.

Much to be done

It has been argued that the role of this Committee and a data protection law is limited and cannot be used to govern intelligence and law enforcement activities. Even if this is accepted, there are certain provisions within the Bill itself that could have been extended to processing of personal data for state security and law enforcement.

For instance, the draft Bill says that data fiduciaries (the entities controlling the collection and processing of personal data) can collect and process personal data for limited purposes only, have to ensure the accuracy of the data, and must store such data only as long as necessary. These obligations could have been made applicable even in the case of security and law enforcement activities.

The Committee’s report recognises that accountability is an important factor in ensuring that surveillance is lawful. However, this position is not reflected in the Bill. For instance, the Bill requires those who collect data to ensure transparency and accountability through various measures, including the maintenance of records and regular audits. These measures could have been made applicable to security and law enforcement situations too, with modifications where necessary.

The rights of individuals to be notified when the government processes their personal data and to access that data could also be available on a conditional basis, once the surveillance activities are completed.

Some of these concerns also extend to the first category of exemptions. To begin with, the government is not required to obtain the consent of those whose data it is processing. In addition, the Bill states that some of the transparency and accountability measures laid out in the draft need only be implemented by notified “significant data fiduciaries”. The data protection authority may or may not categorise government agencies as “significant data fiduciaries”.

The Committee’s report recognises that the relationship between data principals and the State involves a power imbalance. But this imbalance is only furthered if the State is not required to be accountable to its citizens, especially with regard to their personal data.

Given the low bar set by existing standards, it can be argued that this is a step forward with respect to protecting rights from the surveillance apparatus in the country. However, it is not close to the kind of surveillance reform that is needed to ensure that the right to privacy is protected adequately.

The Personal Data Protection Bill, 2018

After months of speculation, the Committee of Experts on data protection (“Committee”), led by Justice B.N. Srikrishna, has submitted its recommendations and a draft data protection bill to the Ministry of Electronics and Information Technology (“MEITY”) today. As we sit down for some not-so-light weekend reading to understand what our digital futures could look like if the committee’s recommendations are adopted, this series puts together a quick summary of the Personal Data Protection Bill, 2018 (“Bill”).

Scope and definitions

The Committee appears to have moved forward with the idea of a comprehensive, cross-sectoral data protection legislation that was advocated in its white paper published late last year. The Bill is meant to apply to (i) the processing of any personal data, which has been collected, disclosed, shared or otherwise processed in India; and (ii) the processing of personal data by the Indian government, any Indian company, citizen, or person / body of persons incorporated or created under Indian law. It also applies to any persons outside of India that engage in processing personal data of individuals in India. It does not apply to the processing of anonymised data.

The Bill continues to use the two-level approach in defining the types of information that the law applies to. However, the definitions of personal data and sensitive personal data have been expanded significantly compared to the definitions in our current data protection law.

Personal data includes “data about or relating to a natural person who is directly or indirectly identifiable, having regard to any characteristic, trait, attribute or any other feature of the identity of such natural person, or any combination of such features, or any combination of such features with any other information”. The move towards relying on ‘identifiability’, read together with the definition of ‘anonymisation’, which focuses on the irreversibility of the anonymisation process, is welcome, given that section 2 clearly states that the law will not apply to anonymised data. However, the ability of data processors and the authority to determine whether an anonymisation process is irreversible in practice will need to be examined before the authority sets out the criteria for such ‘anonymisation’.

Sensitive personal data, on the other hand, continues to be defined in the form of a list of categories, albeit a much more expansive one, which now includes information such as official identifiers, sex life, genetic data, transgender status, intersex status, caste or tribe, and religious or political beliefs and affiliations.

Interestingly, the Committee has moved away from traditional data protection terminology such as ‘data subject’ and ‘data controller’, arguing instead that the relationship between an individual and a person or organisation processing their data is better characterised as a fiduciary relationship. Justice Srikrishna emphasised this point at the press conference organised at the time of submission of the report, noting that personal data is not to be considered property.

Collection and Processing

The Bill elaborates on the notice and consent mechanisms to be adopted by ‘data fiduciaries’, and accounts both for data that is collected directly from the data principal and for data that is obtained via a third party. Notice must be given at the time of collection of personal data or, where data is not collected directly, as soon as possible thereafter. Consent must be obtained before processing.

The Committee’s earlier white paper and the report accompanying the Bill both discuss the pitfalls of a data protection framework that relies so heavily on consent, noting that consent is often neither informed nor meaningful. The report also notes, however, that it may not be feasible to do away with consent altogether, and tries to address this issue by adopting higher standards for consent and purpose limitation. The Bill also provides that consent is only one of the grounds for processing personal data. However, this seems to result in some catch-all provisions allowing processing for ‘reasonable purposes’. While it appears that these reasonable purposes may need to be pre-determined by the data protection authority, the impact of this section will need to be examined in greater detail. The other such wide provision in this context appears to allow the State to process data – another provision that will need more examination.

Sensitive personal data

Higher standards have been proposed for the processing of sensitive personal data, as well as personal / sensitive personal data of children. The emphasis on the effect of processing of certain types of data, keeping in mind factors such as the harm caused to a ‘discernible class of persons’, or even the provision of counselling or child protection services in these sections is welcome. However, there remains a wide provision allowing for the State to process sensitive personal data (of adults), which could be cause for concern.

Rights of data principals

The Bill also proposes four sets of rights for data principals: the right to confirmation and access, the right to correction, the right to data portability, and the right to be forgotten. There appears to be no right to erasure of data, apart from a general obligation on the data fiduciary to delete data once the purpose for collection or processing has been met. The Bill proposes certain procedural requirements to be met by the data principal exercising these rights – an issue which some have already pointed out may be cause for concern.

Transparency and accountability

The Bill requires all data fiduciaries to adopt privacy by design, transparency and security measures.

Data fiduciaries may also be required to appoint a data protection officer, conduct data protection impact assessments before adopting certain types of processing, maintain records of data processing, and conduct regular data protection audits. These obligations apply to those notified as ‘significant data fiduciaries’, a categorisation that depends on criteria such as the volume and sensitivity of personal data processed, the risk of harm, the use of new technology, and the turnover of the data fiduciary.

The requirement for data protection impact assessments is interesting – an assessment must be conducted before a fiduciary undertakes any processing involving new technologies, large-scale profiling, or the use of sensitive personal data such as genetic or biometric data (or any other processing that carries a risk of significant harm to data principals). If the data protection authority believes, based on the assessment, that such processing may cause harm, it may direct the fiduciary to cease the processing or impose conditions on it. The language here implies that these requirements could apply to processing by the State or private actors where new technology is used in relation to Aadhaar, among other things. However, as mentioned above, this will be subject to the data fiduciary in question being notified as a ‘significant data fiduciary’.

In a welcome move, the Bill also provides a process for notification in the case of a breach of personal data by data fiduciaries. However, this requirement is limited to notifying the data protection authority, which then decides whether there is a need to notify the data principal involved. It is unfortunate that the Committee has chosen to limit the rights of data principals in this regard, making them rely instead on the authority to even be notified of a breach that could potentially harm them.

Cross border transfer of data

In what has already become a controversial move, the Bill proposes that at least one copy of all personal data covered by the law should be stored on a server or data centre located in India. In addition, the central government (not the data protection authority) may notify additional categories of data that are ‘critical’ and should be stored only in India.

Barring exceptions in the case of health / emergency services, and transfers to specific international organisations, all transfer of personal data outside India will be subject to the approval of the data protection authority, and in most cases, consent of the data principal.

This approval may be in the form of approval of standard contractual clauses applicable to the transfer, or a blanket approval of transfers to a particular country / sector within a country.

This provision is ostensibly in the interest of the data principals, and works towards ensuring a minimum standard of data protection. The protection of the data principal under this provision, like many other provisions, including those relating to data breach notifications to the data principal, will be subject to the proper functioning of the data protection authority. In the past, we have seen that simple steps such as notification of security standards under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011, have not been undertaken for years.

In the next post in this series, we will discuss the functions of the authority, and other provisions in the Bill, including the exemptions granted, and penalties and remedies provided for.

Dealing With Revenge Porn in India

Also see: Legal Remedies for Online Harassment in India

By Yesha Paul

In March 2018, a man in West Bengal was sentenced to five years’ imprisonment and fined Rs 9,000 for uploading private pictures and videos of a girl without her consent, as revenge for her ending their relationship. Under the promise of marriage, the accused pressured the complainant into providing explicit images of herself, and used threats to upload these pictures to social media to extract more images from her. Later, he accessed her phone without her knowledge to retrieve more private pictures and videos. When the complainant refused to continue their relationship, he uploaded this material to a popular pornographic website along with her name and her father’s name. In addition to the defendant’s imprisonment and fine, the state government was directed to treat the victim as a survivor of rape and grant appropriate compensation. With evidence provided by service providers Reliance Jio and Google, the perpetrator was convicted under Sections 354A, 354C, 354 and 509 of the IPC as well as Sections 66E, 66C, 67 and 67A of the IT Act, in what is likely the first revenge porn conviction in India.

Revenge porn is a form of non-consensual pornography that came to international attention with the 2010 launch (and subsequent 2012 takedown) of the popular website IsAnyoneUp, which allowed users to upload nude photographs. While a number of these images were ostensibly self-submitted, many were revealed to have been submitted by angry ex-lovers, which would amount to ‘revenge porn’. Compounding the issue was the fact that these explicit images deliberately linked to the social media profiles of the person in the image.

According to Halder and Jaishankar, the essential elements of revenge porn are that the perpetrator and the victim shared an intimate relationship, and that the former has deliberately (and without the victim’s consent) released sexually explicit information online in order to cause distress and harm to the victim’s reputation.

While “revenge porn” is often used interchangeably with the term “non-consensual pornography”, it is distinct from other forms of non-consensual pornography such as rape videos, morphing or voyeurism. Non-consensual pornography includes within its ambit sexually explicit images captured without a person’s knowledge or consent, whereas revenge porn typically involves material that was voluntarily captured by, or sent in good faith to, the perpetrator in the course of an intimate relationship. Further, unlike in the case of revenge porn, not all perpetrators of non-consensual pornography are motivated by personal feelings such as revenge (as in the case of the hackers who released intimate photos of more than 100 female celebrities after gaining access to their private iCloud accounts).

As a result, researchers are moving away from the term “revenge porn” as it can be somewhat misleading. “Revenge” limits the scope of this offence to motivations of personal vengeance, whereas such an act could be motivated by a desire for profit, notoriety, entertainment, or no reason at all. “Porn” implies that all images of nudity are intrinsically pornographic. Sexually explicit images created and shared within a private relationship should not be considered pornographic, unless they are distributed without consent, as this results in a private image being converted into public sexual entertainment. Accordingly, many victim advocates prefer to use the term “non-consensual pornography” or non-consensual sharing of intimate images.

Although the National Crime Records Bureau documents cyber-crimes against women, there are no official statistics available that pertain specifically to revenge porn in India. A 2010 report suggests that “only 35 per cent of the women have reported about their victimization, 46.7 per cent have not reported and 18.3 per cent have been unaware of the fact that they have been victimized … women prefer not to report about their victimization owing to social issues.” Victim-shaming (both by the criminal justice system and the public at large) is common, and the potential social fallout often extends to the victim’s family as well.

The recent surge in revenge porn has prompted many countries to enact legislation criminalising it. These include the UK, many states in the USA, Canada, Australia, Japan and the Philippines.

At present however, there are no legal provisions that directly address revenge porn in India. While certain sections in the Indian Penal Code (IPC) and Information Technology (IT) Act can be invoked by victims, they fail to fully encompass the complexity of such cases and do not specifically target non-consensual pornography published online.

Section 354C of the IPC makes voyeurism punishable, and Explanation 2 to the Section deals with the non-consensual dissemination of consensually-captured images. However, this section limits its scope to female victims and male offenders.

In cases of non-consensual pornography (particularly those that involve morphing), victims can also seek recourse under Section 499 of the IPC for criminal defamation.

Section 66E of the IT Act punishes the transmission of images depicting the private areas of a person. The Explanation to the section limits “private area” to “… the naked or undergarment clad genitals, pubic area, buttocks or female breast”. This provision is gender-neutral and covers much of the conduct involved in revenge porn, even though it does not address it by name. However, the narrow definition of “private areas” could limit the applicability of the provision in cases where the victim is captured in an intimate position without those particular areas being visible.

Section 67A of the IT Act punishes the publication or transmission of “material containing sexually explicit acts, etc. in electronic form”. While this can effectively punish perpetrators, it also risks including within its ambit victims who may have voluntarily captured and shared such private content with their partners.

The recent Supreme Court judgment recognising privacy as a fundamental right could have substantial implications for revenge porn and non-consensual pornography in general, in light of arguments recognising the right to bodily integrity. Copyright law could also potentially be used by victims, particularly when the content is a selfie. By claiming a violation of their copyright, a victim could potentially have such material taken down. While Indian copyright law does not presently provide any specific relief to victims of revenge porn, some victims in the US have successfully enforced their copyright to get such images taken down.

Social media platforms are often used to disseminate such content. Until recently, their role was limited to removing non-consensual pornography and other offensive images. However, there have been calls for them to play a more active role and filter this content before it is uploaded. Facebook has attempted to prevent the re-posting of revenge porn by cataloguing content that has previously been reported as revenge porn on its site.

The gender disparity in victims of non-consensual pornography is a reflection of the hostility still faced by women on the internet today. Involuntary porn can be considered “the sexual harassment of 20 years ago. It’s an easy way to punish women for behaving in ways that they don’t approve of – for leaving them, for rejecting them, for being happy without them.”

Also see: Legal Remedies for Online Harassment in India

Yesha is a Programme Officer at the Centre for Communication Governance at National Law University Delhi

Untangle The Privacy Law

An overview of India’s efforts to frame the data protection Bill and what has been achieved so far

This post first appeared on Business World on May 06, 2018

Over the last week of July, the right to privacy and data protection once again made headlines in India. On 29 July, reports stated that the proprietors of three small IT companies were arrested in connection with the leak of personal information of over 8 lakh students.

A day before that, on 28 July, the chairman of the Telecom Regulatory Authority of India (TRAI), published his Aadhaar number, challenging security researchers to show what harm could be done to him using this number. There has been much debate on the Aadhaar project and its impact on privacy over the past few years. This “challenge” was apparently issued to show that “Aadhaar does not contribute to increasing any of your other digital vulnerabilities”.

Researchers were able to find several pieces of personal information about the TRAI chairman, while pointing out that the availability of such information might not harm him, but could be harmful to others.

On 27 July, the long-awaited draft data protection bill and an accompanying report were submitted to the Ministry of Electronics and Information Technology (MEITY) by a committee of experts.

The incidents of data leaks, and the “challenge” issued by a senior member of government, go to show the state of disarray in the general discourse on the value of technology and data. On the other hand, the data protection bill brought with it hope that we could soon be in the league of countries that actively work towards protecting the privacy of their people – an important objective at a time when the constant emphasis on rapid ‘innovation’ seems to disregard the need to plan for the long-term protection of individuals and their rights.

Right to privacy and data protection

The disparity seen above is reflected in the state of legal protections currently afforded to privacy and data protection in India.

The most comprehensive data protection rules we currently have fall under the Information Technology Act, 2000. The rules, in force since 2011, are in many instances inconsistent and difficult to enforce. In addition, we have specific provisions applicable to certain sectors, such as the financial and telecom sectors.

In August 2017, the Supreme Court affirmed that the right to privacy is a fundamental right in its judgement in the Puttaswamy case. The Court rejected the Indian government’s argument that privacy is not a fundamental right. It also recommended that the government put into place a statutory framework to protect this right as it relates to personal information.

Committee of experts

In the meantime, MEITY set up a committee of experts chaired by former Supreme Court Judge B.N. Srikrishna. The committee was asked to recommend principles to be considered for data protection in India and suggest a draft data protection bill.

In the wake of criticism regarding the constitution of the committee, as well as the lack of transparency in its functioning, a white paper with the committee’s provisional views was published in late November 2017. Public comments were invited, and consultations were held in January 2018 to discuss the issues raised in the white paper. The committee has not provided any information regarding the number or nature of comments received.

With privacy issues such as the Facebook-Cambridge Analytica scandal dominating the news cycle over the past few months, the committee’s reports have been waited on with much anticipation.

The Personal Data Protection Bill, 2018

The bill, as recommended by the committee, is applicable to the collection and processing of personal data by the State and the private sector.

It adopts an approach similar in structure to the European Union’s General Data Protection Regulation (GDPR). The GDPR, which came into effect in May 2018, found its final form after several years of debate, and is widely considered to be one of the most comprehensive data protection laws in effect today.

Several countries have adopted laws that take guidance from the GDPR, and it is fast becoming the standard for data protection laws across the world.

In this context, the structure of the bill is a welcome move, since it aims to use high benchmarks to protect the data of individuals in India, while also adopting international standards that most data processors who work across borders are subject to.

The bill protects a wide range of information that can be used to identify individuals, calling for a higher standard of protection where the information is sensitive and can potentially be used to cause greater harm. Individuals must be notified, in detail, of the collection and processing of their personal data, and consent must be obtained for processing for specific purposes. Individuals have the right to withdraw their consent, and to access and correct or update personal data held by processors. The bill also proposes transparency and accountability measures to be adopted by data processors.

However, while the bill ostensibly enables individuals to exercise their right to privacy against the State and the private sector, there are several concerns in this regard.

It grants broad exceptions to the State, in some cases in the context of specific obligations such as the requirement for individuals’ consent. In other cases, State action is almost entirely exempted from obligations under the law. In the context of the private sector, while the obligations in relation to data processing have been articulated in detail, several provisions allow for the possibility of making compliance with these obligations difficult to assess or enforce.

Sectoral regulators and data protection

The Justice Srikrishna Committee’s work with MEITY is one of several government processes that are now addressing issues relevant to personal data, among other things.

Some of the other important processes are the TRAI’s recommendations on privacy, security and ownership of data in the telecom sector, and the Ministry of Health & Family Welfare’s draft Digital Information Security in Healthcare Act (DISHA). The RBI’s recent rules mandating data localisation are also relevant. Reports also suggest that the upcoming national e-commerce policy framework will follow in this direction.

The TRAI’s recommendations touch upon a wide range of data protection issues relevant to the telecom sector. Interestingly, the TRAI has chosen to address not only telecom service providers but all actors in the digital ecosystem, for instance providers of devices, browsers, operating systems, and over-the-top services. Many of these service providers are outside the reach of the TRAI, or even of the Department of Telecommunications’ powers.

DISHA aims to regulate the processes related to digital health data, including the ‘reliability, data privacy, confidentiality and security of digital health data’. This is to be done by the national and state e-health authorities to be set up under the Act.

The TRAI recommendations and DISHA touch upon a wide range of data protection issues, while providing little detail on actual compliance.

It is not clear whether the Justice Srikrishna committee has taken these processes into account – there is no mention of the TRAI recommendations in the report. DISHA is referred to only once, as an example of a sector-specific law that imposes a higher standard, without any analysis of the differences between the two laws. There does not seem to be any mention of the RBI’s data localisation directive either. However, many of the issues touched upon in these frameworks are reflected in the committee’s report – some, such as data localisation, by their controversial presence, and others, such as the idea of ownership of personal data, by intentional absence.

In the days following its submission, it has become apparent that although the direction the bill takes in terms of protection of personal data deserves some praise, there is much work to be done. Several issues – ranging from typographical errors and inconsistencies in drafting, to conceptual problems affecting constitutionally guaranteed fundamental rights – need to be addressed.

At the press conference held during the submission of the report and bill, the IT Minister stated that extensive consultations will be held. It is important that these consultations be as inclusive as possible. While many issues have been raised by different stakeholders, finding the right solutions will require collaboration.

The final form that this bill takes could very well define the way we, as individuals, interact with companies and government agencies, across the board. It is important for each of us, that we get this right.