Cyber Diplomacy: Towards A New Cybersecurity Strategy

Cyberspace has become a focal point of international relations. With most global powers having realized that cybersecurity is integral to their national security, cyber issues have found their place in foreign policy, resulting in the emergence of cyber diplomacy.

Cyber diplomacy is the use of traditional diplomatic tools, including negotiations, the formation of alliances, treaties, and agreements, to resolve issues that arise in cyberspace. The United Nations Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security (UN GGE) is one of the most high-profile cyber diplomacy exercises at the global level. The UN GGE was formed after digital security was adopted as a UN agenda item, to examine threats emanating from cyberspace and to develop appropriate cooperative measures to address them. Several multilateral organizations, such as NATO, ASEAN, and BRICS, are also increasingly serving as platforms for cyber diplomacy. This post briefly explores the role of cyber diplomacy in enabling cybersecurity by analyzing the relevance of a few major cyber diplomacy efforts in developing a sustainable and stable cyberspace.

The Role of Cyber Diplomacy

Society’s increasing reliance on the internet and digital technologies is accompanied by security challenges in the form of various malicious activities, including hacking, espionage, cyber attacks, and cyber war. These challenges arise from a domain that lacks a formal, institutionalized regime to regulate and oversee the conduct of its actors. Unless there is a global consensus on regulating cyberspace, the potential to wreak havoc remains unbridled. Given the transnational nature of cyberspace, a secure cyber environment can be established only through global engagement, dialogue, and cooperation, making cyber diplomacy the only viable means to achieve this goal. Diplomatic efforts to stabilize cyberspace have primarily focused on three areas: the establishment of cyber norms, confidence building measures (CBMs), and capacity building.

Norms in Cyberspace

The increasing exploitation of cyberspace by states for political and military objectives creates a need for norms that lay down what states can and cannot do online. Cyber norms are voluntary guidelines adopted by states to promote stability in cyberspace. Establishing these norms would help develop a shared understanding among states on how to work together on matters of mutual concern. Moreover, continued observance of these norms, whether created through practice or formal agreements, will help them gain legitimacy among states, gradually resulting in their evolution into international law. The norm that cyber-enabled theft of intellectual property for commercial gain is unacceptable developed out of a US-China bilateral agreement, and is an example of a successful norm that has gradually gained recognition among other states and the G20.

Norms are non-binding guidelines for the conduct of relevant actors, with an element of good-faith commitment and limited consequences in the event of non-compliance. Treaties, on the other hand, are binding agreements that are readily enforceable. Although norms seem weaker than treaties, they can have a powerful impact. When nuclear weapons were developed, they were simply considered a more powerful form of traditional weapons until norms against their use developed, making their use unthinkable in ordinary circumstances. Creating norms could, over time, help establish benchmarks for acceptable behavior in the cyber domain.

Challenges to Norm Creation

Developing cybersecurity norms is extremely challenging due to the unique nature of cyberspace, the diverse interests of the parties, and the broad scope of the issues involved. The use of contrasting terms – cyber security by the US and its allies, and information security by the Sino-Russian bloc – indicates the difference in what each group perceives as a threat. While the former focuses on protecting data and hardware from unauthorized access, the latter focuses on the content of information itself, which runs counter to the idea of an Open Internet advocated by the former. Unless these radically incompatible perceptions of the very concept of security in cyberspace are reconciled, the process of norm creation is likely to remain stalled.

Confidence Building Measures in Cyberspace

While norms help establish acceptable behavior in cyberspace, the difficulty of forming cyber norms calls for an alternative means to defuse distrust and misunderstandings among states. CBMs have emerged as that solution. CBMs are measures adopted at the regional and global levels that enhance transparency and facilitate the exchange of information, helping states assess each other’s activities, understand their intentions, and thereby reduce the risk of a cyber war. For instance, the practice of transparency enables states to distinguish between defensive and offensive cyber investments by enhancing situational awareness and building common understanding.

Furthermore, CBMs are instrumental in ensuring effective compliance with norms. The norm according to which states should not knowingly allow their territories to be used for unlawful acts using information and communication technologies (ICTs) requires states to employ all their instruments to this end. However, proving such knowledge is difficult. In such instances, information exchange and cooperation during investigations help determine compliance. Such CBMs also aid states in implementing the norm by enhancing capacity. In the absence of CBMs, cyber norms would merely provide an illusion of stability.

Capacity Building in Cyberspace

Not all states stand on an equal footing in terms of their cyber capacities, especially new entrants to the cyber domain. However, it is necessary to ensure that all states have at least the baseline capacity to participate in the development and implementation of norms and CBMs and to protect their critical information infrastructure. The 2015 UN GGE report also recognized the link between compliance with norms and CBMs on the one hand and capacity building on the other. Cyber diplomacy can help enhance the human, institutional, technological, and legal capacities of states through formal and informal agreements.

The Way Forward

Development of cyber norms has proven to be difficult. With the breakdown of the UN GGE, the only venue that brought together the Sino-Russian and the Western blocs for norm discussion, prospects for the formation of norms in the near future appear to be slim.

CBMs appear to be the most promising avenue for establishing stability in the cyber domain, since they do not require states to agree on a shared set of principles but instead focus on fostering cooperation despite differences, given that states share an interest in stability. Bilateral engagement would be the ideal platform to deepen cooperation and establish CBMs. A few of the more successful bilateral agreements between opposing global powers have resulted in effective CBMs, such as real-time communication and assistance, to compensate for limited trust.

With the effective implementation of CBMs, there is hope for the gradual development of norms through the establishment of trust and the elimination of misunderstandings, and thereby for a safe and secure cyberspace.

The Proposed Regulation of DNA Profiling Raises Critical Privacy Concerns

The Union Cabinet recently approved the DNA Technology (Use and Application) Regulation Bill, 2018 (“DNA Profiling Bill”), which is scheduled to be introduced in Parliament today (31st July). The Bill is largely based on the 2017 Law Commission Report on “Human DNA Profiling – A draft Bill for the Use and Regulation of DNA-Based Technology”, and seeks to expand “the application of DNA-based forensic technologies to support and strengthen the justice delivery system of the country”.

Apart from identifying suspects and maintaining a registry of offenders, the Bill seeks to enable cross-matching between missing persons and unidentified dead bodies, and to establish the identity of victims in mass disasters.

Features of the Bill:

The Bill envisages the setting up of a DNA profiling board, which shall function as the regulatory authority, lay down guidelines, standards and procedures for the functioning of DNA laboratories, and grant them accreditation. The board will also assist the government in setting up new data banks and advise the government on “all issues relating to DNA laboratories”. In addition, it will make recommendations on legislation and practices relating to privacy issues around the storage of and access to DNA samples.

DNA data banks will also be established, consisting of a national data bank as well as the required number of regional data banks. Regional data banks must mandatorily share all their information with the national data bank. Every data bank shall maintain databases of five categories of data – crime scenes, suspects or undertrials, offenders, missing persons, and unknown deceased persons.

The 2017 draft has made significant changes to address concerns raised about the previous 2015 draft. These include removing the index of voluntarily submitted DNA profiles, deleting the provision allowing the DNA profiling board to create any other index as necessary, detailing serious offences for DNA collection, divesting the database manager of discretionary powers, and introducing redressal mechanisms by allowing any aggrieved person to approach the courts. Additionally, it has added legislative provisions authorising licensed laboratories, police stations and courts to collect and analyse DNA from certain categories of people, store it in data banks and use it to identify missing/ unidentified persons and as evidence during trial.

The new Bill has attempted to address previous concerns by limiting the purpose of DNA profiling, stating that it shall be undertaken exclusively for identification of a person and not to extract any other information. Safeguards have been put in place against misuse in the form of punishments for disclosure to unauthorised persons.

The Bill mandates the consent of an accused before the collection of bodily substances for offences other than those specified. However, any refusal, if considered to be without good cause, can be disregarded by a Magistrate if there is reasonable cause to believe that such substances can prove or disprove guilt. Any person present during the commission of a crime, questioned regarding a crime, or seeking a missing family member, may volunteer in writing to provide bodily substances. The collection of substances from minors and disabled persons requires the written consent of their parents or guardians. Collection from victims or relatives of missing persons requires the written consent of the victim or relative. Details of persons who are not offenders or suspects in a crime cannot be compared against the offenders’ or suspects’ index, and any communication of details can only be to authorised persons.

Areas of Concern:

Although the Bill claims that DNA testing is 99.9% foolproof, doubts have recently been raised about the possibility of a higher error rate than previously claimed. This highlights the need for the proposed legislation to provide safeguards in the event of error or abuse.

The issue of security of all the data concentrated in data banks is of paramount importance in light of its value to both government and private entities. The Bill fails to clearly spell out restrictions or to specify who has access to these data banks.

Previous iterations of the Bill have prompted civil society to express reservations about the circumstances under which DNA can be collected, issues of consent to collection, access to and retention of data, and whether such information can be exploited for purposes beyond those envisaged in the legislation. As in the case of Aadhaar, important questions arise regarding how such valuable genetic information will be safeguarded against theft or contamination, and to what extent this information can be accessed by different agencies. The present Bill has reduced the number of CODIS loci that can be processed from 17 to 13, thus restricting identification only to the necessary extent. However, this has not been explicitly stated in the provisions of the legislation itself, casting doubt over the manner in which it will be implemented.

Written consent is mandatory before obtaining a DNA sample; however, the withholding of consent can be overruled by a Magistrate if deemed necessary. An individual’s DNA profile can only be compared against the crime scene, missing persons, or unknown deceased persons indices. A court order is required to expunge the profile of an undertrial or a suspect; a suspect’s profile can also be removed after the filing of a police report. Any person who is not a suspect or a convicted offender can only have their profile removed on a written petition to the director of the data bank. The consent clause is also waived if a person has been accused of a crime punishable either by death or by more than seven years in prison. However, the Bill is silent on how such a person’s profile is to be removed upon acquittal.

Moreover, the Bill states that “the information contained in the crime scene index shall be retained”. The crime scene index captures a much wider data set as compared to the offenders’ index, since it includes all DNA evidence found around the crime scene, on the victim, or on any person who may be associated with the crime. The indefinite retention of most of these categories of data is unnecessary, as well as contrary to earlier provisions that provide for such data to be expunged. However, the government has claimed that such information will be removed “subject to judicial orders”. Importantly, the Bill does not contain a sunset provision that would ensure that records are automatically expunged after a prescribed period.

While the Bill provides strict penalties for deliberate tampering with or contamination of biological evidence, the actual mechanisms for quality control and analysis have been left out of the parent legislation and relegated to the rules.

Crucially, the Bill does not explicitly set out privacy and security protections – such as the implementation of safeguards, the use and dissemination of genetic information, and security and confidentiality requirements – within the legislation itself, leaving such considerations to the purview of regulation (and outside parliamentary oversight). The recently released Personal Data Protection Bill, 2018 does little to allay these concerns. Under that Bill, DNA banks would be classified as significant data fiduciaries, and thus subject to audits, data protection impact assessments, and the appointment of a dedicated data protection officer. However, although genetic information is classified as sensitive personal data, the Data Protection Bill does not provide sufficient safeguards against the processing of such data by the State. In light of the proposed data protection framework, and the Supreme Court’s confirmation that the right to privacy (including the right to bodily integrity) is a fundamental right, the DNA Profiling Bill in its present form cannot be implemented without violating the fundamental right to privacy.

The Personal Data Protection Bill, 2018

After months of speculation, the Committee of Experts on data protection (“Committee”), led by Justice B N Sri Krishna, has submitted its recommendations and a draft data protection bill to the Ministry of Electronics and Information Technology (“MEITY”) today. As we sit down for some not-so-light weekend reading to understand what our digital futures could look like if the committee’s recommendations are adopted, this series puts together a quick summary of the Personal Data Protection Bill, 2018 (“Bill”).

Scope and definitions

The Committee appears to have moved forward with the idea of a comprehensive, cross-sectoral data protection legislation that was advocated in its white paper published late last year. The Bill is meant to apply to (i) the processing of any personal data, which has been collected, disclosed, shared or otherwise processed in India; and (ii) the processing of personal data by the Indian government, any Indian company, citizen, or person / body of persons incorporated or created under Indian law. It also applies to any persons outside of India that engage in processing personal data of individuals in India. It does not apply to the processing of anonymised data.

The Bill continues to use a two-level approach in defining the type of information that the law applies to. However, the definitions of personal data and sensitive personal data have been expanded significantly when compared to the definitions in our current data protection law.

Personal data includes “data about or relating to a natural person who is directly or indirectly identifiable, having regard to any characteristic, trait, attribute or any other feature of the identity of such natural person, or any combination of such features, or any combination of such features with any other information”. The move towards relying on ‘identifiability’, when read together with definitions of terms such as ‘anonymisation’, which focuses on the irreversibility of anonymisation, is welcome, given that section 2 clearly states that the law will not apply in relation to anonymised data. However, the ability of data processors and the authority to identify whether an anonymisation process is irreversible in practice will need to be examined before the authority sets out the criteria for such ‘anonymisation’.

Sensitive personal data, on the other hand, continues to be defined in the form of a list of different categories, albeit a much more expansive list, which now includes information such as official identifiers, sex life, genetic data, transgender status, intersex status, caste or tribe, and religious and political affiliations or beliefs.

Interestingly, the Committee has moved away from the use of other traditional data protection language such as data subject and data controller – instead arguing that the relationship between an individual and a person / organisation processing their data is better characterised as a fiduciary relationship. Justice Sri Krishna emphasised this issue during the press conference organised at the time of submission of the report, noting that personal data is not to be considered property.

Collection and Processing

The Bill elaborates on the notice and consent mechanisms to be adopted by ‘data fiduciaries’, and accounts for both data that is directly collected from the data principal, and data that is obtained via a third party. Notice must be given at the time of collection of personal data, and where data is not collected directly, as soon as possible. Consent must be obtained before processing.

The Committee’s earlier white paper and the report accompanying the Bill both discuss the pitfalls of a data protection framework that relies so heavily on consent, noting that consent is often neither informed nor meaningful. The report, however, also notes that it may not be feasible to do away with consent altogether, and tries to address this issue by adopting higher standards for consent and purpose limitation. The Bill also provides that consent is to be only one of the grounds for processing personal data. However, this seems to result in some catch-all provisions allowing processing for ‘reasonable purposes’. While it appears that these reasonable purposes may need to be pre-determined by the data protection authority, the impact of this section will need to be examined in greater detail. The other such wide provision in this context seems to allow the State to process data – another provision that will need more examination.

Sensitive personal data

Higher standards have been proposed for the processing of sensitive personal data, as well as personal / sensitive personal data of children. The emphasis on the effect of processing of certain types of data, keeping in mind factors such as the harm caused to a ‘discernible class of persons’, or even the provision of counselling or child protection services in these sections is welcome. However, there remains a wide provision allowing for the State to process sensitive personal data (of adults), which could be cause for concern.

Rights of data principals

The Bill also proposes 4 sets of rights for data principals: the right to confirmation and access, the right to correction, the right to data portability, and the right to be forgotten. There appears to be no right to erasure of data, apart from a general obligation on the data fiduciary to delete data once the purpose for collection / processing of data has been met. The Bill proposes certain procedural requirements to be met by the data principal exercising these rights – an issue which some have already pointed out may be cause for concern.

Transparency and accountability

The Bill requires all data fiduciaries to adopt privacy by design, transparency and security measures.

Certain data fiduciaries are required to appoint a data protection officer, conduct data protection impact assessments before the adoption of certain types of processing, maintain records of data processing, and conduct regular data protection audits. These obligations apply to those notified as ‘significant data fiduciaries’, depending on criteria such as the volume and sensitivity of personal data processed, the risk of harm, the use of new technology, and the turnover of the data fiduciary.

The requirement for data protection impact assessments is interesting – an impact assessment must be conducted before a fiduciary undertakes any processing involving new technologies, large-scale profiling, or the use of sensitive personal data such as genetic or biometric data (or any other data processing which carries a risk of significant harm to data principals). If the data protection authority believes, based on the assessment, that such processing may cause harm, it may direct the fiduciary to cease the processing or impose conditions on it. The language here implies that these requirements could apply to processing by the State or private actors where new technology is used in relation to Aadhaar, among other things. However, as mentioned above, this will be subject to the data fiduciary in question being notified as a ‘significant data fiduciary’.

In a welcome move, the Bill also provides a process for notification in the case of a breach of personal data by data fiduciaries. However, this requirement is limited to notifying the data protection authority, which then decides whether there is a need to notify the data principal involved. It is unfortunate that the Committee has chosen to limit the rights of data principals in this regard, making them rely instead on the authority to even be notified of a breach that could potentially harm them.

Cross border transfer of data

In what has already become a controversial move, the Bill proposes that at least one copy of all personal data covered by the law should be stored on a server or data centre located in India. In addition, the central government (not the data protection authority) may notify additional categories of data that are ‘critical’ and must be stored only in India.

Barring exceptions in the case of health / emergency services, and transfers to specific international organisations, all transfer of personal data outside India will be subject to the approval of the data protection authority, and in most cases, consent of the data principal.

This approval may be in the form of approval of standard contractual clauses applicable to the transfer, or a blanket approval of transfers to a particular country / sector within a country.

This provision is ostensibly in the interest of the data principals, and works towards ensuring a minimum standard of data protection. The protection of the data principal under this provision, like many other provisions, including those relating to data breach notifications to the data principal, will be subject to the proper functioning of the data protection authority. In the past, we have seen that simple steps such as notification of security standards under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011, have not been undertaken for years.

In the next post in this series, we will discuss the functions of the authority, and other provisions in the Bill, including the exemptions granted, and penalties and remedies provided for.

India’s Artificial Intelligence Roadmap

There is now a near-universal perception that Artificial Intelligence technologies are set to disrupt every sphere of life. However, this is coupled with concern regarding the social, ethical (and even existential) challenges that AI might present. As a consequence, there has been an uptick in interest among governments in how best to marshal the development of these technologies. The United Kingdom, the United States, China, and France, among others, have all released vision documents that explore these themes.

This post, the first in a series, presents a brief overview of such initiatives by the Indian government. Subsequent posts will focus specifically on their treatment of personal data, as well as their consideration of ethical issues posed by AI.

~

Task Force on Artificial Intelligence

In August 2017, the Ministry of Commerce and Industry set up a ‘Task Force on Artificial Intelligence for India’s Economic Transformation’. A panel of 18 members was formed with the objective of exploring how Artificial Intelligence could be best deployed in India.

The Task Force released its Report in May 2018, where it characterized AI as a ‘socio-economic problem solver at a large scale’, rather than simply a booster for economic growth. It sought to explore domains which would benefit from government intervention, with the objective of improving quality of life, and generating employment. The report identifies 10 sectors where AI could be deployed – Manufacturing, FinTech, Healthcare, Agriculture and Food Processing, Retail, Accessibility Technology, Environment, National Security and Public Utility Services. It attempts to identify challenges specific to each sector, as well as enabling factors that could promote the adoption of AI.

The report also explores the predicted impact of AI on employment, as well as other broader social and ethical implications of the technology. It concludes with a set of recommendations for the government of India. A primary recommendation is to constitute an Inter-Ministerial National Artificial Intelligence Mission (N-AIM) with a 5 year budget of Rs. 1200 Crores. Other recommendations focus on creating an ecosystem for better availability of data for AI applications; skilling and education initiatives focused on AI; standard setting, as well as international participation in standard setting processes.

NITI Aayog’s National Strategy for Artificial Intelligence

In his Budget speech, the Finance Minister tasked the NITI Aayog with formulating a national programme for Artificial Intelligence. In June 2018, the NITI Aayog released its roadmap in the form of the National Strategy for Artificial Intelligence.

The paper frames India’s AI ambitions in terms of increasing economic growth, social development, and as an incubator for technology that can cater to other emerging economies. It focuses on 5 sectors as avenues for AI led intervention. These are healthcare, agriculture, education, smart cities, and smart mobility. It also identifies some key challenges to the effective adoption of AI. These include low awareness, research, and expertise in AI along with an absence of collaboration; the lack of ecosystems that enable access to usable data; high resource costs; and ill-adapted regulations.

The paper then presents a series of recommendations to address some of these issues. In order to expand AI research in India, it proposes a two-tier framework to focus on basic research as well as application based research. It also proposes the creation of a common computing platform in order to pool cloud infrastructure, and reduce infrastructural requirements for such institutions. It further suggests a review of the intellectual property framework to enable greater AI innovation. In order to foster international collaboration, the paper proposes the creation of a supranational CERN-like entity for AI. It also recommends skilling and education initiatives to address job creation, as well as the current lack of AI expertise. In order to accelerate adoption, it proposes a platform for sharing government datasets, along with a marketplace model for data collection and aggregation, for data annotation, as well as for deployable AI models.

The paper concludes with its recommendations for ‘responsible’ AI development. It recommends that there be a consortium of the Ethics Councils at each of the AI research institutions. It further proposes the creation of a Centre for Studies on Technology Sustainability. It also emphasizes the importance of fostering research on privacy preserving technology, along with general and sectoral privacy regulations.

Further reports suggest that a task force will be set up to execute the proposals that have been made, in coordination with the relevant ministries.

MeitY Committees

It has also been reported that four committees were constituted in February 2018 to deliberate on issues of ‘data for AI, applications of AI, skilling and cyber security/legal, ethical issues.’ However, there have been no reports on when the committees will present their recommendations, or whether these will be made available to the public.

~

India appears to be at the nascent stage of formulating its approach towards Artificial Intelligence. Even so, it is encouraging that the government recognizes the importance of its stewardship. Purely market led development of AI could imply all of its disruption, without any of the envisaged social benefits.

The General Data Protection Regulation and You

A cursory look at your email inbox this past month presents an intriguing trend. Multiple online services seem to have taken it upon themselves to notify changes to their Privacy Policies at the same time. The reason, simply, is that the European Union’s General Data Protection Regulation (GDPR) comes into force on May 25, 2018.

The GDPR marks a substantial overhaul of the existing data protection regime in the EU, as it replaces the earlier ‘Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data.’ The Regulation was adopted by the European Parliament in 2016, with a period of almost two years to allow entities sufficient time to comply with their increased obligations.

The GDPR is an attempt to harmonize and strengthen data protection across Member States of the European Union. CCG has previously written about the Regulation and what it entails here. For one, the instrument is a ‘Regulation’, as opposed to a ‘Directive’. A Regulation is directly binding across all Member States in its entirety. A Directive simply sets out a goal that all EU countries must achieve, but allows them discretion as to how. Member States must enact national measures to transpose a Directive, and this can sometimes lead to a lack of uniformity across Member States.

The GDPR introduces, among other things, additional rights and protections for data subjects. This includes, for instance, the introduction of the right to data portability, and the codification of the controversial right to be forgotten. Our writing on these concepts can be found here, and here. Another noteworthy change is the substantial sanctions that can be imposed for violations. Entities that fall foul of the Regulation may have to pay fines up to 20 million Euros, or 4% of global annual turnover, whichever is higher.

The Regulation also has consequences for entities and users outside the EU. First, the Regulation has expansive territorial scope, and applies to non-EU entities if they offer goods and services to the EU, or monitor the behavior of individuals in the EU. The EU is also a significant digital market, which allows it to nudge other jurisdictions towards the standards it adopts. The Regulation (like the earlier Directive) restricts the transfer of personal data to entities outside the EU to cases where an adequate level of data protection can be ensured. This has resulted in many countries adopting regulations aligned with EU standards. In addition, with the implementation of the GDPR, companies that operate in multiple jurisdictions might prefer to maintain parity between their data protection policies. For instance, Microsoft has announced that it will extend core GDPR protections to its users worldwide. As a consequence, many of the protections offered by the GDPR may in effect become available to users in other jurisdictions as well.

The implementation of the GDPR is also of particular significance to India, which is currently in the process of formulating its own data protection framework. The Regulation represents a recent attempt by a jurisdiction (that typically places a high premium on privacy) to address the harms caused by practices surrounding personal data. The lead-up to its adoption and implementation has generated much discourse on data protection and privacy. This can offer useful lessons as we debate the scope and ambit of our own data protection regulation.

Dealing With Revenge Porn in India

In March 2018, a man in West Bengal was sentenced to five years’ imprisonment and fined Rs 9,000 for uploading private pictures and videos of a girl without her consent as revenge for ending their relationship. Under the promise of marriage, the accused pressured the complainant into providing explicit images of herself, and then threatened to upload these images to social media in order to extract more. Later, he accessed her phone without her knowledge to retrieve more private pictures and videos. When the complainant refused to continue their relationship, he uploaded this material to a popular pornographic website along with her name and her father’s name. In addition to the defendant’s imprisonment and fine, the state government was directed to treat the victim as a survivor of rape and grant appropriate compensation. With evidence provided by service providers Reliance Jio and Google, the perpetrator was convicted under Sections 354A, 354C, 354 and 509 of the IPC as well as Sections 66E, 66C, 67 and 67A of the IT Act, in what is likely the first revenge porn conviction in India.

Revenge porn is a form of non-consensual pornography that came to international attention with the 2010 launch (and subsequent 2012 takedown) of the popular website IsAnyoneUp, which allowed users to upload nude photographs. While a number of these images were ostensibly self-submitted, many were revealed to have been submitted by angry ex-lovers, which would amount to ‘revenge porn’. Compounding the issue was the fact that these explicit images were deliberately linked to the social media profiles of the persons depicted.

According to Halder and Jaishankar, the essential elements of revenge porn are that the perpetrator and the victim shared an intimate relationship, and that the former has deliberately (and without the victim’s consent) released sexually explicit information online in order to cause distress and harm to the victim’s reputation.

While revenge porn is often used interchangeably with the term “non-consensual pornography”, it is distinct from other forms of non-consensual pornography such as rape videos, morphing or voyeurism. For instance, non-consensual pornography includes within its ambit sexually explicit images captured without a person’s knowledge or consent. Revenge porn, by contrast, typically involves sensitive material that was voluntarily captured by, or sent to, the perpetrator in good faith in the course of an intimate relationship. Further, unlike in the case of revenge porn, not all perpetrators of non-consensual pornography are motivated by personal feelings such as revenge (as in the case of the hackers who released intimate photos of more than 100 female celebrities after gaining access to their private iCloud accounts).

As a result, researchers are moving away from the term “revenge porn” as it can be somewhat misleading. “Revenge” limits the scope of this offence to motivations of personal vengeance, whereas such an act could be motivated by a desire for profit, notoriety, entertainment, or no reason at all. “Porn” implies that all images of nudity are intrinsically pornographic. Sexually explicit images created and shared within a private relationship should not be considered pornographic, unless they are distributed without consent, as this results in a private image being converted into public sexual entertainment. Accordingly, many victim advocates prefer to use the term “non-consensual pornography” or non-consensual sharing of intimate images.

Although the National Crime Records Bureau documents cyber-crimes against women, there are no official statistics available that pertain specifically to revenge porn in India. A 2010 report suggests that “only 35 per cent of the women have reported about their victimization, 46.7 per cent have not reported and 18.3 per cent have been unaware of the fact that they have been victimized … women prefer not to report about their victimization owing to social issues.” Victim-shaming (both by the criminal justice system and the public at large) is common, and the potential social fallout often extends to the victim’s family as well.

The recent proliferation of revenge porn has prompted many countries to enact legislation criminalising it, including the UK, several states in the USA, Canada, Australia, Japan and the Philippines.

At present however, there are no legal provisions that directly address revenge porn in India. While certain sections in the Indian Penal Code (IPC) and Information Technology (IT) Act can be invoked by victims, they fail to fully encompass the complexity of such cases and do not specifically target non-consensual pornography published online.

Section 354C of the IPC makes voyeurism punishable, and Explanation 2 to the Section deals with the non-consensual dissemination of consensually-captured images. However, this section limits its scope to female victims and male offenders.

In cases of non-consensual pornography (particularly those that involve morphing), victims can also seek recourse under Section 499 of the IPC for criminal defamation.

Section 66E of the IT Act punishes the transmission of images depicting the private areas of a person. The Explanation to the section limits ‘private area’ to “… the naked or undergarment clad genitals, pubic area, buttocks or female breast”. This provision is gender-neutral and captures much of the conduct involved in revenge porn without addressing it by name. However, the narrow definition of ‘private area’ could limit the provision’s applicability in cases where the victim is captured in an intimate position without those particular areas being visible.

Section 67A of the IT Act punishes publication or transmission of “material containing sexually explicit acts, etc. in electronic form”. While this can effectively punish perpetrators, it also risks bringing within its ambit victims who may have voluntarily captured and shared such private content with their partners.

The recent Supreme Court judgment recognising privacy as a fundamental right could have substantial implications on revenge porn and non-consensual pornography in general, in light of arguments recognising the right to bodily integrity. Copyright law could also potentially be used by victims, particularly when the content is a selfie. By claiming a violation of their copyright, a victim could potentially get such material taken down. While Indian copyright law does not presently provide any relief to victims of revenge porn, some victims in the US have successfully enforced their copyright to get such images taken down.

Social media platforms are often used to disseminate such content. Until recently, their role was limited to removing non-consensual pornography and other offensive images. However, there have been calls for them to play a more active role and filter this content before it is uploaded. Facebook has attempted to prevent the re-posting of revenge porn by cataloguing content which had been previously reported as revenge porn on its site.
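The cataloguing approach described above can be illustrated with a minimal sketch: the platform stores fingerprints of images that have been reported, and checks new uploads against that catalogue before publishing. This is a simplified illustration using exact cryptographic hashes; production systems reportedly use perceptual hashing so that resized or re-encoded copies also match, and the class and method names here are hypothetical.

```python
import hashlib


class ReportedImageCatalogue:
    """Toy catalogue of previously reported images.

    Uses exact SHA-256 fingerprints, so only byte-identical re-uploads
    are caught; real systems use perceptual hashes to match near-duplicates.
    """

    def __init__(self) -> None:
        self._fingerprints: set[str] = set()

    def report(self, image_bytes: bytes) -> None:
        # Record the fingerprint of an image reported as non-consensual.
        self._fingerprints.add(hashlib.sha256(image_bytes).hexdigest())

    def is_blocked(self, image_bytes: bytes) -> bool:
        # Check an upload against the catalogue before publishing it.
        return hashlib.sha256(image_bytes).hexdigest() in self._fingerprints


catalogue = ReportedImageCatalogue()
catalogue.report(b"...bytes of a reported image...")
print(catalogue.is_blocked(b"...bytes of a reported image..."))  # True
print(catalogue.is_blocked(b"...bytes of some other image..."))  # False
```

A design like this only prevents re-uploads of content that has already been reported once; the calls for pre-upload filtering mentioned above would require detecting such content before any report exists, which is a much harder problem.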

The gender disparity in victims of non-consensual pornography is a reflection of the hostility still faced by women on the internet today. Involuntary porn today can be considered “the sexual harassment of 20 years ago. It’s an easy way to punish women for behaving in ways that they don’t approve of – for leaving them, for rejecting them, for being happy without them.”

SC Constitution Bench on Aadhaar – Final Hearing (Day XXXVIII)

In October 2015, a 3-judge bench of the Supreme Court of India referred challenges to the Aadhaar program to a constitution bench. One of the primary questions referred was the existence of a fundamental right to privacy, which has since been upheld. Other similar petitions challenging the legitimacy of Aadhaar had been tagged with this petition. While the fundamental right to privacy has been upheld, the challenges to the Aadhaar programme itself, and to the linking of services to it, were yet to be adjudicated upon.

An interim order was passed in December 2017; a summary of the arguments can be found here and here.

The final hearing commenced on January 17, 2018 and concluded on May 10, 2018. Summaries of the arguments advanced in the previous hearings can be found here.

Senior counsel Gopal Subramaniam continued with his rejoinder.

He started off by discussing the concept of dignity, stating that it was not something to be promoted, since it was already assured under the Preamble. He stated that dignity is inbuilt and does not depend on the largesse of the state.

Referring to the Aadhaar notifications, he stated that if the purpose of these notifications was to benefit individuals, the state would have to create conditions to ‘flower the dignity’ of people.

Justice Sikri commented on the duty of the state to provide benefits, which would also be a part of dignity. He stated that this would not just be applicable in cases of deprivation under Article 21 and that it would be an affirmative action.

Mr. Subramaniam agreed, stating that it would be an affirmative action. Further, he stated that the Act would have to be scrutinized to decipher whether it was an enabler or whether it was passed under the guise of enablement.

He then stated that the notifications begin with a preamble, which refers to the guarantee of seamless delivery of services.

He then discussed alternate forms of identification, like ration cards, stating that existing forms of identification were not failing, and Aadhaar therefore did not have a purpose.

He referred to examples of women in Jharkhand who could not get services despite having ration cards, due to failed Aadhaar authentication.

He also discussed the Essential Commodities Act and the central government’s obligations.

He also discussed the applicability of the test of the ‘true purpose of the law’.

Further, he discussed the lack of machineries that were set up under the Aadhaar programme, stating that Aadhaar did not serve any special purpose since existing machineries were used to deliver services.

Mr. Subramaniam then went on to discuss the asymmetry of power under Section 7 of the Aadhaar Act.

He also stated that the Act was not enacted for a proper purpose. Further, he stated that the first step of legitimate aim was ‘proper purpose’. He also stated that it could only be justified if the right was preserved and that dignity and autonomy were not preserved under Section 7 of the Act.

Further on the issue of ‘proper purpose’, referring to the idea of ‘Socratic contestation’, he stated that a claim to a proper purpose would not qualify as a proper purpose.

Mr. Subramaniam then discussed the three letters of authentication. He stated that authentication was at the heart of the Act and that failure of authentication was a ground for denial. In relation to requesting entities, he discussed their lack of accountability under the law.

Further, he discussed the GDPR and the change in protocol. He also discussed the concerns about privacy of communication and not the privacy of individuals.

Further, he stated that there was no other jurisdiction where the state could take all of its citizens’ data.

He also stated that a declaration of human rights was necessary for this Act. Further, he stated that the Act reduced people to numbers, and discussed the perils of using probabilistic algorithms.

Referring to Section 7 of the Act, he discussed ‘grants, subsidies, benefits’ as expressions of condescension.

Mr. Subramaniam further discussed the ‘power’ under the Act, stating that the power enables the collection of information.

He discussed the test in constitutional law, which was to question whether the state should logically be the holder of such information.

Further, he stated that if knowledge was power, giving information to the state would signal a ceding of power.

Justice Chandrachud commented on the nature of subsidies, to which Mr. Subramaniam stated that subsidy was provided at different levels of government.

Mr. Subramaniam then discussed Section 7 and stated that under it, strict rights were being bracketed. He further stated that it was not merely a segregation and that entitlements were being treated like grants.

On this, Justice Chandrachud stated that wage payment was a benefit, to which Mr. Subramaniam responded that wage payment would be a vested right.

He further discussed the Court’s guidelines for rehabilitation, laid down in 1982 in relation to bonded labourers, before deciding whether to rehabilitate or free them.

In this regard, he also discussed the incarceration of mentally ill citizens and the writs of mandamus issued to the Union.

Moving on, Mr. Subramaniam discussed census data and its use at federal and state levels. He stated that states had policies in regard to requesting data from the central planning commission.

He further stated that census data was a way of social mobilization, and that there was pre-existing data owing to the census.

He then discussed the concepts of horizontal protection and vertical protection, stating that the former was more important in the given instance.

Further, he discussed bodily integrity and autonomy as important considerations.

He also stated that, ultimately, the fundamental freedoms in India must never be compared with the Fourth Amendment to the United States Constitution. He also stated that the Indian Constitution was a living document.

On the issue of Section 7, Justice Chandrachud stated that it is an enabling provision and not a mandate. He stated that it enabled the government to impose a mandate, the difference arising from the use of ‘may’ versus ‘shall’.

Further, he stated that these rights could not be ‘whittled down’. He also stated that there was no common denominator and that rights could not be subsidies. Further, on the issue of Part III, he stated that the rights conduced to dignity.

Mr. Subramaniam then discussed identities and the dissolution of some kinds of identities.

He stated that if an act like manual scavenging was antithetical to the soul then he would want it destigmatized with the march of time. He further discussed how certain actions were akin to unmaking the dignity of people.

On the issue of fake profiles, he stated that it was not a matter of sticking up for fake profiles, but rather a matter of sticking up for better administration.

On the allegedly voluntary nature of Aadhaar, he questioned how people could be asked to contract when they did not even have the capacity to contract.

Further, on the ‘legitimate aim’ of Aadhaar, he stated that collecting massive amounts of information would not satisfy this aim. He stated that the means used had to be adept and valid.

He also discussed the issue of two competing rights, which had to be balanced. In this regard, he discussed the concepts of freedom, autonomy, self-preservation and self-actualization. He also stated that the act of balancing had a direct correlation with seminal values and objectively protected values.

Mr. Subramaniam then stated that no contemporary studies on Aadhaar had taken place, the last one having been conducted a decade ago.

He then went on to read excerpts on an individual’s inalienable rights, stating that an individual should not be required to give up their rights.

Further, he stated that the procedure established by law had to be just, fair and reasonable.

On the Aadhaar project, he stated that there wasn’t merely a possibility of abuse, but that the Act postulated compelled behaviour.

He stated that the primary focus was that the judiciary had an obligation to protect fundamental rights.

Referring to the Constitution, he stated that it was a living document and should be seen as transformative. Further, he discussed parliamentary supremacy and the capacity to refuse. He stated that autonomy and integrity were intertwined in the capacity to refuse and if the capacity was obliterated, then the autonomy would also follow suit.

Mr. Subramaniam further discussed relief, stating that the petitioners would want the data stored to be taken down. He also stated that the Bench should exercise its powers under Article 32 and also rely on the case of Nilabati Behera.

Lastly, he stated that the propensity of information was an important consideration as well.

Next, senior counsel Anand Grover commenced with his rejoinder. He was brief, stating that none of the contentions of breaches of security had been dealt with and that privacy should not lose its character.

Next, senior counsel Arvind Datar commenced with his rejoinder.

He started off by stating that ‘pith and substance’ had no application to the legitimacy of an article and would not be applicable to a money bill.

Further, he discussed the difference between a money bill and a financial bill, stating that consolidated fund matters would be covered by financial bills.

He also discussed Article 117(1) in this context.

He stated that the Aadhaar Act could not have been passed as a financial bill.

Further, he discussed the doctrine of severability and whether certain portions of the Aadhaar Act could be removed.

He stated that the doctrine of severability could only apply if a statute was valid and certain portions of it were invalid. He stated that if the rest of the statute ‘made sense’ and was valid, it could be retained. However, in this instance, the statute itself was invalid, and relying on the Kihoto Hollohan case, he stated that a statute that was fatal at its inception could not be saved.

Mr. Datar also discussed the Mangalore Ganesh Beedi works case and subsequently Article 110(b) of the constitution.

Further, he discussed the issue of linking bank accounts to Aadhaar.

He stated that millions of bank accounts had already been linked to Aadhaar, that permanent linking did not seem to serve a purpose, and that accounts should be delinked once determination was over.

Next, senior counsel P. Chidambaram commenced his rejoinder. He discussed the issue of the Aadhaar Act being passed as a money bill.

He started off by discussing the interpretation of ‘only’ under Article 110(1), and went on to discuss how clause (g) must be read narrowly.

Lastly, he stated that a non-money bill being passed as a money bill would effectively limit the power of the Parliament, by disallowing review, which should not be condoned by the Court. He also stated that the doctrine of severability would not hold credence if the legislation was unconstitutional to begin with. Further, he discussed how the doctrine of pith and substance would not be applicable to bills passed under Article 110.

Next, senior counsel K.V. Viswanathan commenced his rejoinder. He discussed the theories of proportionality and balancing of rights. He stated that the balancing of rights proposition by the respondents was incorrect, and that fundamental rights would not survive. Further he discussed exception handling and the problem with making vested rights conditional on Section 7 of the Act. He also stated that citizens should not have to face the burden brought about by systems for ‘targeted and efficient delivery’.

Lastly, senior counsel P.V. Surendranath discussed the problem with excessive delegation.

The hearing concluded on the 10th of May and the matter is now reserved for judgment.