India is in the midst of establishing a robust data governance framework, which will impact the rights and liabilities of all key stakeholders – the government, private entities, and citizens at large. As a parliamentary committee debates its first personal data protection legislation (‘PDPB 2019’), proposals for the regulation of non-personal data and a data empowerment and protection architecture are already underway.
As data processing capabilities continue to evolve at a feverish pace, basic data protection regulations like the PDPB 2019 might not be sufficient to address new challenges. For example, big data analytics renders traditional notions of consent meaningless, as users have no knowledge of how such algorithms behave or what determinations such technology makes about them.
Creative data governance models, aimed at reversing the power dynamics of the larger data economy, are the need of the hour. Recognising these challenges, policymakers are driving the conversation on data governance in the right direction. However, they might be missing out on crucial experiments being run in other parts of the world.
As users of digital products and services increasingly lose control over data flows, various new models of data governance are being recommended, for example, data trusts, data cooperatives, and data commons. Of these, one of the most promising is the data trust.
(For the purposes of this blog post, I’ll be using the phrase data processors as an umbrella term to cover data fiduciaries/controllers and data processors in the legal sense. The word users is meant to include all data principals/subjects.)
What are data trusts?
Though there are various definitions of data trusts, one which is helpful in understanding the concept is: 'data trusts are intermediaries that aggregate user interests and represent them more effectively vis-à-vis data processors.'
To solve the information asymmetries and power imbalances between users and data processors, data trusts will act as facilitators of data flow between the two parties, but on the terms of the users. Data trusts will owe a fiduciary duty to their members and act in their best interests. They will have the requisite legal and technical knowledge to act on behalf of users. Instead of users making potentially ill-informed decisions over data processing, data trusts will make such decisions on their behalf, based on pre-decided factors like a bar on third-party sharing, and in their best interests. For example, data trusts can be to users what mutual fund managers are to potential investors in capital markets.
Currently, in a typical transaction in the data economy, if users wish to use a particular digital service, they have neither the knowledge to understand the possible privacy risks nor the negotiating power to demand change. Data trusts, with a fiduciary responsibility towards users, specialised knowledge, and multiple members, might be successful in tilting the power dynamics back in favour of users. Data trusts might be relevant from the perspective of both the protection and the controlled sharing of personal as well as non-personal data.
(MeitY's Non-Personal Data Governance Framework introduces the concept of data trustees and data trusts in India's larger data governance and regulatory framework. But this applies only to the governance of 'non-personal data' and not personal data, as is being recommended here. CCG's comments on MeitY's Non-Personal Data Governance Framework can be accessed here.)
Challenges with data trusts
Though creative solutions like data trusts seem promising in theory, they must be thoroughly tested and experimented with before wide-scale implementation. Firstly, such a new form of trust, where the subject matter of the trust is data, is not envisaged by Indian law (see section 8 of the Indian Trusts Act, 1882, which provides for only property to be the subject matter of a trust). Current and even proposed regulatory structures don't account for the regulation of institutions like data trusts (the non-personal data governance framework proposes data trusts, but only as data-sharing institutions and not as data managers or data stewards, as is being suggested here). Thus, data trusts will need to be codified into Indian law to be an operative model.
Secondly, data processors might not embrace the notion of data trusts, as it may result in a loss of market power. Larger tech companies, which have existing stores of data on numerous users, may not be sufficiently incentivised to engage with models of data trusts. Structures will need to be built in a way that data processors are incentivised to participate in such novel data governance models.
Thirdly, the business or operational models for data trusts will need to be aligned with their members, i.e. users. Data trusts will require money to operate, and for-profit entities may not have the best interests of users in mind. Subscription-based models, whether for profit or not, might fail as users are habituated to free services. Donation-based models might need to be monitored closely for added transparency and accountability.
Lastly, other issues, like the creation of technical specifications for data sharing and security, the contours of consent, and whether data trusts will help in data sharing with the government, will need to be accounted for.
Privacy centric data governance models
At this early stage of developing data governance frameworks suited to Indian needs, policymakers are at a crucial juncture of experimenting with different models. These models must be centred around the protection and preservation of the privacy rights of Indians, against both private and public entities. Privacy must also be read in the expansive definition provided by the Supreme Court in Justice K.S. Puttaswamy vs. Union of India. Autonomy, choice, and control over informational privacy are crucial to the Supreme Court's interpretation of privacy.
(CCG’s privacy law database that tracks privacy jurisprudence globally and currently contains information from India and Europe, can be accessed – here)
In Greek mythology, Argus Panoptes was a many-eyed, all-seeing, and always-awake giant whose image has been used to depict excessive scrutiny and surveillance. Jeremy Bentham drew on this reference when he designed the panopticon prison, where prisoners would be monitored without their knowledge. Later, Michel Foucault used the panopticon to elaborate the social theory of panopticism, where the watcher ceases to be external to the watched, resulting in internal surveillance or a 'chilling' effect. This idea of panopticism has gained renewed relevance in the age of digital surveillance.
Amongst the many cutting-edge surveillance technologies being adopted globally, 'Facial Recognition Technology' (FRT) is one of the most rapidly deployed. 'Live Facial Recognition Technology' (LFRT) or 'Real-time Facial Recognition Technology', an augmentation of FRT, has become increasingly effective in the past few years. Improvements in computational power and algorithms have enabled cameras placed at odd angles to detect faces even in motion. This post attempts to explore the issues with the increasing State use of FRT around the world and the legal framework surrounding it.
What do FRT and LFRT mean?
FRT refers to the use of algorithms to uniquely detect, recognise, or verify a person from recorded images, sketches, or videos containing their face. The data about a particular face is generally known as a face template. This template is a mathematical representation of a person's face, created by algorithms that mark and map distinct features on the captured image, like eye locations or the length of a nose. These face templates make up the biometric database against which new images, sketches, videos, etc. are compared to verify or recognise the identity of a person. As opposed to the application of FRT, which is conducted on pre-recorded images and videos, LFRT involves real-time automated facial recognition of all individuals in the camera field's vision. It involves biometric processing of images of all passers-by, using an existing database of images as a reference.
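To make the template-matching step concrete, below is a minimal illustrative sketch in Python (using NumPy). The embedding algorithm that converts a face image into a numeric template is assumed and not shown; the vectors, names, and threshold are all hypothetical.

```python
import numpy as np

# Hypothetical face templates: fixed-length vectors that some (assumed)
# embedding algorithm derives from facial features like eye locations.
enrolled = {
    "person_a": np.array([0.12, 0.87, 0.45, 0.33]),
    "person_b": np.array([0.91, 0.10, 0.05, 0.78]),
}

def cosine_similarity(a, b):
    """Similarity between two templates; closer to 1.0 means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, threshold=0.99):
    """Return the best-matching enrolled identity, or None if no template
    clears the similarity threshold."""
    best_id = max(enrolled, key=lambda k: cosine_similarity(probe, enrolled[k]))
    best_score = cosine_similarity(probe, enrolled[best_id])
    return best_id if best_score >= threshold else None

# A probe template computed from a newly captured image (hypothetical).
probe = np.array([0.13, 0.85, 0.44, 0.35])
print(identify(probe))  # "person_a" if the similarity clears the threshold
```

In LFRT, essentially the same comparison would run continuously against every face detected in a live video feed, which is what makes the technique suited to surveillance at scale.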
The accuracy of FRT algorithms is significantly impacted by factors like distance and angle from which the image was captured or poor lighting conditions. These problems are worsened in LFRT as the images are not captured in a controlled setting, with the subjects in motion, rarely looking at the camera, and often positioned at odd angles from it.
While FRT has become ubiquitous, LFRT is still in the process of being adopted in countries like the UK, USA, India, and Singapore. The COVID-19 pandemic has further accelerated the adoption of FRT as a way to track the virus’ spread and to build on contactless biometric-based identification systems. For example, in Moscow, city officials were using a system of tens of thousands of cameras equipped with FRT, to check for social distancing measures, usage of face masks, and adherence to quarantine rules to contain the spread of COVID-19.
FRT is also being steadily deployed for mass surveillance activities, which is often in violation of universally accepted principles of human rights such as necessity and proportionality. These worries have come to the forefront recently with the State use of FRT to identify people participating in protests. For example, FRT was used by law enforcement agencies to identify prospective law breakers during protests in Hong Kong, protests concerning the Citizenship Amendment Act, 2019 in New Delhi and the Black Lives Matter protests across the USA.
Vociferous demands have been made by civil society and digital rights groups for a global moratorium on the pervasive use of FRT that enables mass surveillance, and cities such as Boston and Portland have banned its deployment. However, it remains to be seen how effective these measures are in halting the use of FRT. Even the temporary refusal by Big Tech companies to sell FRT to police forces in the US does not seem to have much instrumental value, as other private companies continue to supply the technology unhindered.
Regulation of FRT
The approach to the regulation of FRT differs vastly across the globe. The regulatory spectrum ranges from the permissive use of mass surveillance on citizens in countries like China and Russia to bans on the use of FRT, for example in Belgium and Boston (in the USA). However, in many countries around the world, including India, the use of FRT continues unabated, worryingly in a regulatory vacuum.
Recently, an appellate court in the UK declared the use of LFRT for law enforcement purposes unlawful, on grounds of violation of the rights to data privacy and equality. Despite the presence of a legal framework in the UK for data protection and the use of surveillance cameras, the Court of Appeal held that there was no clear guidance on the use of the technology and that it gave excessive discretion to police officers.
The EU has been contemplating a moratorium on the use of FRT in public places. Civil society in the EU is demanding a comprehensive and indefinite ban on the use of FRT and related technology for mass surveillance activities.
In the USA, several orders banning or heavily regulating the use of FRT have been passed. A federal law banning the use of facial recognition and biometric technology by law enforcement has been proposed. The bill seeks to place a moratorium on the use of facial recognition until Congress passes a law to lift the temporary ban. It would apply to federal agencies such as the FBI, as well as local and State police departments.
The Indian Scenario
In July 2019, the Government of India announced its intention of setting up a nationwide facial recognition system. The National Crime Records Bureau (NCRB) – a government agency operating under the Ministry of Home Affairs – released a request for proposal (RFP) on July 4, 2019 to procure a National Automated Facial Recognition System (AFRS). The deadline for submission of tenders to the RFP has been extended 11 times since July 2019. The stated aim of the AFRS is to help modernise the police force and to assist in information gathering, criminal identification, and verification, and their dissemination among various police organisations and units across the country.
Security forces across the states and union territories will have access to the centralised database of the AFRS, which will assist in the investigation of crimes. However, civil society organisations have raised concerns regarding privacy and increased surveillance by the State, as the AFRS has no legal basis (statutory or executive) and lacks procedural safeguards and accountability measures, like an oversight regulatory authority. They have also questioned the accuracy of FRT in identifying darker-skinned women and ethnic minorities and expressed fears of discrimination.
This is in addition to the FRT already in use by law enforcement agencies in Chennai, Hyderabad, Delhi, and Punjab. There are several instances of deployment of FRT in India by the government in the absence of a specific law regulating FRT or a general data protection law.
Even the proposed Personal Data Protection Bill, 2019 is unlikely to assuage privacy challenges arising from the use of FRT by the Indian State. The primary reason for this is the broad exemptions provided to intelligence and law enforcement agencies under Clause 35 of the Bill on grounds of sovereignty and integrity, security of the State, public order, etc.
After the judgement in K.S. Puttaswamy vs. Union of India (Puttaswamy I), which reaffirmed the fundamental right to privacy in India, any act of State surveillance that breaches the right to privacy will need to adhere to the three-part test laid down in Puttaswamy I.
The three prongs of the test are: legality, which postulates the existence of a law along with procedural safeguards; necessity, defined in terms of a legitimate State aim; and proportionality, which ensures a rational nexus between the objects and the means adopted to achieve them. This test was also applied to the use of biometric technology in the Aadhaar case (Puttaswamy II).
It may be argued that State use of FRT serves the legitimate aim of ensuring national security, but currently its use is neither sanctioned by law, nor does it pass the test of proportionality. For the proportionate use of FRT, the State will need to establish that there is a rational nexus between its use and the purpose sought to be achieved, and that the use of such technology is the least privacy-restrictive measure to achieve the intended goals. As the law stands in India today after Puttaswamy I and II, any use of FRT or LFRT is prima facie unconstitutional.
While mass surveillance is legally impermissible in India, targeted surveillance is allowed under Section 5 of the Indian Telegraph Act, 1885, read with Rule 419A of the Indian Telegraph Rules, 1951, and Section 69 of the Information Technology Act, 2000 (IT Act). Even so, the constitutionality of Section 69 of the IT Act has been challenged and is currently pending before the Supreme Court.
Puttaswamy I clarified that the protection of privacy is not completely lost or surrendered in a public place, as it attaches to the person. Hence, the constitutionality of India's surveillance apparatus needs to be assessed against the standards laid down in Puttaswamy I. To check unregulated mass surveillance through the deployment of FRT by the State, there is a need to restructure the overall surveillance regime in the country. Even the Justice Srikrishna Committee report in 2018 highlighted that several executive-sanctioned intelligence-gathering activities of law enforcement agencies would be illegal after Puttaswamy I, as they do not operate under any law.
The need for reform of surveillance laws, in addition to a data protection law in India to safeguard fundamental rights and civil liberties, cannot be stressed enough. The surveillance law reform will have to focus on the use of new technologies like FRT and regulate its deployment with substantive and procedural safeguards to prevent abuse of human rights and civil liberties and provide for relief.
Well-documented limitations of FRT and LFRT in terms of low accuracy rates, along with concerns of profiling and discrimination, make it essential for surveillance law reform to include additional safeguards, such as mandatory accuracy and non-discrimination audits. For example, the 2019 Face Recognition Vendor Test (part three) of the National Institute of Standards and Technology (NIST), US Department of Commerce, evaluates whether an algorithm performs differently across different demographics in a dataset. The need of the hour is to cease the use of FRT and put a temporary moratorium on any future deployments till surveillance law reforms with adequate proportionality safeguards have been implemented.
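As a rough illustration of what a mandatory non-discrimination audit could compute (a sketch of the general idea, not the NIST FRVT methodology), the Python snippet below tallies recognition accuracy separately per demographic group on a labelled evaluation set. The data and group labels are hypothetical.

```python
import pandas as pd

# Hypothetical audit data: one row per identification attempt, recording
# the subject's demographic group and whether the match was correct.
results = pd.DataFrame({
    "group":   ["A", "A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "correct": [True, True, True, False, True, False, False, True, True, False],
})

# Accuracy per demographic group: systematic gaps between groups are the
# disparities a non-discrimination audit is meant to surface.
per_group = results.groupby("group")["correct"].mean()
print(per_group)  # A: 0.75, B: ~0.33, C: ~0.67

# A simple disparity metric: best-performing vs worst-performing group.
print("accuracy gap:", per_group.max() - per_group.min())
```

An audit regime would run such comparisons on far larger evaluation sets and make deployment conditional on the disparity staying within a defined bound.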
The Centre's Non-Personal Data Report proposes a policy framework to regulate the use of anonymised data used by Big Tech companies. The question now is: how well do its recommendations measure up to the challenges of regulating non-personal data, amidst a regulatory lacuna for the same? Shashank Mohan of the Centre for Communication Governance explores how concepts of collective privacy and data trusts lie at the forefront of India's future frameworks for digital governance.
By Shashank Mohan
This post first appeared on The Bastion on September 13, 2020
In the past few years, it has become common knowledge that Big Tech companies like Facebook, Google, and Amazon rely on the exploitation of user data to offer seemingly free services. These companies typically use business models that rely on third party advertising to profit off this data. In exchange for their services, we hand over our data without much control or choice in the transaction.
But, why do we need to regulate non-personal data?
While progress on the regulation of personal data is necessary and laudable, in the era of Big Data and machine learning, tech companies no longer need to solely rely on processing our personally identifiable data (personal data) to profile or track users. With newer developments in data analytics, they can find patterns and target us using seemingly innocuous data that may be aggregated or anonymised, but doesn’t need to be identifiable.
For example, they only need to know that I am a brown male in the age range of 25-35, from New Delhi, looking for shoes, and not necessarily my name or my phone number. All of this is “non-personal” data as it’s not linked to my personal identity.
Clearly, tech companies extract value from their service offerings using advanced data analytics and machine learning algorithms which rummage through both personal and non-personal data. This shift to harnessing non-identifiable/anonymised/aggregated data creates a lacuna in the governance of data, as traditionally, data protection laws like the GDPR have focused on identifiable data and giving an individual control over their personal data.
So, among other economic proposals, the NPD Report proposes a policy framework to regulate such anonymised data, to fill this lacuna. The question now is: how well do its recommendations measure up to the challenges of regulating non-personal data?
How Does The Government Define Non-Personal Data?
The NPD Report proposes the regulation of non-personal data, which it defines as data that is never related to an identifiable person, such as data on weather conditions, or personal (identifiable) data which has been rendered anonymous by applying certain technological techniques (such as data anonymisation). The report also recommends the mandatory cross-sharing of this non-personal data between companies, communities of individuals, and the government. The purpose for which this data may be mandated to be shared falls under three broad buckets: national security, community benefit, and promoting market competition.
However, if such data is not related to an identifiable individual, then how can it be protected under personal data privacy laws?
To address these challenges in part, the report introduces two key concepts: collective privacy and data trusts.
The NPD Report defines collective privacy as a right emanating from a community or group of people that are bound by common interests and purposes. It recommends that communities or a group of people exercise control over their non-personal data–which is distinct from an individual exercising control of their personal data–and do so via an appropriate nominee called a data trustee, who would exercise their privacy rights on behalf of the entire community. These two interconnected concepts of collective privacy and data trusteeship merit deeper exploration, due to their significant impact on how we view privacy rights in the digital age.
What is Collective Privacy and How Shall We Protect It?
The concept of collective privacy shifts the focus from an individual controlling their privacy rights, to a group or a community having data rights as a whole. In the age of Big Data analytics, the NPD Report does well to discuss the risks of collective privacy harms to groups of people or communities. It is essential to look beyond traditional notions of privacy centered around an individual, as Big Data analytical tools rarely focus on individuals, but on drawing insights at the group level, or on “the crowd” of technology users.
In a revealing example from 2013, data processors who accessed New York City's taxi trip data (including trip dates and times) were able to infer with a degree of accuracy whether a taxi driver was a devout Muslim, even though data on the taxi licenses and medallion numbers had been anonymised. Data processors linked pauses in taxi trips with regularly timed prayers to arrive at their conclusion. Such findings and classifications may result in heightened surveillance of or discrimination against such groups or communities as a whole.
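To see how easily such an inference could be drawn, consider the hedged sketch below (illustrative data and an illustrative prayer window, not the actual 2013 analysis): with only pseudonymous driver IDs and timestamps, one can flag drivers whose breaks between trips consistently coincide with prayer times.

```python
import pandas as pd

# Hypothetical "anonymised" trip log: driver IDs are pseudonyms, but the
# timestamps alone can still support a sensitive inference.
trips = pd.DataFrame({
    "driver": ["d1", "d1", "d2", "d2"],
    "trip_end": pd.to_datetime([
        "2013-06-01 12:55", "2013-06-01 13:15",
        "2013-06-01 12:10", "2013-06-01 14:00"]),
    "next_trip_start": pd.to_datetime([
        "2013-06-01 13:10", "2013-06-01 13:29",
        "2013-06-01 12:20", "2013-06-01 14:10"]),
})

# Illustrative midday prayer window (real prayer times vary by date/place).
window_start = pd.Timestamp("2013-06-01 13:00")
window_end = pd.Timestamp("2013-06-01 13:30")

# A pause "hits" the window if it overlaps it at all.
trips["pause_in_window"] = (trips["trip_end"] < window_end) & (
    trips["next_trip_start"] > window_start)

# Fraction of each driver's pauses that align with the prayer window.
print(trips.groupby("driver")["pause_in_window"].mean())
# d1: 1.0, d2: 0.0 -- a pattern inviting (mis)classification of d1.
```

No name, licence, or medallion number is needed for this; the pattern itself is the identifier, which is precisely the collective privacy harm the NPD Report gestures at.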
An example of such a community in the report itself is of people suffering from a socially stigmatised disease who happen to reside in a particular locality in a city. It might be in the interest of such a community to keep details about their ailment and residence private, as even anonymised data pointing to their general whereabouts could lead to harassment and the violation of their privacy.
In such cases, harms arise not specifically to an individual, but to a group or community as a whole. Even if data is anonymised (and rendered completely un-identifiable), insights drawn at a group level help decipher patterns and enable profiling at the macro level.
However, the community suffering from the disease might also see some value in sharing limited, anonymised data on themselves with certain third parties; for example, with experts conducting medical research to find a cure to the disease. Such a group may nominate a data trustee–as envisioned by the NPD Report–who facilitates the exchange of non-personal data on their behalf, and takes their privacy interests into account with relevant data processors.
This model of data trusteeship is thus clearly envisioned as a novel intermediary relationship–distinct from traditional notions of a legal trust or trustee for the management of property–between users and data trustees to facilitate the proper exchange of data, and protect users against privacy harms like large-scale profiling and behavioral manipulation.
But, what makes data trusts unique?
Are Data Trusts the New ‘Mutual Funds’?
Currently, data processors process a wide-range of data–both personal and non-personal–about users, without providing them accessible information about how they use or collect it. These users, if they wish to use services offered by data processors, do not have any negotiating powers over the collection or processing of their data. This results in information asymmetries and power imbalances between both parties, without much recourse to users–especially in terms of non-personal data which is not covered by personal data protection laws like the GDPR, or India’s Draft Personal Data Protection Bill.
Data trusts can help solve the challenges arising during everyday data transactions taking place on the Internet. Acting as experts on behalf of users, they may be in a better position to negotiate for privacy-respecting practices as compared to individual users. By standardising data sharing practices like data anonymisation and demanding transparency in data usage, data trusts may also be better placed to protect collective privacy rights as compared to an unstructured community. One of the first recommendations to establish data trusts in the public fora came from the UK Government’s independent report from 2017, ‘Growing the artificial intelligence industry in the UK’, which recommended the establishment of data trusts for increased access to data for AI systems.
Simply put: data trusts might be akin to mutual fund managers, as they facilitate complex investments on behalf of and in the best interests of their individual investors.
The Fault in Our Data Sarkaar
Since data trusts are still untested at a large scale, certain challenges need to be anticipated at the time of their conceptualisation, which the NPD Report does not take into account.
For example, in some cases, the report suggests that the role of the data trustee could be assumed by an arm of the government. The Ministry of Health and Family Welfare, for instance, could act as a trustee for all data on diabetes for Indian citizens.
However, the government acting as a data trustee raises important questions of conflict of interest–after all, government agencies might utilise relevant non-personal data for the profiling of citizens. The NPD Report doesn’t provide solutions for such challenges.
Additionally, the NPD Report doesn’t clarify the ambiguity in the relationship between data trusts and data trustees, adding to the complexity of its recommendations. While the report envisions data trusts as institutional structures purely for the sharing of given data sets, it defines data trustees as agents of ‘predetermined’ communities who are tasked with protecting their data rights.
Broadly, this is just like how commodities (like stocks or gold) are traded over an exchange (such as data trusts) while agents such as stockbrokers (or data trustees) assist investors in making their investments. This is distinct from how Indian law treats traditional conceptions of trusts and trustees, and might require fresh law for its creation.
In terms of the exchange of non-personal data, possibly both these tasks–that is, facilitating data sharing and protecting data rights of communities/groups–can be delegated to just one entity: data trusts. Individuals who do not form part of any ‘predetermined’ community–and thus may not find themselves represented by an appropriate trustee–may also benefit from such hybrid data trusts for the protection of their data rights.
Clearly, multiple cautionary steps need to be in place for data trusts to work, and for the privacy of millions to be protected–steps the Report has yet to fully spell out.
Firstly, there is a need for legal and regulatory mechanisms that will ensure that these trusts genuinely represent the best interests of their members. Without a strong alignment with regulatory policies, data trusts might enable the further exploitation of data, rather than bringing about reforms in data governance. Borrowing from traditional laws on trusts, a genuine representation of interests can be ensured by placing a legal obligation on the trust–in the form of an enforceable trust deed–to owe a fiduciary duty (or duty of care) towards its members.
Secondly, data trusts will require money to operate, and funding models will need to be developed that ensure the independence of trusts while serving their members' best interests. Various models will need to be tested before implementation, including government-funded data trusts and user-subscription based systems.
Thirdly, big questions about the transparency of data trusts remain. As these institutions may be the focal point of data exchange in India, ensuring their independence and accountability will be crucial. Auditing, continuous reviews, and reporting mechanisms will need to be enmeshed in future regulation to ensure the accountability of data trusts.
Privacy Rights Must Be Paramount
As the law tries to keep pace with technology in India, recognising new spheres which require immediate attention, like the challenges of collective privacy, becomes pertinent for policymakers. The NPD Report takes momentous strides in recognising some of these challenges which require swift redressal, but fails to take into consideration emerging scholarship on the autonomy, transparency, and strength of its proposed data trusts.
For example, large data processors will need to be incentivised to engage with data trusts. Smaller businesses may engage with data trusts easily considering the newfound easy access to large amounts of data. But, it might be difficult to incentivise Big Tech companies to engage with such structures, due to their existing stores of wide-scale data on millions of users. This is where the government will need to go back to the drawing board and engage with multiple stakeholders to ensure that innovation goes hand in hand with a privacy respecting data governance framework. Novel solutions like data trusts should be tested with pilot projects, before being baked into formal policy or law.
More than three years after India’s Supreme Court reaffirmed the right to privacy as intrinsic to human existence and a guarantee under the Indian Constitution, government policy continues to treat data–whether personal or non-personal–as a resource to be ‘mined’. In this atmosphere, to meaningfully recognise the right to privacy and self-determination, the government must lay down a data governance framework which seeks to protect the rights of users (or data providers), lays down principles of transparency and accountability, and establishes strong institutions for enforcement of the law.
(This post is in the context of the report released by the Committee of Experts on the Non-Personal Data Governance Framework, as constituted by the Ministry of Electronics and Information Technology. CCG's comments on the report can be accessed here.)
The Personal Data Protection Bill, 2019 (PDP Bill/Bill) was introduced in the Lok Sabha on December 11, 2019, and was immediately referred to a joint committee of Parliament. The joint committee published a press communique on February 4, 2020, inviting comments on the Bill from the public.
The Bill is the successor to the Draft Personal Data Protection Bill, 2018 (Draft Bill 2018), recommended by a government-appointed expert committee chaired by Justice B.N. Srikrishna. In August 2018, shortly after the recommendations and publication of the draft Bill, the Ministry of Electronics and Information Technology (MeitY) invited comments on the Draft Bill 2018 from the public. (Our comments are available here.)
In this post we undertake a preliminary examination of:
The scope and applicability of the PDP Bill
The application of general data protection principles
The rights afforded to data subjects
The exemptions provided to the application of the law
In future posts in the series we will examine the Bill and look at:
The restrictions on cross border transfer of personal data
The structure and functions of the regulatory authority
The enforcement mechanism and the penalties under the PDP Bill
Scope and Applicability
The Bill identifies four different categories of data. These are personal data, sensitive personal data, critical personal data, and non-personal data.
Personal data is defined as “data about or relating to a natural person who is directly or indirectly identifiable, having regard to any characteristic, trait, attribute or any other feature of the identity of such natural person, whether online or offline, or any combination of such features with any other information, and shall include any inference drawn from such data for the purpose of profiling”. (emphasis added)
The addition of inferred data to the definition of personal data is an interesting reflection of the way the conversation around data protection has evolved in the past few months, and requires further analysis.
Sensitive personal data is defined as data that may reveal, be related to or constitute a number of different categories of personal data, including financial data, health data, official identifiers, sex life, sexual orientation, genetic data, transgender status, intersex status, caste or tribe, and religious and political affiliations / beliefs. In addition, under clause 15 of the Bill the Central Government can notify other categories of personal data as sensitive personal data in consultation with the Data Protection Authority and the relevant sectoral regulator.
Similar to the 2018 Bill, the current Bill does not define critical personal data, and clause 33 provides the Central Government the power to notify what is included under critical personal data. However, in its report accompanying the 2018 Bill, the Srikrishna Committee had referred to some examples of critical personal data that relate to critical state interests, like Aadhaar numbers, genetic data, biometric data, health data, etc.
The Bill retains the terminology introduced in the 2018 Draft Bill, referring to data controllers as 'data fiduciaries' and data subjects as 'data principals'. The new terminology was introduced to reflect the fiduciary nature of the relationship between data controllers and subjects. However, whether the use of this specific terminology has any greater impact on the protection and enforcement of the rights of data subjects remains to be seen.
Application of PDP Bill 2019
The Bill is applicable to (i) the processing of any personal data, which has been collected, disclosed, shared or otherwise processed in India; (ii) the processing of personal data by the Indian government, any Indian company, citizen, or person/ body of persons incorporated or created under Indian law; and (iii) the processing of personal data in relation to any individuals in India, by any persons outside of India.
The scope of the 2019 Bill is largely similar in this context to that of the 2018 Draft Bill. However, one key difference is seen in relation to anonymised data. While the 2018 Draft Bill completely exempted anonymised data from its scope, the 2019 Bill does not apply to anonymised data except under clause 91, which gives the government powers to mandate the use and processing of non-personal data or anonymised personal data under policies to promote the digital economy. There are a few concerns that arise in the context of this change in the treatment of anonymised personal data. First, there are concerns about the concept of anonymisation of personal data itself. While the Bill provides that the Data Protection Authority (DPA) will specify appropriate standards of irreversibility for the process of anonymisation, it is not clear that a truly irreversible form of anonymisation is possible at all. In this case, we need more clarity on what safeguards will be applicable for the use of anonymised personal data.
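For a sense of what such 'standards of irreversibility' could involve in practice, one widely used (and admittedly imperfect) heuristic from the privacy literature is k-anonymity: every combination of quasi-identifying attributes must be shared by at least k records. A minimal Python sketch, with hypothetical data and column names, is below.

```python
import pandas as pd

# Hypothetical "anonymised" dataset: names removed, but quasi-identifiers
# (age band, pincode) remain and can still single out individuals.
df = pd.DataFrame({
    "age_band":  ["25-35", "25-35", "25-35", "55-65"],
    "pincode":   ["110001", "110001", "110001", "110099"],
    "diagnosis": ["flu", "flu", "diabetes", "flu"],
})

def satisfies_k_anonymity(data, quasi_ids, k):
    """True if every combination of quasi-identifier values occurs in at
    least k records, i.e. no record is unique on those attributes."""
    group_sizes = data.groupby(quasi_ids).size()
    return bool((group_sizes >= k).all())

# False: the lone 55-65/110099 record is unique, hence re-identifiable.
print(satisfies_k_anonymity(df, ["age_band", "pincode"], k=2))
```

Even a dataset that passes such a check can often be re-identified by linking it with outside datasets, which is why the assumption of reliable irreversibility remains contested.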
The second concern relates to the Bill's focus on the promotion of the digital economy. We have previously discussed some of the concerns regarding a focus on the promotion of the digital economy in a rights-based legislation in our comments to the Draft Bill 2018.
These issues continue to be of concern, and are perhaps heightened with the introduction of a specific provision on the subject in the 2019 Bill (especially without adequate clarity on what services or policy-making efforts in this direction are to be informed by the use of anonymised personal data). Many of these issues are also still under discussion by the committee of experts set up to deliberate on a data governance framework (non-personal data). The mandate of this committee includes the study of various issues relating to non-personal data, and making specific suggestions for the consideration of the central government on the regulation of non-personal data.
The formation of the non-personal data committee was in pursuance of a recommendation by the Justice Srikrishna Committee to frame a legal framework for the protection of community data, where the community is identifiable. The mandate of the expert committee will overlap with the application of clause 91(2) of the Bill.
Data Fiduciaries, Social Media Intermediaries and Consent Managers
As discussed above, the Bill categorises data controllers as data fiduciaries and significant data fiduciaries. Any person that determines the purpose and means of processing of personal data (including the State, companies, juristic entities or individuals) is considered a data fiduciary. Some data fiduciaries may be notified as 'significant data fiduciaries', on the basis of factors such as the volume and sensitivity of personal data processed, the risk of harm, etc. Significant data fiduciaries are held to higher standards of data protection. Under clauses 27-30, significant data fiduciaries are required to carry out data protection impact assessments, maintain accurate records, have their policies and the conduct of their processing of personal data audited, and appoint a data protection officer.
Social Media Intermediaries
The Bill introduces a distinct category of intermediaries called social media intermediaries. Under clause 26(4) a social media intermediary is ‘an intermediary who primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services’. Intermediaries that primarily enable commercial or business-oriented transactions, provide access to the Internet, or provide storage services are not to be considered social media intermediaries.
Social media intermediaries may be notified to be significant data fiduciaries, if they have a minimum number of users, and their actions have or are likely to have a significant impact on electoral democracy, security of the State, public order or the sovereignty and integrity of India.
Under clause 28, social media intermediaries that have been notified as significant data fiduciaries will be required to provide for the voluntary verification of users, to be accompanied by a demonstrable and visible mark of verification.
The Bill also introduces the idea of a ‘consent manager’ i.e. a (third party) data fiduciary which provides for management of consent through an ‘accessible, transparent and interoperable platform’. The Bill does not contain any details on how consent management will be operationalised, and only states that these details will be specified by regulations under the Bill.
Data Protection Principles and Obligations of Data Fiduciaries
Consent and grounds for processing
The Bill recognises consent as well as a number of other grounds for the processing of personal data.
Clause 11 provides that personal data shall only be processed if consent is provided by the data principal at the commencement of processing. This provision, similar to the consent provision in the 2018 Draft Bill, draws from various principles including those under the Indian Contract Act, 1872 to inform the concept of valid consent under the PDP Bill. The clause requires that the consent should be free, informed, specific, clear and capable of being withdrawn.
Moreover, explicit consent is required for the processing of sensitive personal data. The current Bill appears to be silent on issues such as incremental consent, which were highlighted in our comments in the context of the Draft Bill 2018.
The Bill provides for additional grounds for processing of personal data, consisting of very broad (and much criticised) provisions for the State to collect personal data without obtaining consent. In addition, personal data may be processed without consent if required in the context of employment of an individual, as well as a number of other ‘reasonable purposes’. Some of the reasonable purposes, which were listed in the Draft Bill 2018 as well, have also been a cause for concern given that they appear to serve mostly commercial purposes, without regard for the potential impact on the privacy of the data principal.
In a notable change from the Draft Bill 2018, the PDP Bill, appears to be silent on whether these other grounds for processing will be applicable in relation to sensitive personal data (with the exception of processing in the context of employment which is explicitly barred).
The Bill also incorporates a number of traditional data protection principles in the chapter outlining the obligations of data fiduciaries. Personal data can only be processed for a specific, clear and lawful purpose. Processing must be undertaken in a fair and reasonable manner and must ensure the privacy of the data principal – a clear mandatory requirement, as opposed to a ‘duty’ owed by the data fiduciary to the data principal in the Draft Bill 2018 (this change appears to be in line with recommendations made in multiple comments to the Draft Bill 2018 by various academics, including our own).
Purpose and collection limitation principles are mandated, along with a detailed description of the kind of notice to be provided to the data principal, either at the time of collection, or as soon as possible if the data is obtained from a third party. The data fiduciary is also required to ensure that data quality is maintained.
A few changes in the application of data protection principles, as compared to the Draft Bill 2018, can be seen in the data retention and accountability provisions.
On data retention, clause 9 of the Bill provides that personal data shall not be retained beyond the period ‘necessary’ for the purpose of data processing, and must be deleted after such processing, ostensibly a higher standard as compared to ‘reasonably necessary’ in the Draft Bill 2018. Personal data may only be retained for a longer period if explicit consent of the data principal is obtained, or if retention is required to comply with law. In the face of the many difficulties in ensuring meaningful consent in today’s digital world, this may not be a win for the data principal.
Clause 10 on accountability continues to provide that the data fiduciary will be responsible for compliance in relation to any processing undertaken by the data fiduciary or on its behalf. However, the data fiduciary is no longer required to demonstrate such compliance.
Rights of Data Principals
Chapter V of the PDP Bill 2019 outlines the Rights of Data Principals, including the rights to access, confirmation, correction, erasure, data portability and the right to be forgotten.
Right to Access and Confirmation
The PDP Bill 2019 makes some amendments to the right to confirmation and access, included in clause 17 of the Bill. The right has been expanded in scope by the inclusion of sub-clause (3). Clause 17(3) requires data fiduciaries to provide data principals with information about the identities of any other data fiduciaries with whom their personal data has been shared, along with details about the kind of data shared.
This allows the data principal to exert greater control over their personal data and its use. The rights to confirmation and access are important rights that inform and enable a data principal to exercise other rights under the data protection law. As recognized in the Srikrishna Committee Report, these are ‘gateway rights’, which must be given a broad scope.
Right to Erasure
The right to correction (Clause 18) has been expanded to include the right to erasure. This allows data principals to request erasure of personal data which is not necessary for processing. While data fiduciaries may be allowed to refuse correction or erasure, they would be required to produce a justification in writing for doing so, and if there is a continued dispute, indicate alongside the personal data that such data is disputed.
The addition of a right to erasure is an expansion of rights from the 2018 Bill. While the right to be forgotten only restricts or discontinues the disclosure of personal data, the right to erasure goes a step further and empowers the data principal to demand the complete removal of their data from the systems of the data fiduciary.
Many of the concerns expressed in the context of the Draft Bill 2018, in terms of the procedural conditions for the exercise of the rights of data principals, as well as the right to data portability specifically, continue to persist in the PDP Bill 2019.
Exceptions and Exemptions
While the PDP Bill ostensibly enables individuals to exercise their right to privacy against the State and the private sector, it provides numerous exemptions, which raise serious concerns.
The Bill grants broad exceptions to the State. In some cases, it is in the context of specific obligations such as the requirement for individuals’ consent. In other cases, State action is almost entirely exempted from obligations under the law. Some of these exemptions from data protection obligations are available to the private sector as well, on grounds like journalistic purposes, research purposes and in the interests of innovation.
The most concerning of these provisions, are the exemptions granted to intelligence and law enforcement agencies under the Bill. The Draft Bill 2018, also provided exemptions to intelligence and law enforcement agencies, so far as the privacy invasive actions of these agencies were permitted under law, and met procedural standards, as well as legal standards of necessity and proportionality. We have previously discussed some of the concerns with this approach here.
The exemptions provided to these agencies under the PDP Bill, seem to exacerbate these issues.
Under the Bill, the Central Government can exempt an agency of the government from the application of the Act by passing an order, with reasons recorded in writing, if it is of the opinion that the exemption is necessary or expedient in the interest of sovereignty and integrity, security of the State, friendly relations with foreign states, or public order, or for preventing incitement to the commission of any cognizable offence relating to the aforementioned grounds. Not only are the grounds on which government agencies can be exempted worded in an expansive manner, but the procedure for granting these exemptions is also bereft of any safeguards.
The executive functioning in India suffers from problems of opacity and unfettered discretion at times, which requires a robust system of checks and balances to avoid abuse. The Indian Telegraph Act, 1885 (Telegraph Act) and the Information Technology Act, 2000 (IT Act) enable government surveillance of communications made over telephones and the internet. For drawing a comparison here, we primarily refer to the Telegraph Act, as it allows the government to intercept phone calls, by an order in writing, on grounds similar to those mentioned in clause 35 of the Bill. However, the Telegraph Act limits the use of this power to two scenarios – the occurrence of a public emergency or the interest of public safety. The government cannot intercept communications made over telephones in the absence of these two preconditions.

The Supreme Court in People's Union for Civil Liberties v. Union of India (1997) introduced guidelines to check the abuse of surveillance powers under the Telegraph Act, which were later incorporated in Rule 419A of the Indian Telegraph Rules, 1951. A prominent safeguard included in Rule 419A requires that surveillance and monitoring orders be issued only after considering 'other reasonable means' for acquiring the required information. The court had further limited the scope of interpretation of 'public emergency' and 'public safety' to mean "the prevalence of a sudden condition or state of affairs affecting the people at large and calling for immediate action", and "the state or condition of freedom from danger or risk at large" respectively.

In spite of the introduction of these safeguards, the procedure for intercepting telephone communications under the Telegraph Act has been criticised for lack of transparency and improper implementation. For instance, a 2014 report revealed that around 7,500 – 9,000 phone interception orders were issued by the Central Government every month. Given the sheer numbers, applying the procedural safeguards in each case would have been physically impossible. Thus, legislative and judicial oversight becomes a necessity in such cases.
The constitutionality of India's surveillance apparatus, including Section 69 of the IT Act, which allows for surveillance on the broader grounds of necessity and expediency rather than 'public emergency' and 'public safety', has been challenged before the Supreme Court and is currently pending. Clause 35 of the Bill also mentions necessity and expediency as prerequisites for the government to exercise its power to grant exemptions; these terms appear vague and open-ended, as they are not defined.

The test of necessity implies resorting to the least intrusive method of encroachment upon privacy to achieve the legitimate state aim. This test is typically one among several factors applied in deciding whether a particular intrusion on a right is tenable under human rights law. In his concurring opinion in Puttaswamy (I), J. Kaul had included 'necessity' in the proportionality test. (However, this test is not otherwise well developed in Indian jurisprudence.) Expediency, on the other hand, is not a specific legal basis used for determining the validity of an intrusion on human rights. It has also not been referred to in Puttaswamy (I) as a basis for assessing a privacy violation. The use of the term 'expediency' in the Bill is deeply worrying, as it seems to lower the threshold for allowing surveillance – a regressive step in the context of cases like PUCL and Puttaswamy (I).

A valid law along with the principles of proportionality and necessity is essential to put in place an effective system of checks and balances on the powers of the executive to provide exemptions. It seems unlikely that the clause will pass the test of proportionality (sanction of law, legitimate aim, proportionate to the need of interference, and procedural guarantees against abuse) as laid down by the Supreme Court in Puttaswamy (I).
The Srikrishna Committee report had recommended that surveillance should not only be conducted under law (and not executive order), but also be subject to oversight, and transparency requirements. The Committee had argued that the tests of lawfulness, necessity and proportionality provided for under clauses 42 and 43 (of the Draft Bill 2018) were sufficient to meet the standards set out under the Puttaswamy judgment. Since the PDP Bill completely does away with all these safeguards and leaves the decision to executive discretion, the law is unconstitutional. After the Bill was introduced in the Lok Sabha, J. Srikrishna had criticised it for granting expansive exemptions in the absence of judicial oversight. He warned that the consequences could be disastrous from the point of view of safeguarding the right to privacy and could turn the country into an “Orwellian State”. He has also opined on the need for a separate legislation to govern the terms under which the government can resort to surveillance.
Clause 36 of the Bill deals with the exemption of some provisions for certain processing of personal data. It combines four different clauses on exemption which were listed in the Draft Bill 2018 (clauses 43, 44, 46 and 47). These include the processing of personal data in the interests of the prevention, detection, investigation and prosecution of contraventions of law; for the purpose of legal proceedings; for personal or domestic purposes; and for journalistic purposes. The Draft Bill 2018 had detailed provisions requiring a law passed by Parliament or the State Legislature, which is necessary and proportionate, for the processing of personal data in the interests of the prevention, detection, investigation and prosecution of contraventions of law. Clause 36 of the Bill does not enumerate the need for a law to process personal data under these exemptions. We had argued that the exemptions granted by the Draft Bill 2018 (clauses 43, 44, 46 and 47) were wide, vague and in need of clarification, but the exemptions under clause 36 of the Bill are even more ambiguous, as they merely enlist the exemptions without any specificities or procedural safeguards in place.
In the Draft Bill 2018, the Authority could not grant exemptions from the obligations of fair and reasonable processing, security safeguards, and data protection impact assessments for research, archiving or statistical purposes. As per the current Bill, the Authority can provide exemptions from any of the provisions of the Act for research, archiving or statistical purposes.
The last addition to this chapter of exemptions is the creation of a sandbox for encouraging innovation. This newly added clause 40 is aimed at encouraging innovation in artificial intelligence, machine learning or any other emerging technology in the public interest. The details of what the sandbox entails, other than exemption from some of the obligations of Chapter II, might need further clarity. Additionally, to be considered an eligible applicant, a data fiduciary has to necessarily obtain certification of its privacy-by-design policy from the DPA, as mentioned in clause 40(4) read with clause 22.
Though well appreciated for its intent, this provision requires clarification on the grounds of selection and the details of what the sandbox might entail.
At the time of the introduction of the PDP Bill 2019, the Minister for Law and Justice of India, Mr. Ravi Shankar Prasad, suggested that over 2000 inputs were received on the Draft Bill 2018, based on which changes were made in the PDP Bill 2019. However, these comments and inputs have not been published by MeitY; only a handful of comments have been published, by the stakeholders who submitted them.
Huawei finds support from Indian telcos in the 5G rollout; PayPal withdraws from Facebook's Libra cryptocurrency project; Foreign Portfolio Investors move MeitY against the Data Protection Bill; the CJEU rules against Facebook in a case relating to the global takedown of content; and Karnataka joins the list of states considering implementing NRC to remove illegal immigrants – presenting this week's most important developments in law, tech and national security.
[Sep 30] Why the imminent global economic slowdown is a growth opportunity for Indian IT services firms, Tech Circle report.
[Sep 30] Norms tightened for IT items procurement for schools, The Hindu report.
[Oct 1] Govt runs full throttle towards AI, but tech giants want to upskill bureaucrats first, Analytics India Magazine report.
[Oct 3] MeitY launches smart-board for effective monitoring of the key programmes, The Economic Times report.
[Oct 3] “Use human not artificial intelligence…” to keep a tab on illegal constructions: Court to Mumbai civic body, NDTV report.
[Oct 3] India took 3 big productivity leaps: Nilekani, Livemint report.
[Oct 4] MeitY to push for more sops to lure electronic makers, The Economic Times report; Inc42 report.
[Oct 4] Core philosophy of Digital India embedded in Gandhian values: Ravi Shankar Prasad, Financial Express report.
[Oct 4] How can India leverage its data footprint? Experts weigh in at the India Economic Summit, Quartz report.
[Oct 4] Indians think jobs would be easy to find despite automation: WEF, Tech Circle report.
[Oct 4] Telangana govt adopts new framework to use drones for last-mile delivery, The Economic Times report.
[Oct 5] Want to see ‘Assembled in India’ on an iPhone: Ravi Shankar Prasad, The Economic Times report.
[Oct 6] Home market gets attractive for India’s IT giants, The Economic Times report.
[Oct 2] India Govt requests maximum social media content takedowns in the world, Inc42 report; Tech Circle report.
Facebook can be forced to delete defamatory content worldwide, top EU court rules, Politico EU report.
[Oct 4] EU ruling may spell trouble for Facebook in India, The Economic Times report.
[Oct 4] TikTok, TikTok… the clock is ticking on the question whether ByteDance pays its content creators, ET Tech report.
[Oct 6] Why data localization triggers a heated debate, The Economic Times report.
Sensitive Indian govt data must be stored locally, Outlook report.
Data Protection and Privacy
[Sep 30] FPIs move MeitY against data bill, seek exemption, ET Markets report; Financial Express report.
[Oct 1] United States: CCPA exception approved by California legislature, Mondaq.com report.
[Oct 1] Privacy is gone, what we need is regulation, says Infosys’ Kris Gopalakrishnan, News18 report.
Europe’s top court says active consent is needed for tracking cookies, Tech Crunch report.
[Oct 3] Turkey fines Facebook $282,000 over data privacy breach, Deccan Herald report.
Free Speech
Singapore’s ‘fake news’ law to come into force Wednesday, but rights groups worry it could stifle free speech, The Japan Times report.
Minister says Singapore’s fake news law is about ‘enabling’ free speech, CNBC report.
[Oct 3] Hong Kong protests: Authorities to announce face mask ban, BBC News report.
[Oct 3] ECHR: Holocaust denial is not protected free speech, ASIL brief.
[Oct 4] FIR against Mani Ratnam, Adoor and 47 others who wrote to Modi on communal violence, The News Minute report; Times Now report.
[Oct 5] UN asks Malaysia to repeal laws curbing freedom of speech, The New Indian Express report.
[Oct 6] When will our varsities get freedom of expression: PC, Deccan Herald report.
[Oct 6] UK Government to make university students sign contracts limiting speech and behavior, The Times report.
[Oct 7] FIR on Adoor and others condemned, The Telegraph report.
Aadhaar and Digital IDs
[Sep 30] Plea in SC seeking linking of social media accounts with Aadhaar to check fake news, The Economic Times report.
[Oct 1] Why another omnibus national ID card?, The Hindu Business Line report.
[Oct 2] ‘Kenyan court process better than SC’s approach to Aadhaar challenge’: V Anand, who testified against biometric project, LiveLaw report.
[Oct 3] Why Aadhaar is a stumbling block in Modi govt’s flagship maternity scheme, The Print report.
Parliament panel to review Aadhaar authority functioning, data security, NDTV report.
The deadline to link PAN cards with Aadhaar was extended to December 31 this week; the Election Commission ruled that the voting rights of those excluded in the NRC process remain unaffected; the Home Minister proposed a digital census with multipurpose ID cards for 2021; and 27 nations including the US, UK and Canada issued a joint statement urging a rules-based order in cyberspace – presenting this week’s most important developments in law, technology and national security.
Aadhaar and Digital IDs
[Sep 23] Home Minister announces digital census in 2021, proposes multipurpose ID card, Entrackr report; Business Today report.
[Sep 24] NRIs can now apply for Aadhaar on arrival without 182-day wait, The Economic Times report.
[Sep 24] Aadhaar will be linked to driving license to avoid forgery: Ravi Shankar Prasad, The Indian Express report.
[Sep 24] One nation, one card? Amit Shah floats idea of all-in-one ID; here are all the problems with that idea, Medianama report; Money Control report.
[Sep 24] Explained: Is India likely to have a multipurpose national ID card? The Indian Express report.
[Sep 24] UIDAI nod to ‘voluntary’ use of Aadhaar for National Population Register rollout, The Economic Times report.
[Sep 24] Govt must decide on Aadhaar-social media linkage: SC, Deccan Herald report.
[Sep 25] New law needed for Aadhaar-social media linkage: UIDAI, The Economic Times report; Inc42 report.
[Sep 26] NPR process to include passport, voter ID, Aadhaar and other details, Business Standard report.
[Sep 27] Gang involved in making fake Aadhaar cards busted, The Tribune report.
[Sep 27] What will happen if you don’t link your PAN card with Aadhaar by Sep 30, The Quint report.
[Sep 27] Explained: The National Population Register, and the controversy around it, The Indian Express report.
[Sep 27] Aadhaar to weed out bogus social security beneficiaries in Karnataka, Deccan Herald report.
[Sep 29] Bajrang Dal wants Aadhaar mandatory at dandiya to keep ‘non-Hindus’ out, The Hindustan Times report; The Wire report.
[Sep 30] Kerala urges Centre to extend deadline to link ration cards with Aadhaar, The News Minute report.
[Sep 30] PAN-Aadhaar linking deadline extended to December 31, The Economic Times report.
[Sep 25] India’s regulatory approach should focus on the regulation of the ‘core’: IAMAI, Livemint report.
[Sep 27] India may have to offer sops to boost electronic manufacturing, ET Tech report; Inc42 report.
[Sep 27] Digital India, start-ups are priorities for $5 trillion economy: PM Modi, Medianama report.
[Sep 29] Tech giants aim to skill Indian govt officials in AI, cloud, ET CIO report.
[Sep 29] India’s share in IT, R&D biz up in 2 years: report, The Economic Times report.
[Sep 24] Supreme Court to MeitY: What’s the status of intermediary guidelines? Tell us by Oct 15, Medianama report.
[Sep 26] Will not be ‘excessive’ with social media rules, say Govt officials, Inc42 report.
[Sep 26] Government trying to balance privacy and security in draft IT intermediary norms, The Economic Times report.
[Sep 27] Citizens, tech companies served better with some regulation: Facebook India MD Ajit Mohan, ET Tech report; Inc42 report.
[Sep 27] Balance benefits of internet, data security: Google CEO Sundar Pichai, ET Tech report; Business Today report.
[Sep 25] Jadavpur University calls upon ‘stakeholders’ to ensure free speech on campus, The New Indian Express report.
[Sep 28] RSS raises objections to uncensored content of Manoj Bajpayee’s “The Family Man”, The Hindu report; Outlook report.
Privacy and Data Protection
[Sep 23] A landmark decision on Tuesday could radically reshape how Google’s search results work, Business Insider report.
[Sep 23] Google tightens its voice assistant rules amidst privacy backlash, Wired report.
[Sep 24] Dell rolls out new data protection storage appliances and capabilities, ZDNet report.
[Sep 24] ‘Right to be forgotten’ privacy rule is limited by Europe’s top court, The New York Times report; Live Law report.
[Sep 27] Nigeria launches investigation into Truecaller for potential breach of privacy, Medianama report.
[Sep 29] Right to be forgotten will be arduous as India frames data protection law, Business Standard report.
[Sep 30] FPIs move against data bill, seek exemption, ET Telecom report; Entrackr report.
[Sep 26] Reconsider imposition of data localisation: IAMAI report, The Economic Times report.
[Sep 27] Why data is not oil: Here’s how India’s data localisation norms will hurt the economy, Inc42 report.
Digital Payments and Fintech
[Sep 23] RBI rider on credit bureau data access has Fintech in a quandary, ET Tech report.
Cybersecurity experts warned of a new ‘SIM jacking’ threat; the Kerala High Court recognised a right to access the internet as the internet shutdown in Kashmir entered its 50th day; more updates on the linkage of Aadhaar with voter IDs and social media as the Indian Army braces itself to adopt AI – presenting this week’s most important developments in law, tech and national security.
[Sep 16] Here are the amendments the Election Commission wants to the Representation of the People Act for Aadhaar-Voter ID linkage, Medianama report.
[Sep 18] Why Maj. Gen. Vombatkere has challenged Aadhaar Amendment Act in the Supreme Court; On WhatsApp and traceability, Medianama report.
[Sep 19] Drop in Aadhaar enrolments in J&K, The Economic Times report.
[Sep 20] In-principle decision to link Aadhaar with GST registration, The Economic Times report.
[Sep 23] Aadhaar card is now mandatory for nominees of your EPF account, Livemint report.
[Sep 18] Indo-US ICT working group to meet on Sept 30, Oct 1, Medianama report.
[Sep 17] NITI Aayog frames guidelines for automated inspection of vehicles, ET Auto report.
[Sep 17] What TikTok told MEITY about its intermediary status, data collection, and policies for children, Medianama report.
[Sep 18] Soon, lands will have Aadhaar-like unique numbers, The Economic Times report; Business Today report.
[Sep 18] Drones to be used to digitally map India: report, Medianama report.
[Sep 18] PMO panel to release policy to boost handset manufacturing in India: report, Medianama report.
[Sep 19] Karnataka to set up exclusive body to boost innovation, The Hindu report.
[Sep 20] ‘Right To Access Internet Is Part Of Right To Privacy And Right To Education’: Kerala HC, Live Law report; Hindu report; NDTV report.
Data Protection and Privacy
[Sep 15] Privacy debate between govt, Facebook continues; no winner yet, Money Control report.
[Sep 16] Singapore, Philippines sign MoU on personal data protection, The Manila Times report.
[Sep 16] Industry wants careful drafting of regulations on non-personal data, The Economic Times report.
[Sep 16] Here are the top three reasons why data protection is required in every business, Firstpost report.
[Sep 20] Sensitive, super-sensitive data must be stored locally in India: RS Prasad, Business Standard report.
[Sep 20] Yet another data leak in Indian government database exposes multiple citizen IDs, Inc42 report.
[Sep 22] Infosys co-founder Kris Gopalakrishnan to lead panel on protection of non-personal data, Financial Express report.
[Sep 16] Odisha government makes e-marketplace mandatory for procurements, The New Indian Express report.
This week, Delhi International Airport deployed facial recognition on a ‘trial basis’ for 3 months, landline communications were restored in Kashmir, and the Government mulled certification for online video streaming platforms like Netflix and Prime Video – presenting this week’s most important developments in law, tech and national security.
[Sep 3] PAN will be issued automatically using Aadhaar for filing returns: CBDT, DD News report.
[Sep 3] BJD set to collect Aadhaar numbers of its members in Odisha, Opposition parties slam move, News 18 report; The New Indian Express report; Financial Express report.
[Sep 5] Aadhaar is secure, says ex-UIDAI chief, Times of India report.
[Sep 5] Passport-like Aadhaar centre opened in Chennai: Online appointment booking starts, Livemint report.
[Sep 8] Plans to link Janani Suraksha and Matra Vandan schemes with Aadhaar: CM Yogi Adityanath, Times of India report.
[Sep 5] Digital media bodies welcome 26% FDI cap, Times of India report.
[Sep 6] Automation ‘not threat’ to India’s IT industry, ET Tech report.
[Sep 6] Tech Mahindra to modernise AT&T network systems, Tech Circle report.
Data Protection and Governance
[Sep 2] Health data comes under the purview of Data Protection Bill: IAMAI, Inc42 report.
[Sep 2] Credit history should not be viewed as sensitive data, say online lenders, Livemint report.
[Sep 3] MeitY may come up with policy on regulation of non-personal data, Medianama report.
[Sep 3] MeitY to work on a white paper to gain clarity on public data regulations, Inc42 report.
[Sep 6] Treating data as commons is more beneficial, says UN report, Medianama report.
[Sep 9] Indian Government may allow companies to sell non-personal data of its users, Inc42 report; The Economic Times report.
[Sep 9] Tech firms may be compelled to share public data of its users, ET Tech report.
Data Privacy and Breaches
[Sep 2] Chinese face-swap app Zao faces backlash over user data protection, KrAsia report; Medianama report.
[Sep 2] Study finds Big Data eliminates confidentiality in court judgments, Swiss Info report.
[Sep 4] YouTube will pay $170 million to settle claims it violated child privacy laws, CNBC report; FTC Press Release.
[Sep 4] Facebook will now let people opt out of its face recognition feature, Medianama report.
[Sep 4] Mental health websites in Europe found sharing user data for ads, Tech Crunch report.
[Sep 5] A huge database of Facebook users’ phone numbers found online, Tech Crunch report.
[Sep 5] Twitter has temporarily disabled tweet to SMS feature, Medianama report.
[Sep 6] Fake apps a trap to track your device and crucial data, ET Tech report.
[Sep 6] 419 million Facebook users’ phone numbers leaked online, ET Tech report; Medianama report.
[Sep 9] Community social media platform, LocalCircles, highlights data misuse worries, The Economic Times report.
Free Speech
[Sep 7] Freedom of expression is not absolute: PCI Chairman, The Hindu report.
[Sep 7] Chennai: Another IAS officer resigns over ‘freedom of expression’, Deccan Chronicle report.
[Sep 8] Justice Deepak Gupta: Law on sedition needs to be toned down if not abolished, The Wire report.
Online Content Regulation
[Sep 3] Government plans certification for Netflix, Amazon Prime, other OTT platforms, Inc42 report.
[Sep 4] Why Justice for Rights went to court, asking for online content to be regulated, Medianama report.
[Sep 4] YouTube claims new hate speech policy working, removals up 5x, Medianama report.
[Sep 6] MeitY may relax norms on content monitoring for social media firms, ET Tech report; Inc42 report; Entrackr report.
[Sep 4] Offline retailers accuse Amazon and Flipkart of deep discounting, predatory pricing and undercutting, Medianama report; Entrackr report.
[Sep 6] Companies rely on digital certification startups to foolproof customer identity, ET Tech report.
Digital Payments and FinTech
[Sep 3] A sweeping reset is in the works to bring India in line with fintech’s rise, The Economic Times report.
[Sep 3] Insurance and lending companies in agro sector should use drones to reduce credit and insurance risks: DEA’s report on fintech, Medianama report.
MeitY seeks views on ‘non-personal data’; India and France announce a joint research consortium on AI and a digital partnership after NSA-level talks; Section 144 CrPC is imposed in areas of Assam anticipating unrest after the publication of the NRC list as the MHA holds a high-level security meet on Kashmir; and the tussle between MeitY and the Niti Aayog for control over the Rs. 7000 cr AI project continues – presenting this week’s most important developments at the intersection of law and tech.
[Aug 27] Aadhaar integration can weed out fake voters: UIDAI’s Ajay Bhushan Pandey, Business Standard report.
[Aug 27] Government to intensify Aadhaar enrolment in J&K after Oct 31: Report, Medianama report; Times Now report; The Quint report.
[Aug 27] Interview: Why I filed a case to link Aadhaar and Social Media Accounts, The Quint report.
[Aug 27] Aadhaar database cannot be hacked even after a billion attempts: Ravi Shankar Prasad, Money Control report.
[Aug 27] Most dangerous situation: Justice Srikrishna on EC-Aadhaar linking, The Quint report.
[Aug 28] Aadhaar adds to women’s problems in India. Here’s why, The Wire report.
[Aug 28] What Centre will tell Supreme Court on Aadhaar and social media account linkage, The Hindustan Times report.
[Aug 28] All residents of an MP village have the same date of birth on their Aadhaar, Business Standard report.
[Aug 29] Blood banks advised to ask for donors’ Aadhaar cards, Times of India report.
[Aug 29] Aadhaar continues to evolve and grow as India issues biometric seafarers’ ID, Biometric Update report.
[Aug 31] Aadhaar mandatory for farmers to avail crop loan in Odisha, Odisha Sun Times report.
[Sep 1] NRIs to get Aadhaar sans 180-day wait in 3 months, The Hindu report.
[Sep 1] Aadhaar-liquor link to check bottle littering? Deccan Herald report.
[Sep 1] Linking Aadhaar with social media can lead to insidious profiling of people, says Apar Gupta, Times of India report.
[Aug 27] NASSCOM-DSCI on National Health Stack: separate regulatory body for health, siloed registries, usage of single ID, Medianama report.
[Aug 27] Govt looks to develop electronics component manufacturing base in India: MeitY Secretary, YourStory report; Money Control report.
[Aug 30] India is encouraging foreign firms to shift biz from China: report, Medianama report; Reuters report.
[Aug 30] Wipro, Google to speed up digital shift of enterprises, ET Telecom report.
[Aug 30] Government committed to reach public via technology, Times of India report.
The Bill envisages the setting up of a DNA profiling board which shall function as the regulatory authority and lay down guidelines, standards and procedures for the functioning of DNA laboratories and grant them accreditation. The board will also assist the government in setting up new data banks and advise the government on “all issues relating to DNA laboratories”. In addition, it will make recommendations on legislation and practices relating to privacy issues around storage and access to DNA samples.
DNA data banks will also be established, consisting of a national data bank as well as the required number of regional data banks. Regional data banks must mandatorily share all their information with the national data bank. Every data bank shall maintain databases of five categories of data – crime scenes, suspects or undertrials, offenders, missing persons, and unknown deceased persons.
The 2017 draft has made significant changes to address concerns raised about the previous 2015 draft. These include removing the index of voluntarily submitted DNA profiles, deleting the provision allowing the DNA profiling board to create any other index as necessary, detailing serious offences for DNA collection, divesting the database manager of discretionary powers, and introducing redressal mechanisms by allowing any aggrieved person to approach the courts. Additionally, it has added legislative provisions authorising licensed laboratories, police stations and courts to collect and analyse DNA from certain categories of people, store it in data banks and use it to identify missing/unidentified persons and as evidence during trial.
The new Bill has attempted to address previous concerns by limiting the purpose of DNA profiling, stating that it shall be undertaken exclusively for identification of a person and not to extract any other information. Safeguards have been put in place against misuse in the form of punishments for disclosure to unauthorised persons.
The Bill mandates the consent of an accused before the collection of bodily substances for offences other than those specified. However, a refusal, if considered to be without good cause, can be disregarded by a Magistrate who has reasonable cause to believe that such substances can prove or disprove guilt. Any person present during the commission of a crime, questioned regarding a crime, or seeking a missing family member, may volunteer in writing to provide bodily substances. Collection from minors and disabled persons requires the written consent of their parents or guardians, and collection from victims or relatives of missing persons requires the written consent of the victim or relative. Details of persons who are not offenders or suspects in a crime cannot be compared against the offenders’ or suspects’ indices, and their details can be communicated only to authorised persons.
Areas of Concern
Although the Bill claims that DNA testing is 99.9% foolproof, doubts have recently been raised about the possibility of a higher error rate than previously claimed. This highlights the need for the proposed legislation to provide safeguards in the event of error or abuse.
The issue of security of all the data concentrated in data banks is of paramount importance in light of its value to both government and private entities. The Bill fails to clearly spell out restrictions or to specify who has access to these data banks.
Previous iterations of the Bill have prompted civil society to express their reservations about the circumstances under which DNA can be collected, issues of consent to collection, access to and retention of data, and whether such information can be exploited for purposes beyond those envisaged in the legislation. As in the case of Aadhaar, important questions arise regarding how such valuable genetic information will be safeguarded against theft or contamination, and to what extent this information can be accessed by different agencies. The present Bill has reduced the number of CODIS loci that can be processed from 17 to 13, thus restricting identification only to the necessary extent. However, this provision has not been explicitly stated in the provisions of the legislation itself, casting doubt over the manner in which it will be implemented.
Written consent is mandatory before obtaining a DNA sample; however, the withholding of consent can be overruled by a Magistrate if deemed necessary. An individual’s DNA profile can only be compared against the crime scene, missing persons or unknown deceased persons indices. A court order is required to expunge the profile of an undertrial or a suspect, whose profile can also be removed after the filing of a police report. Any person who is neither a suspect nor a convicted offender can have their profile removed only on a written petition to the director of the data bank. The consent clause is also waived if a person has been accused of a crime punishable either by death or by more than seven years in prison. However, the Bill is silent on how such a person’s profile is to be removed on acquittal.
Moreover, the Bill states that “the information contained in the crime scene index shall be retained”. The crime scene index captures a much wider data set as compared to the offenders’ index, since it includes all DNA evidence found around the crime scene, on the victim, or on any person who may be associated with the crime. The indefinite retention of most of these categories of data is unnecessary, as well as contrary to earlier provisions that provide for such data to be expunged. However, the government has claimed that such information will be removed “subject to judicial orders”. Importantly, the Bill does not contain a sunset provision that would ensure that records are automatically expunged after a prescribed period.
While the Bill provides strict penalties for the deliberate tampering or contamination of biological evidence, the actual mechanisms for quality control and analysis have been omitted from the parent legislation and left to the purview of the rules.
Crucially, the Bill does not explicitly define privacy and security protections (the implementation of safeguards, the use and dissemination of genetic information, and security and confidentiality) within the legislation itself, leaving such considerations to the purview of regulation and outside parliamentary oversight. The recently released Personal Data Protection Bill, 2018 does little to allay these concerns. Under that Bill, DNA banks would be classified as significant data fiduciaries, and thus subject to audits, data protection impact assessments, and the appointment of a dedicated data protection officer. However, although genetic information is classified as sensitive personal data, the Data Protection Bill does not provide sufficient safeguards against the processing of such data by the State. In light of the proposed data protection framework, and the Supreme Court’s confirmation that the right to privacy (including the right to bodily integrity) is a fundamental right, the DNA Profiling Bill in its present form cannot be implemented without violating the fundamental right to privacy.