Digitisation of Health / Medical Records: Is the law keeping up?

By Smitha Krishna Prasad

Medical and health records are increasingly digitised, and ease of access is considered one of the key benefits of this trend. However, patient privacy and security of such records are important concerns that need to be addressed both under the existing legal framework, and in terms of development of new laws.

Earlier this month, news reports suggested that private medical records of over 35,000 patients had been made publicly available through the website of a diagnostic laboratory based in Mumbai. Reports indicate that the lab's website was hacked. However, other reports specify that the lab has disclaimed liability, stating that any requirement of confidentiality applies only to doctors. Further, the lab suggested that since it would shortly be moving to a different system, there was no urgency in remedying the security flaws.

While the above seems to be an internal security issue on the part of the lab, health records have proved to be a favourite target for hackers across the world. These records are then either held for ransom or sold by such hackers.

The healthcare industry as a whole is seen as one of the least secure industries globally. At the same time, medical and health records of individuals are increasingly being digitised. Individuals and institutions in the healthcare industry are digitising records within their organisations to improve ease of access. The Ministry of Health and Family Welfare, Government of India, is in the process of setting up an Integrated Health Information Platform, and has issued Electronic Health Record Standards (EHR Standards). The EHR Standards are meant to provide for creation and maintenance of health records in a standardised manner that would allow for interoperability across platforms and institutions across the country. There are many pros and cons to undertaking such a digitisation effort – however, this post is limited to examining the legal framework surrounding such digitisation and the protection of privacy of patients.

Current Legal Framework in India

Today, India does not have a comprehensive privacy law, or an industry specific privacy regulation that focuses on the healthcare / medical industry. We do have the Information Technology Act, 2000 (“IT Act”), and the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (“IT Rules”), as well as the Indian Medical Council (Professional conduct, Etiquette and Ethics) Regulations, 2002 (“MCI Code of Ethics”).

The MCI Code of Ethics provides that physicians must maintain medical records pertaining to patients for a period of three years from the commencement of treatment. Further, physicians must make such records available to patients, authorised attendants and legal authorities upon request. Physicians are also required to make efforts to computerise such records. While there is no specific provision on maintaining the privacy and security of these medical records, the MCI Code of Ethics does provide that confidences entrusted by patients to physicians must not be revealed, unless required by law or in the public interest. However, the MCI Code of Ethics applies only to physicians, i.e., doctors with MBBS or equivalent qualifications.

On the other hand, the IT Act and the IT Rules are wider in application. They deal specifically with electronic records and require any person dealing with certain defined types of sensitive information, including medical records, to undertake data protection and security measures.

Any violation of the MCI Code of Ethics calls for disciplinary action against the concerned physician, which could include removal of the physician's name from the register of qualified physicians. The IT Act, however, does not provide for any direct action or penalty for non-compliance with the IT Rules, and relies on the person affected by the non-compliance to take action.

In addition to the MCI Code of Ethics and the IT Act, there are a few other laws such as the Medical Termination of Pregnancy Act, 1971 which provide for maintenance of confidentiality of patient information. However, these are largely specific to certain circumstances and are not comprehensive.

Potential Developments

In the absence of a comprehensive privacy and data protection law in India, some regulators have taken to establishing basic rules to protect consumers and individuals in their respective industries. For instance, the RBI places certain restrictions on the circumstances in which customer information can be shared by banks. Insurance and telecom companies are restricted from transferring certain customer information outside India.

Given the highly sensitive nature of medical / health related information, and recent trends of commoditisation of such information in the black market, such laws are much needed in the healthcare industry.

The EHR Standards do deal with certain aspects of privacy of patients and security of healthcare records. They prescribe several international standards to be adhered to by members of the healthcare industry while dealing with electronic health records. However, they appear to default back to the IT Act as the legislation that would govern the implementation of any data protection measures in relation to such records.

The Human Immunodeficiency Virus and Acquired Immune Deficiency Syndrome (Prevention and Control) Bill, 2014 also provides certain safeguards to ensure the privacy of patients, specifically in relation to their HIV status. Some concerns regarding the provisions of this bill have previously been discussed here. However, this proposed bill is again limited in scope, and does not apply across the medical industry.

Reports suggest that recognising the need for a more comprehensive law, the Central Government has taken up the initiative of drafting a healthcare industry specific privacy and data protection law.

Given that this law would be drafted from scratch, we suggest that it should be (a) holistic i.e. be applicable across the entire healthcare / medical industry, and not specifically to doctors / hospitals, and (b) technology agnostic, addressing medical / health information in any format, digitised or not.

The law should also take into account the internationally recognised privacy / fair information principles. These principles provide, among other things, for (a) collection of data by lawful means, and only when required, (b) use of data only for the purpose for which it was collected, (c) adequate security measures to protect data, and (d) accountability and openness about the policies in place for the use and protection of data.

Further, to the extent that it provides for the digitisation of records and the implementation of the EHR Standards, the law should ensure that the principles of 'privacy by design' are followed. The concept of privacy by design stipulates that privacy and data protection measures must be built into any system by default, taking a preventative approach to data protection rather than a remedial one.

Another important concern is enforcement – our current laws such as the IT Act, do not provide for proactive enforcement in case of failure to protect privacy / data of individuals, and leave it up to the affected individuals to act. Ideally, a dedicated regulator with the ability to investigate and direct action against defaulters is required. Perhaps the role of the National e-Health Authority proposed by the Government could be expanded to deal with privacy and security of all health records and information.

While the idea of implementing a health privacy and data protection law is a welcome move, it remains to be seen how far this proposed legislation will go towards fully protecting patients’ rights.

Google Faces Legal Hurdles Under Brazilian Internet Law

By Raissa Campagnaro[1]

The Brazilian Federal Prosecution Ministry has brought civil proceedings against Google for flouting its data protection law. The suit challenges Google’s access to the content of emails exchanged by Gmail users on multiple grounds, including Google’s failure to obtain express consent.

In October, 2016, Brazil’s Federal Prosecutor filed a public civil suit against Google, claiming that the search engine had failed to comply with the country’s internet law, the Internet Bill of Rights. The suit argues that during a previous prosecution investigation, through a civil inquiry, Google had made it public that it scans the content of emails exchanged by Gmail users. According to the Federal Prosecutor, this violates Brazilian data protection standards.

The Internet Bill of Rights establishes data protection principles similar to those set up under the EU Data Protection Directive 95/46/EC. Under this law, any processing of data must be pursuant to express consent. The law specifically requires that the clause seeking consent be prominently displayed and easy to identify amongst other terms of the contract. The law also recognises a right to not have one’s data transferred to third parties without consent and a right to be informed about the specific purposes of the personal data collection, usage, storage, treatment and protection.

When asked about its compliance with the legislation, Google submitted that it analyses email messages to improve the user experience by filtering messages for unwanted content, spam or other kinds of malware. It also submitted that the scanning of messages is used to offer products and advertisements to the user and to classify emails into categories such as 'social' and 'promotions'. Finally, Google contended that users consent to the scanning of emails at the time of signing up, by agreeing to the privacy policy within Gmail's terms of service.

However, the Federal Prosecution Ministry considers these practices to be ‘profiling’ – a consequence of personal data aggregation that allows the creation of users’ profiles based on their behaviour, online habits and preferences. These can be used to predict their future actions and decisions. Profiling is frequently used for behavioural advertisements in which aggregated personal data is transferred to other ISPs, who use it to direct ads, products and services determined by the person’s past online activity. According to the Federal Prosecutor, this not only violates people’s right to privacy, especially their informational self-determination right, but also interferes with a consumer’s freedom of choice.

Several scholars and researchers have also opposed profiling and behavioural advertising, arguing that it has severe negative consequences. These include (i) denial of credit or loan concessions; (ii) offering different health insurance deals based on a person’s medical history or the nature of activities they engage in; and (iii) offers with adaptive pricing, based on a variety of criteria that involve some level of discrimination. This is problematic because online profiles are limited. A person’s life is based on several aspects apart from the online information which is collected and aggregated. As a result, personal data aggregation, processing and analysis can lead to an incomplete or incorrect picture of an individual, leading to wrongful interventions in their life. Even if the profile is a complete reflection of a person’s life, the choice to have one’s data collected and used for determined purposes must always be the users’.

The suit alleges that Google’s practices are not in consonance with the legal requirement of seeking express consent, including through prominent display within a policy. It suggests that Google be required to take specific consent in order to access the content of emails.

The case also challenges the fact that Google's privacy policy does not allow consumers to withdraw consent, which undermines consumers' control over their data. Further, it is argued that consent should be sought afresh every time Google changes its privacy policy. The lack of clear and precise information about how data is processed is another issue raised in the case, as it violates the right of Gmail users to information regarding the use of their data.

To substantiate its case, the Federal Prosecutor is relying on an Italian case in which Google’s data processing activities had been challenged. The ruling was based on Italy’s Data Privacy Code, which establishes data protection guarantees such as i) fair and lawful processing of data; ii) specific, explicit and legitimate purposes and use of data; iii) processing to not be excessive in relation to the purposes for which it is collected or subsequently processed; and iv) that the data must only be kept for the amount of time truly necessary. In addition, the law stipulates that a data subject must receive notice about how their data will be processed, allowing them to make an informed decision. Furthermore, the Italian code also requires consent to be express and documented in writing.

In 2014, the Garante (the Italian Data Privacy Authority, hereinafter "the Authority") held that Google had failed to comply with certain requirements under the Italian legislation. First, the information Google gave about how data processing was carried out was considered insufficient, as it was too general. Second, the consent obtained through the privacy policy agreement was held to be too broad. The Authority held that consent should be prior and specific to the data processing in question. Although the decision condemned the company's practices, it did not establish any guidelines for Google to adopt in this regard.

Through the present suit, the Brazilian Federal Prosecutor seeks (i) suspension of Google's email content analysis, that is, the scanning of emails of Gmail users where express consent has not been received; (ii) an obligation to obtain express consent from users before scanning or analysing the content of emails; and (iii) assurance of the possibility of consent withdrawal. The suit seeks an order directing Google to change its privacy policy to ensure consent is informed and specific to content analysis.

This case demonstrates a new aspect of data protection concern. Unlike the more common data breach cases, where the damage is usually discovered too late or is too massive to repair, the Brazilian and Italian cases are good examples of proactive measures taken to minimise future risks. Further, the importance of a legal framework that uses data protection principles to guarantee consumers' right to privacy is well recognised. It now appears that these rules are starting to be enforced more effectively and, in consequence, that the right to privacy can be observed in practice.

[1] Raissa is a law student from Brazil with an interest in internet law and policy. Raissa has been interning with the civil liberties team at CCG for the past month.

“The Right to be Forgotten”: Balancing Personal Privacy with the Public’s right to access Information

Evolution of the right and Global framework

In the Internet age, when access to information is quick and easy, procuring personal information or past records about an individual is no longer a herculean task. The relevance of such information, or the duration for which such data should remain available for public access, had until recently rarely been debated.

There is growing global debate on a new right called "the right to be forgotten" or "the right of erasure". This right allows people to request the removal of their personal information/data online after a period of time, or where such information/data is no longer relevant. The origin of this right can be traced back to the French jurisprudence on the 'right to oblivion' or droit à l'oubli. The rationale behind this right was to allow criminal offenders who had already served their sentence to object to the publication of information regarding their crime and conviction. This was done to ease their process of social integration.

It was along these lines that the 1995 EU Data Protection Directive acknowledged the right to be forgotten. Under the Directive, it was stipulated that the member states should give people the guaranteed right to obtain from the ‘controller’ the rectification, erasure or blocking of data relating to them, the processing of which does not comply with the provisions of the Directive. The term ‘controller’ here refers to a natural or legal person, public authority, agency or any other body which alone or jointly determines the purposes and means of processing personal data.

In May 2014, the Court of Justice of the European Union ('Court') recognized the right to be forgotten as a part of the fundamental right to privacy in the Google case. The plaintiff, in this case, had requested the delinking of search results appearing on Google and the deletion of newspaper articles appearing online with respect to bankruptcy proceedings against him. The Court held that individuals have a right to request search engines to delink information which causes prejudice to them. However, the Court was careful to state that this right is not absolute and can be applied only when the data becomes 'inadequate, irrelevant, excessive, not kept up to date, or kept for longer than necessary' with respect to the purpose for which it was collected or processed. Accordingly, the Court directed Google to delink the search results in the instant case. It was further held that the publication of accurate data may be lawful at a given point in time, but in due course, it might become inconsistent with the law.

While the judgment in the Google case is a step in the right direction, it leaves much to be desired. The Court did not set out any guidelines or parameters to filter out information as ‘inadequate’ or ‘irrelevant’ or ‘excessive’. It has thrust the onerous task of balancing the right to privacy of an individual and the public’s right to access information on private search engines like Google. This raises critical questions regarding the suitability of private entities taking decisions which are of constitutional import. Pursuant to this judgment, the EU adopted the Data Protection Reforms which includes the right to be forgotten as an essential right under Article 17 of the Data Protection Regulations. This lays down the conditions for application of the right to be forgotten, and requires entities processing personal data to inform third parties regarding requests for erasure of links to any personal data. A detailed discussion of these regulations and their impact on India can be found here.

Challenges in enforcement

There are many legal and technical challenges in the enforcement of the right to be forgotten. The success rate of governments across the world in banning or removing pornographic websites or torrent sites from the Internet has not been great, since there are various ways of circumventing such bans. Further, the blocking or delinking of URLs by search engines does not guarantee that such information has been blocked or deleted from the Internet. There is also no way to ensure that such information is not uploaded again.

To enforce the ruling of the case discussed above, Google has created a mechanism through which an individual can request the taking down or delinking of a specific search result bearing the individual's name. Google evaluates such requests on various parameters, such as whether the results infringe the individual's right to privacy or whether the information is of public interest. In the former case, the individual's right to be forgotten trumps the public's right to access information. However, if the information is of public interest, the public's right to information prevails over privacy rights. This squarely makes Google the decision maker on whether data is relevant, adequate, and needed online for public access.

With the growing recognition of the right to be forgotten, the number of requests that search engines receive for taking down or delinking content is only likely to increase, making it extremely difficult and cumbersome to scrutinize such requests manually. According to Google's Transparency Report, as of 9th October 2016, Google had received 565,412 requests for the removal of URLs. The Report further states that it had already evaluated 1,717,714 URLs since May 2014, and shows that Google removed 43.2% of the URLs from the requests received. With a substantial increase in the number of requests, search engines may even consider using algorithms to deal with such requests instead of manually weighing privacy rights vis-à-vis public interest.
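As a rough sanity check on the scale these figures imply, the arithmetic can be worked through directly. Note one assumption made here for illustration: that the 43.2% removal rate applies to the URLs evaluated, rather than to the requests themselves, which the Report's phrasing leaves ambiguous.

```python
# Figures quoted from Google's Transparency Report (as of 9 October 2016).
requests = 565_412          # delisting requests received since May 2014
urls_evaluated = 1_717_714  # individual URLs evaluated
removal_rate = 0.432        # share of URLs removed (43.2%)

# Each request names roughly three URLs on average.
urls_per_request = urls_evaluated / requests

# Assumed interpretation: removal rate applied to evaluated URLs.
urls_removed = urls_evaluated * removal_rate

print(f"URLs per request: {urls_per_request:.1f}")
print(f"Approx. URLs delisted: {urls_removed:,.0f}")
```

On these assumptions, each request covers about three URLs, and over 700,000 URLs would already have been delisted — a volume that lends weight to the concern about manual review becoming unworkable.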

Further, search engines are also likely to err on the side of caution and accept such requests rather than face expensive legal challenges across jurisdictions for non-compliance. Individuals may misuse this right, leading to artificial alteration of the content available online and the delinking of pertinent information.

Recent developments in India

The data protection regime and data privacy laws of India are not comprehensive and dynamic enough to respond to technological advances in the modes of collection, transfer and use of personal information. The Information Technology Act, 2000 and the rules framed under the Act make up the primary legal framework that governs this subject. The Delhi High Court is currently hearing a matter (Laksh Vir Singh Yadav vs. Union of India, WP(C) 1021/2016) where the petitioner has requested for the removal of a judgment involving his mother and wife from an online case database. The petitioner claims that the appearance of his name in the judgment is causing prejudice to him and affecting his employment opportunities. It will be interesting to see the outcome of this case and how the larger debate of the right to privacy of an individual versus the right of public to access information unfolds in this case.

It is pertinent to note that the Delhi High Court is dealing with the request for removal of a court order which is a public document. This request is unusual and distinct from a request for delinking of search results appearing in search engines like Google since such delinking does not result in the removal of the information itself. Allowing the removal of such judgments from online case databases could result in the expunging of public records. Furthermore, the removal of judgments from online public databases will obstruct public access to case materials shedding light on critical questions of law.

While implementing the right to be forgotten, a very fine balance has to be struck between the right to freedom of speech and expression, public interest and personal privacy. To balance these conflicting rights, the judiciary may consider implementing a system where personal information such as the names and addresses of the litigants is redacted from reportable judgments/orders, especially in personal disputes. The courts have, in the past, refrained from divulging the identities of parties in order to respect their privacy in many rape or medico-legal cases.

With many unanswered questions surrounding this right, India requires a comprehensive data protection regime to regulate the entities collecting and processing personal data and to define the terms of use, storage and deletion of such personal data. This will ensure that such entities are obliged to take due care of the personal data in their possession and will also provide a framework for dealing with requests for removal or erasure of such personal data.

Aadhaar – Identity without Consent, Control or Security

The Central Government notified certain sections of the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016 (‘the Act’) last month. As of now, only the sections setting out the establishment, powers and functions of the Unique Identification Authority of India (UIDAI) have been brought into force.

Passed earlier this year, the Act is aimed at more efficient delivery of government subsidies and services by eliminating ghost identities and reducing corruption. It does so by obtaining and storing, in a centralized database, biometric and demographic information of all Indians who have been residing in India for more than one hundred and eighty-two days. This database, called the Central Identities Data Repository (CIDR), not only stores information parted with at the time of enrolment, but also keeps a record of every identification request sent to it. Thus, every time a resident is required to authenticate her identity with any service provider, the CIDR maintains a record of it. Significantly (and contrary to three previous Supreme Court orders), there is little room for doubt that the scheme has been envisaged as mandatory for availing the benefits attached to it.

If the lack of an overarching privacy law wasn’t enough reason to worry, the government’s submission before the Supreme Court that there is no fundamental right to privacy has raised legitimate concerns about the project and its implementation. A lot has been written about the problems with the Act and the larger scheme itself. But two aspects of the privacy debate under Aadhaar deserve urgent attention. First, as a mandatory scheme with no option to opt-out at a later stage, Aadhaar raises important issues of consent and one’s right to control the use of their personal information. This has famously been articulated as ‘informational self-determination’ in several European jurisdictions. The second concern is procedural and pertains to the method of collection and storage of sensitive personal information.

No Power to Consent or Opt-out

Biometric information such as fingerprints and iris scans form a core part of one’s bodily integrity. A requirement to part with such information as a condition precedent to availing essential services undermines basic constitutional values. While the enrolment form has a checkbox to verify consent, this is merely illusory, as failure to consent would amount to automatic exclusion from a host of benefits and services. Despite the fact that the Act mandates ‘enrolling agencies’ (discussed later) to explain the purpose of collecting demographic and biometric information at the time of enrolment, there is no legal obligation to inform residents of the extent of information being held about them. Aggregation of information within the CIDR as a result of a series of authentication requests over a long duration of time comes perilously close to creating a complete personality profile of every resident. This makes the state privy to a wide range of activities from buying an LPG cylinder to enrolling in a school, thereby drastically altering the individual-state power dynamic.

The Act further dilutes individual agency by creating statutory exceptions to how personal information can be used. Section thirty-three of the Act allows disclosure of personal information if a court (a District Judge or above) deems fit or if it is in the interest of "national security". Both these caveats are problematic. To illustrate the first, in 2014, the CBI approached the Bombay High Court asking that the UIDAI be directed to run a fingerprint match on its database in order to identify culprits in a criminal investigation. Before the Court, the UIDAI had argued against sharing its data owing to privacy concerns. However, the Court felt differently. The Bombay High Court directed the Central Forensic and Scientific Laboratory to appoint an expert to determine if the Aadhaar database was technologically capable of matching fingerprints. This order has been stayed by the Supreme Court but the case is yet to be disposed of. The information shared with the UIDAI was never envisaged to be used in criminal investigations. However, the Act explicitly allows information to be shared if a court allows it. As per the UIDAI's own statement, the system has a False Positive Identification Rate of 0.057 per cent. When applied to all residents within the country, a fingerprint search would have the effect of putting lakhs of residents under scrutiny.
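The scale implied by that error rate is easy to verify with back-of-the-envelope arithmetic. The resident population figure of roughly 1.2 billion used below is an assumption for illustration, not a figure from the Act or the UIDAI.

```python
# Back-of-the-envelope check: how many false matches would a single
# nationwide fingerprint search produce at the stated error rate?
fpir = 0.057 / 100          # False Positive Identification Rate (UIDAI's stated figure)
population = 1_200_000_000  # assumed number of enrolled residents (illustrative)

false_positives = fpir * population
print(f"~{false_positives:,.0f} residents ({false_positives / 100_000:.1f} lakh)")
```

Even on this conservative estimate, a single database-wide search would flag nearly seven lakh innocent residents, which is the concern behind the "lakhs of residents under scrutiny" claim.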

Secondly, not only has the phrase “national security” not been defined in the Act (or in any other legal text for that matter), it would be the Executive’s sole prerogative to determine whether a situation qualifies for the exception. In both these situations, the individual whose information is actually at stake need not be consulted before her information is disclosed. These two exceptions are couched so broadly, that it is almost farcical to say that personal information will be used only for the authentication of one’s identity.

The Act contains broad exceptions to how personal information can be used and does not provide for any mechanism to opt-out or have one’s data deleted from the CIDR. In doing this, it diminishes one’s agency to consent, revoke consent and control how this information can be used. A society where individuals are unable to anticipate and predict the amount of information held about them and how it may be used is likely to have a chilling impact on democratic participation.

Dubious Collection and Storage of Personal Information

The issue of consent aside, organizational mechanisms in place to collect and store personal information of over a billion residents also give rise to multiple concerns. Prior to the passing of the Act, the UIDAI had outsourced the process for enrolment to various private entities which possessed the requisite know-how. Sensitive personal information such as biometrics has thus been captured, stored and retained by private companies using their own technology and without any oversight by government officials. In 2014, news reports of ‘Lord Hanuman’ having an Aadhaar card surfaced. Vijay Madan, the then Director General of the UIDAI later explained in a statement that this was ‘not a security issue’ but an instance of ‘malpractice’. The agency was then reportedly removed from the UIDAI panel. The Act has not only given legislative sanction to the practice of private companies collecting personal information, but also does not present the affected individual with any recourse in case of a breach.

Once the data is transferred to the UIDAI, it is maintained by it in the form of the CIDR. The perils of centralized storage of personal information have long been acknowledged. Any unauthorized security breach could jeopardize the information of all residents at once. This is vastly different from a smart-card system or Apple’s Touch ID, which stores biometric information locally on the device. Under European data protection jurisprudence, storage is an important element to ascertain whether the means used are proportionate to the aim sought to be achieved by the law. If the purpose of the system is only to authenticate identity in order to plug leakages in the distribution system, the need for centralized storage must be questioned.

Aadhaar has the potential to irreversibly alter the relationship between the government and the people. For the world's second most populous country, making the distribution system more efficient is an important goal to strive for. But in this case, the trade-off between privacy and efficiency is not only undesirable but also unnecessary. Finally, the manner in which the Act was passed and the government's submissions before the Supreme Court display a lack of good faith that only adds to the already long list of concerns associated with the project.


SC hears the Aadhaar #NotAMoneyBill Challenge

A Supreme Court bench of the Chief Justice and Justices R. Banumathi and UU Lalit took up a petition by Mr. Jairam Ramesh, Member of Parliament (Rajya Sabha) challenging the certification of the Aadhaar Act as a money bill by the Lok Sabha Speaker today.

Senior Advocates Mr. P. Chidambaram, Mr. Kapil Sibal and Mr. Mohan Parasaran represented the petitioner and the Attorney General and Additional Solicitor General Ms. Pinky Anand represented the Government.

Mr. Chidambaram stated that the Aadhaar Bill is not a money bill, as it does not meet the criteria laid down in Article 110 of the Constitution. The bench inquired whether the question of certification is open to judicial review. Mr. Chidambaram stated that it is the petitioner’s stand that it is open for review, whereas the AG stated that it was not open for judicial review.

The AG also raised an objection to the petitioner filing the petition under Article 32 of the Constitution. Mr. Chidambaram stated that the rule of law is a fundamental right, and if that is violated by Parliament, a cause of action arises. He also stated that there are judgments of the Court which hold that if a substantial question of constitutional law arises in a particular case, a person can approach the Court under Article 32.

The AG reiterated his objection on locus, and the Chief Justice asked if the AG was saying that the rule of law is not a fundamental right. The AG stated that the rule of law is a fundamental right; however, the definition of the rule of law is too broad, and if cases are admitted for violation of the rule of law, there will not be any difference between the remedies provided by Articles 32 and 226 of the Constitution. He stated that a matter relating to the seniority of a person may involve a question of the rule of law, but the person cannot approach the court under Article 32.

Mr. Chidambaram stated that equating the current case to a case of seniority would be making a caricature of the argument. He stated that both the houses of the Parliament have equal status and power and in this case the decision of the presiding officer of one house has deprived the other house of its powers.

He added that in the present case the presiding officer of the Lok Sabha violated the basic rule of law and this is too grave a matter to be rejected on the argument of locus. He stated that there is a clear violation of Article 14 among others, which contains the rule of law.

On the question of judicial review, the AG stated that if a bill is certified by the Speaker as a money bill, that decision cannot be examined. In response, Mr. Chidambaram argued that immunity extends only to matters of procedural irregularity and not to illegality, citing the Raja Ram Pal case of 2007. (For a detailed analysis of why the Supreme Court has the power to judicially review the Speaker’s decision to classify the Aadhaar Bill as a Money Bill, please see: Aadhaar Act as a Money Bill – Judicial Review of Speaker’s Determination Concerning Money Bills)

The Court asked both the parties to submit a list of relevant cases and listed the matter for 20th July.

Aadhaar Act as a Money Bill — Judicial Review of Speaker’s Determination Concerning Money Bills

By Dr. Anup Surendranath

The primary hurdle in determining the constitutional validity of introducing the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, (hereinafter, “the Aadhaar Act”) as a money bill is whether courts can judicially review the decision of the Lok Sabha Speaker that it was a money bill. If the Supreme Court does not have the power to judicially review the decision of the Speaker in this context, then it is irrelevant whether the Aadhaar Act is in substantive compliance with Article 110. Without such a power for the Supreme Court, the Speaker’s determination that the Bill before her was a Money Bill would be final and binding without any scope for legitimate judicial review.

This note, however, argues that the Supreme Court has the power to judicially review the Speaker’s decision in this regard, and that it is essential to lay down that foundation before the court examines substantive compliance with Article 110 (non-compliance with the terms of Article 110(1) is a relatively easier case to make).

The entire note can be accessed here: http://ccgdelhi.org/doc/(CCG-NLU)%20Aadhaar%20Money%20Bill.pdf

The New Data Protection Regulation and its Impact on India

Written By Joshita Pai

The European Parliament adopted the new Rules on Data Protection on the 14th of April, 2016. The new Regulation replaces the Data Protection Directive of 1995 and the 2008 framework decision on cross-border data processing in police and judicial cooperation within the EU. In January 2012, the EU Commission first presented a package of proposals to update and modernize the existing EU legal framework, on which the Council reached agreement in December 2015. The new data protection package consists of a general regulation on personal data processing in the EU and a directive on data processed by the police and judicial authorities.

Highlights of the Regulation

The Regulation establishes a stronger regime for the protection of personal data by giving more control to users in the digital market. It enshrines provisions on the much-awaited right to be forgotten in the virtual space,[i] provisions on the need for clear and affirmative consent, and the right of an individual to be informed. Profiling of an individual by collecting a person’s data is often presented in the name of customized service and the commercial interest of the company. The new Regulation allows for a right to object to profiling unless it is necessary for law enforcement purposes or for scientific research. The Regulation also envisages provisions on data portability, which will enable users to shift from one service provider to another without losing the data accumulated in the use of the former. Aside from vesting a bundle of rights in the hands of users, the Regulation makes way for an array of provisions for companies to abide by. The crucial provisions affecting business companies include:

  1. Sanctions of up to 4% of annual turnover on companies that breach data protection rules: This provision in the Regulation holds heavy bearing since its application extends to companies established outside the European Union. Organisations will additionally be required to carry out data protection impact assessments where their plans to process personal data are “likely to result in a high risk for the rights and freedoms of individuals”.
  2. Provision for appointing a data protection officer if the company engages in processing of sensitive data: For businesses whose “core activities” consist of processing operations that “by virtue of their nature, their scope and/or their purposes, require regular and systematic monitoring of data subjects on a large scale”, or which involve processing sensitive data on a large scale, the new Regulation mandates the appointment of a DPO.
  3. The introduction of the new one-stop-shop concept in the Regulation: The Regulation provides that businesses will engage with a single lead supervisory authority, instead of a separate authority in each member state. The ‘one-stop-shop’ will streamline cooperation between the data protection authorities on issues with implications for all of Europe.

The Impact of the new EU Regulation on India

The cross-border flow of data from the EU states to other nations has been contentious, visibly so after the Schrems decision, which invalidated the EU-US Safe Harbour framework. The decision called for a new set of guidelines, which resulted in the creation of the EU-US Privacy Shield.

The EU framework of 1995, as well as the enhanced edition in the Regulation, prescribes a mandatory adequacy decision to determine whether the country in question adequately protects personal data. The new Regulation dedicates a chapter to the transfer of personal data to third countries, and India’s interest in the Regulation lies here. It provides that:

“A transfer of personal data to a third country or an international organization may take place where the Commission has decided that the third country, or a territory or one or more specified sectors within that third country, or the international organization in question ensures an adequate level of protection.”

The European Commission in 2015 produced a report on Data Protection in India to assess the measures and standards adopted for the protection of data in India. The report highlighted the lacunae in Indian laws pertaining to personal data. According to a recent survey by NASSCOM-DSCI, there is an opportunity loss of USD 2.0 billion – 2.5 billion owing to data transfer related issues. The report notes that EU clients are hesitant to offshore work to Indian companies because of the dearth of data protection standards in India. With particular regard to data protection, institutionalizing a regulatory regime in India has become a herculean task, with no comprehensive legislation on data protection in force. Statutory attempts to this effect have either remained scattered across various enactments or have not been effectively implemented so far. The penalty of up to 4% of annual turnover of a company on account of a data breach is one of the outstanding features of the new Regulation, and pitching this against the backdrop of a fragmented data protection regime in India indicates a host of repercussions.

Joshita Pai was a Fellow at the Centre for Communication Governance (2015-2016)

[i] ‘The right to be forgotten’ gained prominence as a concept after a Spanish national sued Google Spain and a Spanish newspaper for retaining information about him that had been published several years earlier.