Guest Post: The Data Producer’s Right and its Limitations

This guest post was authored by Ishita Khanna.

Information privacy theorists have argued that data is a 'quasi currency' in the age of information technology. While the economic value of data incentivizes companies to collect it, big data also has the potential to increase productivity, improve governance and thereby benefit consumers and citizens. Analysing data can yield significant value in the form of cost savings, streamlined processes, a better understanding of behaviour, and highly tailored products.

In this data-driven economy, the data generated and collected by machines and human beings possesses tremendous value. Machine-generated data is data created through the use of computer applications, processes and services, or through sensors that process information received from software, equipment or machinery, whether real or virtual. The automobile industry offers an interesting example. Sensors installed on cars generate data used for traffic prediction, location-based searches, safety warnings, autonomous driving, and entertainment services. Analysing this data can yield monetizable insights, such as findings that inform vehicle design or that can be licensed to the insurance industry. In this way, data precedes information, which precedes knowledge, which precedes understanding. Raw data is thus crucial to the generation of value.

It was in response to this phenomenon that the European Commission in 2017 proposed the creation of a 'data producer's right' (DPR): a novel property right in data that would protect anonymized, non-personal, machine-generated industrial data against the world. It would confer on the data producer 'a right to use and authorize the use of machine generated data'. Another dominant reason behind the call for a novel property right in data is the fear that American companies are misappropriating valuable European assets. Google's use of European news led to the initiative for a neighbouring right for news publishers in the EU, which furthered the call for a data producer's right; similarly, the introduction of the sui generis database right in Europe in 1996 was born out of the fear that the US database industry would dominate European markets.

This post critically examines the background, stated aims, subject matter and scope of the data producer's right. It studies the inter-relationship between the existing intellectual property regimes and the proposed property right in data to analyse if and how the new right would affect those regimes. It concludes with recommendations on alternative models that could be adopted for the protection of non-personal data.

Do we need a new IP right for machine generated data?

It has been contended that the existing IPR regimes, as well as civil law, contract law and trade secret protection, do not offer the requisite protection to machine-generated non-personal data, since they do not create an ex-ante right in rem; raw data would therefore not be protected from misappropriation by third parties, and a market for licensing data would not emerge. Copyright law protects only acts of authorship or compilations of data that result from creative selection or arrangement, and the sui generis database right extends only to data structured in a database. Hence the argument for introducing a right in machine-generated raw data.

The EU DPR was envisaged as a novel type of intellectual property right and as a means to an end: making data accessible. Building new property fences, however, seems paradoxical to the idea of increasing access to data. The answer lies in recognising that 'property is an institution for organising the use of resources in society'. The stable legal entitlements that come with a property right incentivize the development of a valuable resource by consolidating both risks and benefits in right-holders, and also stimulate the use and trade of data. The DPR was conceived as a right in rem, i.e. 'enforceable against the world independent of contractual relations', including an exclusive right of the data producer to use certain types of data and license their usage, thus embodying the essential features of ownership of property. The hope was that infusing machine-generated non-personal data with property rights would lead to the creation of a stable and safe licensing marketplace for such data.

Why is data such a challenging subject for IP Law?

However, the DPR would overlap extensively with copyright and the sui generis database right in productions made using digital machines, which could give rise to numerous competing ownership claims. For instance, the aggregate stock market data in a financial database would be protected by both the data producer's right and the sui generis database right. Further, the DPR could trump the statutory limitations laid down under the existing IPR regimes and the database right, thus limiting their scope of protection. At present, users in the European Union are allowed to copy data from databases for the purpose of non-commercial research. The DPR would infringe on such freedoms unless it incorporates all the relevant exceptions.

Another objection to a property right in data lies in its inherent lack of legal certainty and stability with respect to its scope, subject matter and ownership, features essential for it to be considered a full-fledged IP right enforceable against the world. A property right in data would also severely infringe the freedom of expression and information by curtailing access to data for text and data mining by scientists, research institutions and journalists. This freedom is acknowledged in Article 13 of the EU Charter, which stresses the free flow of data in the arts, scientific research and academic freedom.

Thus, a data producer's right would encroach upon a central tenet of the IPR system, which regards data as 'free air for common use' and offers protection only to creations and inventions. The dynamic and fluid nature of raw data makes it difficult to classify as the subject matter of a full-fledged intellectual property right. A similar objection was raised against the database right; however, the definition of 'database' and the requirement of a certain threshold of investment created at least some stability in the scope and subject matter of that right, unlike the DPR.

It is also important to understand why the property logic for data protection failed. One reason is the lack of success of the closely analogous sui generis database right in promoting investment in, and incentivizing the creation of, databases in the EU database industry. Another is the general inclination towards opening up data, i.e. making it accessible for both commercial and non-commercial re-use, thus doing away with exclusivity. Hence, there currently exist no potent economic justifications for creating a DPR. Instead, data producers can protect their data through contract law, trade secret law and technological protection mechanisms.

Thus, a novel IP right should be introduced only after thorough, evidence-based economic research establishes a real need for it, not spontaneously. Even that alone will not suffice: it must be accompanied by a methodical legal analysis of the scope and subject matter of the new right and of its inter-relationship with the existing IPR regime.

Examining ‘Deemed Consent’ for Credit-Scoring under India’s Draft Data Protection Law

By Shobhit Shukla

On November 18, 2022, the Ministry of Electronics and Information Technology released India's draft data protection law, the Digital Personal Data Protection Bill, 2022 ('Bill').* The Bill sets out certain situations in which seeking an individual's consent for the processing of their personal data is "impracticable or inadvisable due to pressing concerns". In such situations, the individual's consent is deemed to have been given; further, they need not be notified of such processing. One such situation is processing in the 'public interest'. The Bill illustrates certain public-interest purposes and notably includes 'credit-scoring' among them, in Clause 8(8)(d). Put simply, the Bill allows an individual's personal data to be processed non-consensually, and without any notice to them, for credit-scoring.

Evolution of credit-scoring in India

Credit-scoring is a process by which a lender (or its agent) assesses an individual's creditworthiness, i.e. their notional capacity to repay prospective debt, represented by a numerical credit score. Until recently, lenders in India relied largely on credit scores generated by credit information companies ('CICs'), licensed by the Reserve Bank of India ('RBI') under the Credit Information Companies (Regulation) Act, 2005 ('CIC Act'). CICs collect and process 'credit information', as defined under the CIC Act, to generate such scores. For an individual, such information chiefly comprises details of their outstanding loans and their history of repayments and defaults. However, with the expansion of digital footprints and advancements in automated processing, the range of datasets deployed to generate credit scores has expanded significantly. Lenders increasingly use credit scores generated algorithmically by third-party service-providers. Such agents aggregate and process a wide variety of alternative datasets relating to an individual alongside credit information – these may include the individual's employment history, social media activity, and web browsing history. This allows them to build a highly data-intensive credit profile of (and assign a more granular credit score to) the individual, to assist lenders in deciding whether to extend credit. This enables lenders not only to make notionally better-informed decisions, but also to assess and extend credit to individuals with meagre or no prior access to formal credit.
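
To make these mechanics concrete, the following is a purely illustrative Python sketch of how an algorithmic scorer might fold alternative data-streams into a single numerical score alongside conventional credit information. Every field name, weight and formula here is a hypothetical assumption made for exposition; it does not describe any actual CIC or third-party scoring model.

```python
# Purely illustrative sketch: how alternative data-streams might be folded
# into a single credit score. All field names, weights and the formula
# itself are hypothetical; no real scoring model is being described.

def illustrative_credit_score(profile: dict) -> int:
    """Combine conventional credit information with 'alternative' signals
    into a notional score on a 300-900 scale (the range used by Indian CICs)."""
    # Conventional credit information (CIC Act-style data)
    repayment_rate = profile.get("on_time_repayment_rate", 0.0)   # 0.0-1.0
    utilisation = profile.get("credit_utilisation", 1.0)          # 0.0-1.0

    # Hypothetical alternative data-streams of the kind described above
    employment_years = profile.get("employment_years", 0)
    social_signal = profile.get("social_media_signal", 0.5)       # 0.0-1.0
    browsing_signal = profile.get("browsing_signal", 0.5)         # 0.0-1.0

    # Arbitrary weighted combination, rescaled to 300-900
    raw = (0.45 * repayment_rate
           + 0.20 * (1 - utilisation)
           + 0.15 * min(employment_years / 10, 1.0)
           + 0.10 * social_signal
           + 0.10 * browsing_signal)
    return round(300 + 600 * raw)


if __name__ == "__main__":
    borrower = {
        "on_time_repayment_rate": 0.9,
        "credit_utilisation": 0.4,
        "employment_years": 6,
        "social_media_signal": 0.7,
        "browsing_signal": 0.6,
    }
    print(illustrative_credit_score(borrower))  # prints 747 for this profile
```

Even this toy model makes visible how many unrelated data-streams can feed into the final number, which is precisely the transparency concern that a blanket exemption from notice-and-consent aggravates.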

While neither the Bill nor its explanatory note explain why credit-scoring constitutes a public-interest ground for non-consensual processing, it may be viewed as an attempt to remove the procedural burden associated with notice-and-consent. In the context of credit-scoring, if lenders (or their agents) are required to provide notice and seek consent at each instance to process the numerous streams of an individual’s personal data, the procedural costs may disincentivise them from accessing certain data-streams. Consequently, with limited data to assess credit-risk, lenders may adopt a risk-averse approach and avoid extending credit to certain sections of individuals. Alternatively, they may decide to extend credit despite the supposed inadequacy of personal data, thereby exposing themselves to higher risk of repayment defaults. While the former approach would be inimical to financial inclusion, the latter could possibly result in accumulation of bad loans on lenders’ balance sheets. Thus, encouraging data-intensive credit-scoring (for better-informed credit-decisions and/or for widening access to credit) may conceivably be viewed as a legitimate public interest.

However, in this post, I contend that even if this were to be accepted, a complete exemption from notice-and-consent for credit-scoring poses a disproportionate risk to individuals' rights to privacy and data protection. The efficacy of notice-and-consent in enhancing informational autonomy remains debatable; however, a complete exemption from the requirement, without any accompanying safeguards, ignores specific concerns associated with credit-scoring.

Deemed consent for credit-scoring: Understanding the risks

First, the provision allows non-consensual processing of all forms of personal data, regardless of whether such data bears any correlation with creditworthiness. In effect, this would encourage lenders to leverage the widest possible range of personal datasets. As research has demonstrated, the deployment of disparate datasets increases the incidence of inaccuracy as well as of spurious connections between data-inputs and the output. In credit-scoring, the historical data on which the underlying algorithm is trained may suggest, for instance, that borrowers from a certain social background are likelier to default on repayment. Credit scores generated from such fallacious and/or unverifiable conclusions can embed systemic disadvantages into future credit-decisions and deepen the exclusion of vulnerable groups. The exemption from notice-and-consent would only increase the likelihood of such exclusion, because individuals would have no knowledge of the data-inputs used or of the algorithm that processed them, and consequently no recourse against credit-decisions arrived at through such processing.
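
The concern about spurious or structurally biased correlations can be illustrated with a small, entirely synthetic simulation (hypothetical variables, no real dataset): where historical records reflect harsher lending outcomes for one group, any scorer trained on those records will treat group membership itself as predictive of default.

```python
# Toy simulation with synthetic data and hypothetical variables; it is not
# drawn from any real credit-scoring system.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical group attribute (e.g. a neighbourhood or social background).
group = rng.integers(0, 2, n)          # 0 or 1

# Latent repayment risk is identically distributed across both groups...
risk = rng.normal(0.0, 1.0, n)

# ...but historical lending was harsher on group 1 (smaller loans, worse
# terms), so recorded defaults for group 1 are inflated in the data.
default = (risk + 0.8 * group + rng.normal(0.0, 1.0, n)) > 0.7

# A scorer trained on these records will 'learn' that group 1 defaults more,
# even though the group variable has no causal bearing on repayment ability.
for g in (0, 1):
    rate = default[group == g].mean()
    print(f"group {g}: observed default rate = {rate:.2f}")
```

Running the sketch shows an observed default rate of roughly 0.31 for group 0 against roughly 0.53 for group 1, despite identical underlying repayment ability; a scorer trained on such records would accordingly penalise group membership itself, and under Clause 8(8)(d) the affected individuals would never even learn that such a variable was used.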

Second, the provision allows any entity to non-consensually process personal data for credit-scoring. Notably, CICs are specifically licensed by the RBI to, inter alia, undertake credit-scoring. Additionally, in November 2021, the RBI amended the Credit Information Companies Regulations, 2006, to provide an avenue for entities (other than CICs) to register with any CIC, subject to the fulfilment of certain eligibility criteria, and to consequently access and process credit information for lenders. By allowing any entity to process personal data (including credit information) for credit-scoring, the Bill appears to undercut the RBI’s attempt to limit the processing of credit information to entities under its purview.

Third, the provision allows non-consensual processing of personal data for credit-scoring at any instance. A plain reading suggests that such processing may be undertaken even before the individual has expressed any intention to avail credit. Effectively, this would give entities free rein to pre-emptively mine troves of an individual's personal data. Such data could then be processed to profile the individual and behaviourally target them with customised advertisements for credit products. Clearly, such targeted advertising, without any intimation to the individual and without any opt-out, would militate against the individual's right to informational self-determination. Further, as an RBI-constituted Working Group has noted, targeted advertising of credit products can promote irresponsible borrowing and lead individuals into debt traps. At scale, predatory lending enabled by targeted advertisements could perpetuate unsustainable credit and pose concerns for economic stability.

Alternatives for stronger privacy-protection in credit-scoring

The above arguments demonstrate that a complete exemption from notice-and-consent for the processing of personal data for credit-scoring threatens individual rights disproportionately. Moreover, the exemption may undermine the very objectives that policymakers are attempting to fulfil through it. Thus, Clause 8(8)(d) of the Bill requires serious reconsideration.

First, I contend that Clause 8(8)(d) should simply be deleted before the Bill is enacted into law. In view of the CIC Act, CICs and other entities authorised by the RBI under that Act would, notwithstanding the deletion of the provision, continue to be able to access and process credit information relating to individuals without their consent – such processing would remain subject to the safeguards contained in the CIC Act, including the individual's right to obtain a copy of such credit information from the lender.

Alternatively, the provision may be modified to limit the exemption from notice-and-consent to certain forms of personal data. Such data could be limited to 'credit information' (as defined under the CIC Act) or 'financial data' (as may be defined in the Bill before its enactment); the processing of such data for credit-scoring would then not require compliance with notice-and-consent. Non-consensual processing of these forms of data (as opposed to all personal data), which carry logically intuitive correlations with creditworthiness, would arguably correspond more closely to the individual's reasonable expectations in the context of credit-scoring. An appropriate delineation of this nature would provide transparency in processing and also minimise the scope for fallacious and/or discriminatory correlations between data-inputs and creditworthiness.

Finally, as a third alternative, Clause 8(8)(d) may be modified to empower a specialised regulatory authority to notify credit-scoring as a purpose for non-consensual processing of data, but within certain limitations. Such limitations could relate to the processing of certain forms of personal data (as suggested above) and/or to certain kinds of entities specifically authorised to undertake such processing. This position would resemble proposals under previous versions of India’s draft data protection law, i.e. the Personal Data Protection Bill, 2019 and the Personal Data Protection Bill, 2018 – both draft legislations required any exemption from notice-and-consent to be notified by regulations. Further, such notification was required to be preceded by a consideration of, inter alia, individuals’ reasonable expectations in the context of the processing. In addition to this balancing exercise, the Bill may be modified to require the regulatory authority to consult with the RBI, before notifying any exemption for credit-scoring. Such consultation would facilitate harmonisation between data protection law and sectoral regulation surrounding financial data.

*For our complete comments on the Digital Personal Data Protection Bill, 2022, please click here – https://bit.ly/3WBdzXg

Censoring the Critics: The Need to Balance the Right to Erasure and Freedom of Speech

Clause 13(2)(d) of the Digital Personal Data Protection Bill, 2022 ("DPDP Bill") provides for the right to erasure of personal data, i.e. "…any data about an individual who is identifiable by or in relation to such data". The clause states that a data principal has the right to erasure of personal data in accordance with applicable laws and as may be prescribed, and that such erasure shall take place after the data fiduciary receives a request for it. The precondition for erasure is that the personal data must no longer be necessary for the purpose for which it was processed, nor necessary for any legal purpose.

This is in many ways a salutary provision. Data principals should have control over their data which includes the right to correct and erase data. This is especially important since it protects individuals from the negative impacts of the widespread availability of personal data on the internet. In today’s digital age, it is easier than ever for personal data to be collected, shared, and used in ways that are harmful or damaging to individuals. The right to erasure aids in countering these negative impacts by giving individuals the power to control their own personal information, and to have it removed from the internet if they choose to do so.

However, this provision can negatively impact several other fundamental rights such as the freedom of speech and right to information, especially when it is abused by powerful figures to silence criticism. For example, if an investigative journalist were to write an article in which they bring to light a government official’s corrupt deeds, the said official would be able to request the data fiduciary to erase such data since they are identifiable by it or are related to it. 

This article will seek to address such concerns in two ways. First, it will delve into the safeguards that can be included in the text of Clause 13(2)(d) to ensure that there is an appropriate balance between free speech and privacy. Second, it will recommend that the arbiter of this balance should be an independent authority and not data fiduciaries. 

(1) Safeguards 

Clause 13(2)(d) is heavily tilted in favor of the privacy interests of the data principal. It does not require data fiduciaries to take into account any other considerations that might have a bearing on the data principal’s erasure request. In order to prevent privacy interests from undermining other rights, the clause should be amended to include various safeguards. 

In particular, the clause should require data fiduciaries to consider the free speech rights of other individuals who might be affected by an erasure request. As indicated earlier, journalists may find it difficult to publish critical commentary on powerful public figures if their work is subject to easy erasure. There are also artistic, literary and research purposes for which personal data might be used by other individuals. These are valid uses of personal data that should not be negated simply because of an erasure request. 

Data fiduciaries can also be made to consider the following factors through subordinate legislation to harmonize free speech and privacy: (a) the role of the data principal in public life, (b) the sensitivity of the personal data sought to be erased, (c) purpose of processing, (d) public nature of data and (e) relevance of the personal data to the public. Incorporating such safeguards will help ensure that data fiduciaries appropriately balance the right to privacy and the right to speech when they receive erasure requests.

Further, a clearly laid out process for grievance redressal should also be codified. Currently, Clause 13(2)(d) does not provide for an appeal mechanism for erasure requests that have been rejected by data fiduciaries. The clause should explicitly provide that in case the data principal wants to contest the rejection of their erasure request, they can file a complaint with the Data Protection Board (DPB). 

(2) Independent Authority 

In addition to lacking sufficient safeguards, Clause 13(2)(d) puts the onus on data fiduciaries to decide the validity of erasure requests. Various jurisdictions, including the United Kingdom and Spain along with other European Union member states, use this framework. However, giving decision-making power directly to data fiduciaries will have a chilling effect on speech.

This is because they will tend to mechanically comply with erasure requests in order to escape liability for non-compliance. Data fiduciaries lack the bandwidth needed to properly assess the validity of erasure claims. They are for the most part private businesses with no obligation or commitment to uphold the rights and freedoms of citizens, especially if doing so will entail the expenditure of significant resources.

Consequently, there is a need for a different framework. Clause 13(2)(d) should be amended to provide for the creation of an independent authority which will decide the validity of erasure requests. Such a body should be staffed with free speech and privacy experts who have the incentive and the capability to balance competing privacy and speech considerations. 

Conclusion 

The discussion above shows that the right to erasure provision of the Digital Personal Data Protection Bill, 2022 fails to strike a sound balance between privacy and free speech. To achieve such a balance, Clause 13(2)(d) should be amended to incorporate various safeguards. Furthermore, an independent authority, not data fiduciaries, should decide the validity of erasure requests.

Comparative tracker for the Digital Personal Data Protection Bill, 2022

The Digital Personal Data Protection Bill, 2022 (“2022 Bill”) was released by the Ministry of Electronics and Information Technology on November 18, 2022, with the stated intent of being concise, comprehensible, and simplified for the citizens. For these reasons, the 2022 Bill has made significant changes to the framework of the earlier Personal Data Protection Bill, 2019 (“2019 Bill”), which was withdrawn earlier this August during the Monsoon session of the Parliament.

We have prepared this detailed tracker to record the changes made in the 2022 Bill, and compared the differences in the key provisions of the 2022 Bill and the 2019 Bill. This tracker can be a helpful reference while analysing the two Bills, or even a quick guide to the changes brought out in the 2022 Bill. 

This tracker has used the 2019 Bill as reference for the changes, as this was the last version of the Data Protection Bill which was introduced before the Parliament as a comprehensive legislation. We have analysed each clause and sub-clause of the 2022 Bill and compared it to the corresponding provisions of the 2019 Bill. We have provided the full text of the provisions (highlighting the differences) as well as a brief summary of changes under the 2022 Bill. Readers may use the 2022 Bill as the base, when looking for the changes made to specific provisions of the 2019 Bill.

As the public and expert analyses and opinions on the 2022 Bill are still being developed, we invite comments on any errors or omissions of corresponding provisions which may be present in this tracker.

The tracker can be accessed here.

Please note: a detailed list of the removed provisions from the 2019 Bill will be added to the tracker in the next few days.

(Compiled by Tejaswita Kharel and Vignesh Shanmugam)

(Update January 24, 2023: We have included in the tracker a detailed list of provisions removed from the 2022 Bill along with our comments on the same)

CCG’s Comments to the Ministry of Electronics and Information Technology on the Draft National Data Governance Framework Policy

Authors: Joanne D’Cunha and Bilal Mohamed

On 26th May 2022, the Ministry of Electronics and Information Technology (MeitY), released the Draft National Data Governance Framework Policy (NDG Policy) for feedback and public comments. CCG submitted its comments on the NDG Policy, highlighting its feedback and key concerns with the proposed Data Governance Framework. The comments were authored by Joanne D’Cunha and Bilal Mohamed, and reviewed and edited by Jhalak M. Kakkar and Shashank Mohan.

The draft National Data Governance Framework Policy is a successor to the draft ‘India Data Accessibility and Use’ Policy, which was circulated in February 2022 for public comments and feedback. Among other objectives, the NDG policy aims to “enhance access, quality, and use of data to enable a data-led governance” and “catalyze AI and Data led research and start-up ecosystem”.

“Mountain” by Mariah Jochai is licensed under CC BY 4.0

CCG’s comments to the MeitY are divided into five parts – 

In Part I of the comments, we foreground our concerns by emphasising the need for comprehensive data protection legislation to safeguard citizens from potential privacy risks before implementing a policy around non-personal data governance.

In Part II, we focus on the NDG Policy's objectives, scope, and key terminology. We highlight that the NDG Policy does not sufficiently define key terms and phrases such as non-personal data, anonymisation, data usage rights, Open Data Portal, Chief Data Officers (CDOs), datasets ecosystem, and ownership of data. Clear definitions would bring much needed clarity, help stakeholders appreciate the objectives and implications of the policy, and enhance engagement in the consultation process from stakeholders, including the various government departments. We also highlight that the policy does not illustrate how it will intersect and interact with other proposed data governance frameworks such as the Data Protection Bill, 2021 and the Non-Personal Data Governance Framework. Further, we express our concerns around the NDG Policy's objective of cataloguing datasets to increase the processing and sharing of data with the aim of deploying AI more efficiently. The policy relies on creating a repository of data to further analytics and AI- and data-led research; however, it does not consider that increasing access to data may not be beneficial if the computational capacity of the relevant technologies is inadequate. It may therefore be more useful to focus on developing computing capabilities rather than on increasing the quantum of data used.

In Part III, we focus on the privacy risks, highlighting concerns around the development and formulation of anonymisation standards given the threat of re-identification from the linkage of different datasets. This, we argue, can pose significant risks to individual privacy, especially in the absence of a data protection legislation that can provide safeguards and recognise individual rights over personal data. In addition to individual privacy harms, we also point to the potential for collective harms from using aggregated data. To this end, we suggest the creation of frameworks that can keep up with the increased risks of reidentification posed by new and emerging technologies.

Part IV of our comments explores the institutional framework and regulatory structure of the proposed India Data Management Office (IDMO), which is responsible for framing, managing, reviewing, and revising the NDG Policy. Key concerns about the IDMO's functioning pertain to the exclusion of technical experts and representatives of civil society and industry from its composition. There is also ambiguity around the technical expertise required of the Chief Data Officers heading the Data Management Units of government departments and ministries, and around the implementation of the redressal mechanism. In this section, we also highlight the need for a framework within the Policy to define how user charges for data access will be determined. This is particularly relevant to ensure that access to datasets is not skewed and is available to all for the public good.

You can read our full submission to the ministry here.

Cybersecurity and Trade: Understanding Linkages for the Global South

Sukanya Thapliyal*

  1. BACKGROUND

Cybersecurity concerns are increasingly creeping into the international trade arena. Emerging technologies such as Big Data, Artificial Intelligence (AI) and the Internet of Things (IoT), among others, have led to the digitalisation of the economy and society and have transformed our day-to-day lives. The COVID-19 pandemic has further accelerated this digitalisation. As a result, countries, businesses and individuals worldwide are embracing the shift and becoming increasingly reliant on digital technologies. The digital economy has contributed significantly to the growth of services trade, reduced trade costs, and increased participation of micro, small and medium enterprises (MSMEs) in international trade. The shift towards the digital economy has also enabled enterprises to amass and analyse massive amounts of data, which helps businesses and organisations improve their operations and develop better products and services for existing and prospective consumers.

However, the ensuing interconnectivity and reliance on digital technologies expose societies and economies to several risks. These include cyberattacks such as ransomware, political espionage, economic espionage, identity theft, and intellectual property theft. Such threats affect national defence authorities, critical infrastructure, commercial enterprises, and enforcement agencies alike, and can emanate from both State and non-State actors. However, countries vary greatly in their ability to understand and address these challenges. A recent study by Kaspersky Labs identified Asia-Pacific (APAC) countries as among the most prominent targets of cyberattacks, owing to their rapidly increasing use of digital technologies coupled with limited awareness of cybersecurity and limited resources deployed towards mitigation. India features among the top five countries most prone to cyberattacks, along with China and Pakistan.

This piece seeks to map the dominant discourse on cybersecurity and international trade. It first examines the current World Trade Organization (WTO) framework and select Free Trade Agreements (FTAs) to show how cybersecurity concerns are presently understood only as matters of national security or as potential non-tariff barriers (NTBs). Since cybersecurity is inextricably linked to a Member State's technical capacity to identify vulnerabilities, it then argues that there is an urgent need to reframe cybersecurity as an issue within the capacity building and technology transfer discussions.

image by geralt. Licensed via CC0.
  2. CYBERSECURITY ISSUES UNDER THE WORLD TRADE ORGANIZATION (WTO)

Despite rising cybersecurity concerns, international trade rules engage only minimally with this area. Prominent international trade bodies such as the WTO, and legal instruments like Free Trade Agreements (FTAs), have primarily focused on setting rules for digital commerce and have addressed cybersecurity as an incidental, secondary issue. Within the WTO's existing framework, cybersecurity issues do not fall within a single set of rules [1]. Depending on the context and subject of a dispute, several WTO Agreements, including the General Agreement on Tariffs and Trade (GATT), the General Agreement on Trade in Services (GATS), the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) and the Agreement on Technical Barriers to Trade (TBT Agreement), can have some bearing on its outcome. As a result, emerging cybersecurity issues can only be understood and interpreted on a case-by-case basis [2].

Currently, countries impose cybersecurity measures ranging from complete prohibitions on trade in goods or services, to tariff and non-tariff barriers, certification requirements and domestic standards, among others. Although none of these measures has so far been challenged before the WTO's Dispute Settlement System, the European Union, the USA, Canada, Japan and Australia raised concerns in 2017 against China's imposition of cybersecurity measures on ICT products and services. In another instance, China raised concerns over Australia banning Chinese companies from supplying equipment for a 5G mobile network on national security grounds.

Propelled by such developments, in which Member States imposed different types of cybersecurity measures (prohibitions on trade in technology goods, certification requirements and domestic standards), the discourse on cybersecurity and trade has focused primarily on cybersecurity measures as potential non-tariff barriers. Since the WTO's principal concern is strengthening economic cooperation and reducing or eliminating trade barriers, both tariff and non-tariff, the discussion has centred on these concerns alone. Numerous studies have identified the need to distinguish genuine domestic cybersecurity policy measures taken by Member States from those that are merely disguised protectionism or purely political in nature.

Scholars also highlighted that Member States might justify such actions based on national security exceptions articulated under the GATT (Article XXI), GATS (Article XIV bis), TRIPS (Article 73) and other WTO Agreements. The national security exception, as broadly understood, allows Member States to take measures as they consider necessary for the protection of their essential security interests. This is problematic from several perspectives. 

The security exception was long regarded as a self-judging provision outside the purview of judicial review by the Dispute Settlement Body (DSB). This understanding was substantially modified, in the context of GATT's security exception, by the WTO Panel Report in Russia – Traffic in Transit in 2019. The Panel opined that Article XXI(b) is not wholly self-judging and that the term "essential security interests" is restricted to specific scenarios relating to military facilities, nuclear facilities and measures taken in time of "war" or "other emergency in international relations". The Panel further emphasised that such a measure must be invoked in "good faith". While the Russia – Traffic in Transit Panel Report does provide a straightforward interpretation of the scope of the provision, several scholars, including Sarah Alturki and Neha Mishra, have found the security exceptions laid down under the GATT and GATS problematic for addressing cybersecurity measures. They maintain that the existing security exceptions under the WTO framework are dated and were not conceived to cover cyber conflicts. Although the DSB may read such provisions in an evolutionary manner, the ambiguous nature of cyber-threats, coupled with the lack of international consensus on cybersecurity governance, makes it extremely challenging to resolve cybersecurity-related disputes.

  3. CYBERSECURITY PROVISIONS UNDER FREE TRADE AGREEMENTS (FTAs)

Besides the security exceptions under the WTO framework, some Free Trade Agreements contain, in their digital trade/e-commerce chapters, dedicated provisions concerning inter-State cooperation on cybersecurity. For instance, Article 14.16 of the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP) recognises the importance of capacity building and of collaborative mechanisms to identify and mitigate malicious intrusions or the dissemination of malicious code affecting the electronic networks of the parties to the Agreement. Article 12.13 of the Regional Comprehensive Economic Partnership (RCEP) features an identical provision. Further, Article 19.15 of the United States-Mexico-Canada Agreement (USMCA) features an expanded version of this provision, obliging the parties to share information and best practices and to employ risk-based approaches relying on consensus-based standards to detect, respond to, and recover from cybersecurity events.

To contain the misuse of cybersecurity measures that can harm free trade and economic cooperation among participating countries, several FTAs include provisions designed to deter such behaviour. These include prohibitions on requiring the disclosure of source code [3], prohibitions on requiring computing facilities to be located in a specific jurisdiction [4], and provisions mandating the cross-border transfer of information by electronic means [5]. Measures relating to source code disclosure, mandated location of computing facilities and the like often find themselves in the crossfire of concerns emanating from economic development, transparency and cybersecurity.

It is also important to note that these provisions target policies restraining the free flow of cross-border data (data-localisation policies), which are prevalent in a number of countries including India, China and Vietnam.

  4. OTHER POSSIBLE FRONTIERS FOR CYBERSECURITY AND INTERNATIONAL TRADE IN RESPECT OF THE GLOBAL SOUTH

Beyond the above-mentioned concerns, cybersecurity is also a question of the technical competence and resources available to many developing and least-developed countries. Several studies and reports, including the recent Kaspersky projections for 2022, indicate a wide gap in countries' ability to detect, assess and effectively respond to cyberattacks. The adoption of digital tools has risen steeply, often outpacing the establishment of the state institutions, legal regulations and capacity needed to manage new challenges. Digital solutions are seen as the gateway to economic growth and social development, but these developments should not be viewed in isolation from cybersecurity capacity building. The unbridled adoption of digital solutions, without adequate security, can have far-reaching implications for the economy and can result in weak infrastructure and hollow digital development for countries in the Global South.

As mentioned above, the current provisions under FTAs and the discussions at the WTO on cybersecurity in international trade extend only to sharing information and best practices. Such glaring vulnerabilities can only be addressed through development assistance that includes technology transfers and cybersecurity capacity building, and this requires active cooperation from developed countries. Discussions around digital development must be embedded in digital security. Developing countries, including India, should leverage their positions in economic forums and constructively channel the discussions around technology transfer and technology facilitation mechanisms (TFM) towards cybersecurity, as they have done in the past in the context of drug development and climate change. The existing tools for developing and least-developed countries under Articles 66 and 67 of the TRIPS Agreement are insufficient, have seen weak implementation, and are unlikely to bridge this gap. As India assumes the G20 presidency on December 1, 2022, it can lead the way for such momentous changes and offer the Global South perspective the world needs.


*The author is grateful for the comments and contributions by Ms Garima Prakash, Deputy Manager, NASSCOM.

References:

  1. It is important to note that the WTO Agreements, dating back to 1994, did not address cyber issues specifically, but their rules nevertheless apply to cyber-related policies. See: Kathleen Claussen, 'Economic Cybersecurity Law' in Routledge Handbook of International Cybersecurity, pp. 341-353 (Routledge, 2020). See also: Dongchul Kwak, 'No More Strategical Neutrality on Technological Neutrality: Technological Neutrality as a Bridge Between the Analogue Trading Regime and Digital Trade', World Trade Review (2021), 1–15.
  2. Post-2017, around 70 WTO Member States, spearheaded by the USA and other developed countries, have initiated "exploratory work together towards future WTO negotiations on trade-related aspects of electronic commerce." India and South Africa are not part of this initiative. Nevertheless, the outcome of these discussions will have some bearing on the future of cybersecurity and trade.
  3.  Article 19.16 of USMCA (Similar provisions are incorporated under other trade agreements including CPTPP and RCEP).
  4. Article 19.12 of USMCA. (Similar provisions are incorporated under other trade agreements including CPTPP and RCEP).
  5. Article 19.11 of USMCA. (Similar provisions are incorporated under other trade agreements including CPTPP and RCEP).

Technology & National Security Reflection Series Paper 13: Flipping the Narrative on Data Localisation and National Security

Romit Kohli*

About the Author: The author is a fifth year student of the B.A. LL.B. (Hons.) programme at the National Law University, Delhi.

Editor’s Note: This post is part of the Reflection Series showcasing exceptional student essays from CCG-NLUD’s Seminar Course on Technology & National Security Law. This post was written in Summer, 2021. Therefore, it does not reflect recent policy developments in the field of data governance and data protection such as the December 2021 publication of the Joint Parliamentary Committee Report and its proposed Data Protection Bill, 2021.

I. Introduction

Countries all over the world are seeking to preserve and strengthen their cyber-sovereignty in various ways. One popular mechanism is captured by the nebulous phrase 'data localisation', which refers to requirements imposed by countries that necessitate the physical storage of data within their national boundaries. The degree of localisation, however, varies across jurisdictions. At one end of the spectrum is 'controlled localisation', which favours the free flow of data across borders subject to only mild restrictions; a prominent example is the European Union's ("EU") General Data Protection Regulation (GDPR). At the other end of the spectrum, jurisdictions like China impose much stricter localisation requirements on businesses operating within their national boundaries.

In India, data localisation has become a significant policy issue over the last few years. Various government documents have urged lawmakers to introduce a robust framework for data localisation. The seminal policy document in this regard is the Justice BN Srikrishna Committee report, which provided the basis for the Personal Data Protection Bill, 2019. This bill proposed a framework that would result in a significant, economy-wide shift in India's data localisation practices. At the same time, various government departments have sought to implement sector-specific data localisation requirements with varying degrees of success.

This blog post argues that, far from facilitating national security, data localisation measures may present new threats to national security in their implementation. We seek to establish this in three steps. First, we analyse the link between India's national security concerns and the associated objectives of data localisation; this analysis demonstrates that the mainstream narrative on the link between national security and data localisation is inherently flawed. Second, we discuss the impact of data localisation on the objective of economic growth, arguing that India's localisation mandate fails to consider certain unintended consequences that restrict the growth of the Indian economy. Lastly, we argue that this adverse impact on economic growth itself poses a threat to India's national security, which requires us to adopt a more holistic outlook on what constitutes national security.

Image by World Bank Photo Collection’s Photostream. Copyrighted under CC BY 2.0.

II. The Mainstream Narrative

The Srikrishna Committee report underscores national security concerns as a basis for two distinct policy objectives supporting the introduction of data localisation measures. First, the report refers to the need for law enforcement agencies to have access to data which is held and controlled by data fiduciaries, stating that such access is essential for ‘… effectively [securing] national security and public safety…’ since it facilitates the detection of crime and the process of evidence gathering in general (Emphasis Added). However, experts argue that such an approach is ‘… unlikely to help India achieve objectives that actually require access to data’. Instead, the government’s objectives would be better-served by resorting to light-touch localisation requirements, such as mandating the storage of local copies of data in India while still allowing the data to be processed globally. They propose complementing these domestic measures with negotiations towards bilateral and multilateral frameworks for cross-border access to data.

Second, the report states that the prevention of foreign surveillance is 'critical to India's national security interests' due to the lack of democratic oversight that can be exercised over such surveillance (Emphasis Added). However, we believe that data localisation fails as an effective policy measure to address this problem: notwithstanding the requirements imposed by localisation policies, foreign governments can access locally stored data through extra-territorial means, including the use of malware and the assistance of domestic entities. What is required is a more nuanced and well-thought-out solution that leverages the power of sophisticated data security tools.

The above analysis demonstrates that the objectives linked to national security in India’s data localisation policy can be better served through other means. Accordingly, the mainstream narrative which seeks to paint data localisation as a method of preserving national security in the sense of cyber or data security is flawed. 

III. The (Unintended) Impact on the Indian Economy

The Srikrishna Committee Report ostensibly refers to the ‘… positive impact of server localisation on creation of digital infrastructure and digital industry’. Although there is no disputing the impact of the digital economy on the growth of various industries generally, the report ignores the fact that such growth has been fuelled by the free flow of cross-border data. Further, the Srikrishna Committee Report fails to consider the costs imposed by mandatory data localisation requirements on businesses which will be forced to forgo the liberty of storing their data in the most cost-effective way possible. These costs will be shifted onto unsuspecting Indian consumers. 

The results of three seminal studies help illustrate the potential impact of data localisation on the Indian economy. The first study, which aimed at quantifying the loss that data localisation might cause to the economy, found that mandatory localisation requirements would reduce India's GDP by almost 1% and that '… any gains stemming from data localisation are too small to outweigh losses in terms of welfare and output in the general economy'. A second study examined the impact of data localisation on individual businesses and found that, owing to a lack of data centres in India, such requirements would impose a 30-60% increase in operating costs on businesses forced to store their data on local servers. The last study analysed the sector-specific impact of localisation, quantifying the loss in total factor productivity at approximately 1.35% for the communications sector, 0.5% for the business services sector, and 0.2% for the financial sector. More recent articles have also examined the prejudicial impact of data localisation on Indian start-ups, the Indian IT sector, the cyber vulnerability of small and medium enterprises, and India's Ease of Doing Business ranking.

At this point, it also becomes important to address a common argument relied upon by proponents of data localisation, which is the fact that localisation boosts local employment, particularly for the computer hardware and software industries. Although attractive on a prima facie level, this argument has been rebutted by researchers on two grounds. First, while localisation might lead to the creation of more data centres in India, the majority of the capital goods needed for such creation will nonetheless be imported from foreign suppliers. Second, while the construction of these centres might generate employment for construction workers at a preliminary stage, their actual functioning will fail to generate substantial employment due to the nature of skilled work involved. 

The primary lesson to be drawn from this analysis is that data localisation will adversely impact the growth of the Indian economy—a lesson that seems to have been ignored by the Srikrishna Committee report. Further, when discussing the impact of data localisation on economic growth in India, the report makes no reference to national security. We believe that this compartmentalisation of economic growth and national security as unrelated notions reflects an inherently myopic view of the latter. 

IV. Towards a Novel Narrative

National security is a relative concept—it means different things to different people in different jurisdictions and socio-economic contexts. At the same time, a noticeable trend vis-à-vis this relative concept is that various countries have started incorporating the non-traditional factor of economic growth in their conceptions of national security. This is because the economy and national security are inextricably linked, with several interconnections and feedback loops. 

Although the Indian government has made no explicit declaration in this regard, academic commentary has sought to characterise India’s economic slowdown as a national security concern in the past. We believe that this characterisation is accurate since India is a relatively low-income country and therefore, its national security strategy will necessarily depend upon the state of its economy. Further, although there have been objections surrounding a dismal defence-to-GDP ratio in India, it is believed that these objections are based on ‘trivial arithmetic’. This is because the more appropriate way of remedying the current situation is by concentrating policy efforts on increasing India’s GDP and accelerating economic growth, rather than lamenting low spends on defence. 

This goal, however, requires an upgradation of India’s national security architecture. While the nuances of this reform fall outside the precise scope of this blog post, any comprehensive reform will necessarily require a change in how Indian policymakers view the notion of national security. These policymakers must realise that economic growth underpins our national security concerns and consequently, it is a factor which must not be neglected.

This notion of national security must be used by Indian policymakers to examine the economic viability of introducing any new law, including the localisation mandate. When seen through this broader lens, it becomes clear that the adverse economic impact of data localisation policies will harm India’s national security by inter alia increasing the costs of doing business in India, reducing the GDP, and prejudicing the interests of Indian start-ups and the booming Indian IT sector. 

V. Conclusion

This blog post has attempted to present the link between data localisation and national security in a different light. This has been done by bringing the oft-ignored consequences of data localisation on the Indian economy to the forefront of academic debate. At the center of the article’s analysis lies an appeal to Indian policymakers to examine the notion of national security through a wider lens and consequently rethink their flawed approach of addressing national security concerns through a localisation mandate. This, in turn, will ensure sustained economic growth and provide India with the technological advantage it necessarily requires for preserving its national interests.  


*Views expressed in the blog are personal and should not be attributed to the institution.