Thursday, February 22, 2018

Towards a data protection framework for India

by Vrinda Bhandari, Amba Kak, Smriti Parsheera, Faiza Rahman and Renuka Sane.


The Supreme Court's seminal judgment in the Puttaswamy case recognised privacy to be a fundamental right, rooted in individual autonomy and dignity. It also laid down the normative grounding for a data protection law in India. The Justice Srikrishna Committee constituted by the Government is now faced with the formidable task of drafting a blueprint for India's first comprehensive, cross-sectoral data protection law. The Committee released a White Paper in November 2017, which covered impressive ground in mapping the key issues and the international landscape on data protection. The Committee also presented its provisional views on issues ranging from jurisdiction to the specific rights of individuals vis-a-vis data controllers and the enforcement mechanism that would make these rights justiciable.

In this post, we discuss some of our key recommendations in response to the Committee's White Paper. We focus here on the following three issues.

  • A principles-based primary law: The cross-sectoral nature of the proposed data protection law necessarily merits a principles-based approach for identifying the rights and protections that are suitable for the needs of a range of different stakeholders. The focus of the primary law should therefore be on identifying appropriate principles of data protection, the key areas where specific regulations need to be framed and the bounds within which those powers should be exercised.
  • Regulatory structure and governance: The usefulness of the data protection law will depend largely on its effective implementation. This requires the creation of an appropriate regulatory structure with well-defined legal processes. In terms of design, we propose the creation of two new agencies -- a cross-sectoral data protection authority to discharge regulatory and supervisory functions and a redress agency to adjudicate individual complaints under the data protection law.
  • Data protection obligations of state agencies: The state is uniquely positioned to access personal data from a variety of sources and use it in ways that can radically alter the relationship between the citizen and the state. Any exceptions and exclusions from the data protection law, including on legitimate grounds such as national security, must therefore be crafted carefully and with appropriate procedural safeguards.

A principles-based primary law

The scope of the data protection law must extend to all sectors and all entities that collect and process user data, whether in the public or private sector. That said, a one-size-fits-all model also seems ill-suited, given the variations in the nature and uses of different types of data in the hands of different categories of data controllers and the potential harms that could result from them. To balance these requirements, we recommend that the primary law be drafted in a principles-based manner, while the nuances of specific data protection requirements applicable in each context are built up more gradually. This can be achieved through context-specific and sector-specific subordinate legislation.

This "principles-based" approach needs to be distinguished from a "rules-based" one, which would typically contain prescriptive details about the specific requirements to be followed by different persons. To take a simple example, a principles-based approach towards disclosures may state that an entity "should make appropriate disclosures that enable a person to make informed decisions". On the other hand, a rules-based approach may set out not only the exact text of the disclosures, but also their periodicity, font and design. We use the example of the "consent" requirement to explain this idea in the context of a data protection law.

The principle of collection limitation demands that there must be a legal basis for the collection of information, and we think consent should be a primary ground for data collection. The idea of consent, or the collection and use of an individual's personal data with their approval, flows from the principle of personal autonomy and is a core element of the right to privacy. However, it is also well recognised that the consent model suffers from many challenges, including problems of consent fatigue, information asymmetry and bounded rationality of users. Yet, we argue that instead of using these reasons as grounds for abandoning consent as a basis of data collection, the law should mandate data controllers to find means to overcome these hurdles. Accordingly, the consent principle could state that "the collection of personal data is subject to the consent of the individual and such consent must be obtained in an informed and meaningful manner". It would then be upon the data controllers to develop and adopt appropriate standards that can overcome the hurdles generally associated with the consent model and for the Data Protection Authority to supplement such efforts through its regulations. Thus, if data controllers were to rely on consent to legitimise their data collection activity, they would have to demonstrate fulfilment of this principle.

Such a principle would logically translate into a requirement of clear and simple notice, which could have different meanings in different contexts. It would also mean that certain types of conduct, for instance consent obtained through coercion or on a "take it or leave it" basis, are likely to be seen as violations of the above principle. Most importantly, having such a principle would nudge data controllers towards the use of privacy enhancing technologies, such as consent dashboards and privacy chatbots like PriBot. It would also compel them to evolve practices that take into account the literacy and language contexts of India.

Apart from consent, however, there should also be scope for other grounds to legitimise data collection. As noted by the Committee, grounds such as compliance with legal requirements and legitimate purposes of data controllers may also be included. Here too, the law should contain appropriate principles to prevent the misuse of the legitimate purpose criterion, for instance by requiring a direct nexus between the data being collected and the purpose sought to be achieved through it.

Regulatory structure and governance

While considering various regulatory and enforcement models, the Committee posits a 'co-regulatory approach' as an ideal middle ground between self-regulation and a 'command and control' regime. This is described as a system where the government and industry share the responsibility of drafting and enforcing regulatory standards -- the industry frames codes of conduct, which are then approved by the government. We submit that rather than treating the industry as a separate site for the framing of standards, an element of 'co-regulation' should be built into the statutory framework itself -- in the form of an open and participative regulation-making process. Collaborative regulation-making has particular value in an evolving and technical field like data protection, which is also driven by strong economic interests.

For the alternative of the government endorsing industry codes to be meaningful, and not simply self-serving for the regulated entities, the regulator would eventually bear a similarly high burden of supervision and monitoring. We therefore suggest that embedding collaboration in the process of rule-making, rather than effectively conceding that process to regulated entities, is a better approach. Within this framework, each data controller would of course have some flexibility in adhering to the principles laid down under the law or regulations, while remaining liable for penalties and redress for inadequate compliance.

In terms of regulatory design, we propose the establishment of two new statutory agencies:

  1. A Data Protection Authority (DPA) that will function as a cross-sectoral data protection regulator. It will be responsible for drafting regulations, assessing compliance by regulated entities and initiating enforcement actions against them.
  2. A Data Protection Redress Authority (DPRA) that will be responsible for adjudicating individual complaints and affording appropriate remedies to individuals.

This draws from a similar recommendation made by the Financial Sector Legislative Reforms Commission (FSLRC) in terms of separating regulatory and redress functions in the context of the financial sector. A key objective behind this design is to allow the DPA and the redress agency to focus exclusively on their core functions. This becomes particularly important in light of the principles-based nature of the proposed law. Given the large number of data collectors in the system and the individuals interacting with them, it would be unrealistic to expect the DPA to effectively discharge its regulatory and supervisory functions while also taking up the responsibility of addressing individual complaints. Staffing and financial constraints will inevitably cause one of the functions to suffer.

Another ground for the proposed separation stems from the need to avoid potential conflicts of interest. A large number of complaints on a particular issue would suggest that data controllers are not acting in compliance with the legal principles, but it may also indicate a failure on the part of the DPA to take appropriate regulatory or supervisory actions to curb such malpractices. It is therefore important that the resolution of any complaints should take place independent of the other core functions of the regulator. There should however be a strong feedback loop between the redress agency and the DPA for transmission of information about the kinds of complaints being received, the entities to which they relate and the underlying causes. This will enable the DPA to address such issues through appropriate amendments to its regulations or by initiating enforcement actions.

Our response to the White Paper also contains a number of other recommendations about the design and functioning of the proposed agencies, many of which draw from the recommendations of FSLRC and the draft Indian Financial Code. This includes suggestions relating to the need for a sound selection process for members of the DPA; separation of powers within the DPA; emphasis on a transparent regulation-making process (taking into account the expected costs and benefits of proposed measures); and the need for an independent appellate mechanism.

Finally, the law must also facilitate ways to maximise synergies between the DPA and existing sectoral regulators. This interaction becomes especially important in the short run given that it may take some time for the DPA to build capacity and an accompanying body of regulations for different categories of stakeholders. Sectoral regulators could therefore take the lead in framing appropriate standards for their regulated entities, in accordance with the principles under the data protection law and in consultation with the DPA. To facilitate such interactions, we recommend that the law should mandate the creation of cooperation mechanisms between the agencies. This may include consultation on the framing of regulations applicable to entities in a particular sector; making a reference to the other agency while initiating supervisory actions against a regulated entity; and a requirement to enter into an MoU to mutually agree on the exact procedures for this coordination.

Data protection obligations of state agencies

While the daily interactions between users and commercial platforms such as Facebook and Google undeniably lead to many important concerns, the interactions between personal data and the state must be viewed with even greater care. This is due to the distinct nature and magnitude of state power. The state is uniquely positioned to access the data collected both by itself and other private sources, and use it in ways that can radically alter the relationship between the citizen and the state. Data can then easily become a tool for surveillance, intimidation, coercion, and harassment, and the data protection law should be cognizant of such concerns.

The chapter on exemptions in the White Paper focuses on the types of activities that may be exempted from data protection principles, including a section on national security. However, it does not highlight the manner in which the exemption would translate into practice, and merely relies on the Puttaswamy decision's indication of national security as a legitimate aim.

While the status of national security as a legitimate aim remains fairly uncontested, we propose that the requirements of necessity and proportionality laid down by the Supreme Court in Puttaswamy need to be embedded in the law while creating such exemptions. The provisions on surveillance and national security also need to take into account development of technology that enables low-cost, mass surveillance, reducing the need to rely on physical and human resources (as noted by Sotomayor J. in United States v Jones). The mechanisms to be considered in this context may include judicial review or parliamentary oversight; other forms of systematic review of executive actions; defined time limits; and clear provisions for appeal. We realise that such principles would also have to be included in allied laws such as the Aadhaar Act and the Indian Telegraph Act.

The regular processing of data by state agencies also raises interesting questions about the appropriate liability and enforcement regime for any breach of the law. We submit that the penalties and compensation requirements under the data protection law should apply equally to public and private entities. However, certain distinguishing factors (such as the source and extent of finances) must be taken into account by the implementing authorities. For instance, the law in the United Kingdom gives the Information Commissioner the ability to impose a civil monetary penalty of up to GBP 500,000 on a data controller, whether a private or public body. The exact amount of the penalty is, however, determined by a number of factors, including the impact on the entity being penalised and its ability to pay.

Way forward

The White Paper is the beginning of an important conversation around data protection in the context of state and non-state actors in India. However, in attempting to cover such a comprehensive topic, the White Paper does not fully explain its provisional views on some of the important and complex issues it addresses. The analysis is further limited by the absence of a draft Bill, without which the nuances of these issues cannot be fully understood.

To facilitate an informed debate, the Justice Srikrishna Committee has already taken an important first step in terms of organising consultations in major cities. It is now imperative that the Committee publish the responses received, so as to take the conversation forward. Even more importantly, the Committee must hold similar, multiple rounds of consultation after it releases a second White Paper with its final views, along with a copy of a draft data protection law for India. It is only when such a draft is opened to the public for comments and consultation that we will be able to achieve a truly holistic and comprehensive data protection law.


Vrinda Bhandari is a practicing advocate in Delhi. Amba Kak is a Mozilla Technology Policy Fellow. Smriti Parsheera, Renuka Sane and Faiza Rahman are researchers at the National Institute of Public Finance & Policy.

CCI's order against Google: infant steps or a coming-of-age moment?

by Smriti Parsheera.

The Competition Commission of India (CCI) recently concluded its six-year-long investigation into allegations of abuse of dominance by Google in India. It found that Google had utilised its dominance in general web search services to limit user choice (specifically in the context of Google flight search) and impose restrictions on its search syndication partners. It also noted a third violation relating to Google's historic practice of setting fixed positions for a particular category of search results (referred to as "universal results") that were sourced from its other verticals like images, videos and maps. A few months earlier, the European Commission had levied a penalty of EUR 2.42 billion on Google for offering preferential treatment to its comparison shopping service and demoting rival services in its search results.

The consequences for Google in India? CCI has directed Google to (i) desist from assigning fixed positions to universal results; (ii) add a disclaimer while presenting its commercial flight results; (iii) not enforce unreasonable restrictions on syndication partners; and (iv) pay a penalty of Rs. 1.36 billion (USD 21.1 million) for its anti-competitive conduct. To put this in perspective, the penalty translates to less than 0.02% of USD 110.9 billion, Google's worldwide revenues in 2017.
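The scale comparison above can be verified with a quick back-of-the-envelope calculation. The sketch below is purely illustrative and uses only the figures quoted in this post (the implied Rs/USD conversion comes from the Rs. 1.36 billion / USD 21.1 million pair stated above); it is not part of CCI's order.

```python
# Figures as quoted in the post (assumptions: penalty already converted to USD).
penalty_usd = 21.1e6    # CCI penalty: Rs. 1.36 billion, approx. USD 21.1 million
revenue_usd = 110.9e9   # Google's reported worldwide revenues in 2017

share_pct = penalty_usd / revenue_usd * 100
print(f"Penalty as a share of worldwide revenue: {share_pct:.3f}%")  # about 0.019%

# Consistent with the post's claim that the penalty is under 0.02% of revenue.
assert share_pct < 0.02
```

The point of the arithmetic is simply that, measured against Google's global top line, the Indian penalty is roughly two orders of magnitude smaller than the European Commission's EUR 2.42 billion fine.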

Are these the uncertain first steps of an infant CCI venturing into technology-ville or the coming-of-age moment of a regulator that has learnt to balance innovation and competition in the digital era? This post attempts to answer this question by tracing CCI's analysis on issues of universal results, Google flights and restrictive agreements, highlighting some concerns and summarising the takeaways from this decision.

Allegations against Google

CCI's order arises from two separate cases filed by an online matchmaking portal and the Consumer Unity & Trust Society against Google Inc. and Google India Private Limited. In the course of the investigation, Google Ireland Limited, a key contracting entity for Google's advertising agreements, was also added as a party.

The informants alleged an abuse of dominance by Google in the general web search and search advertising markets in India, with the following specific claims.

  • Search bias: Google was using its dominance in the search engine business to promote its own results like videos (YouTube), news (Google News) and maps (Google Maps) in its search results.
  • Unfair advertising terms: Google has the largest number of search users which makes it an unavoidable partner for all advertisers who want to target their ads at those users. The informants alleged that Google was using this position to impose unfair and discriminatory conditions on its AdWords customers (advertisers who bid on keywords for ads to be displayed in Google's search results).
  • Denial of access: Google was using its dominance in the search and search advertising markets to impose unfair conditions that restricted its partners from contracting with other competing search engines. As a result, it was denying access to the market to competing businesses.

As per the scheme of the Competition Act, 2002 (Act), the complaints were first examined by the CCI to assess their prima facie merit. Finding a prima facie case of abuse of dominance by Google, CCI referred the matter to its investigative arm, the office of the Director General (DG), for detailed investigation. The DG's office took about three years to complete its assessment and submitted an investigation report to CCI in March 2015. It found Google to be guilty on all the counts raised by the informants, in addition to a few others it discovered during the investigation process.

CCI's reasoning and verdict

An abuse of dominance case under Section 4 of the Act requires CCI to establish that the entity in question held a dominant position in a defined relevant market and had abused that dominance through activities like imposing unfair or discriminatory conditions on others, limiting the supply of goods or services in the market, or using its dominance in one market to protect its position in another.

Relevant market analysis

CCI agreed with the findings of the DG that Google was operating in the following relevant markets: (a) market for online general web search services in India and (b) market for online search advertising services in India.

Google contested both these market definitions, arguing instead for a broader relevant market -- the wider the market, the lower the chances of Google being found dominant in it. Regarding general web search, Google claimed that it competes with all possible sources of information that can answer a user's specific query (about people, places, recipes, etc.) and that there is no separate market for "general web search". This, however, ignores the fact that there are billions of webpages on the Internet, a majority of which are not known to users. Users therefore commonly rely on search engines to identify new sources of information and even to access relatively well known ones. For instance, Alexa's web traffic analytics shows that 66.6 percent of the traffic to Wikipedia, the fifth most popular website in the world, comes through search engines.

Curiously, Google is also reported to have argued that "[b]ecause search is free, Google has no trading relationship with the users of its search service, and so the basis for establishing dominance is absent". CCI rejected this argument, noting that "it is not only flawed but altogether ignores the role of big data in the digital economy". In the multi-sided platform operated by Google, users offer their "eyeballs" and data in exchange for Google's "free" services, which are in turn monetised by Google through advertising revenues. To claim that only services that are directly paid for by users can constitute a relevant market would leave large parts of the digital ecosystem outside the purview of competition law, an outcome that is neither legally tenable nor socially desirable.

Assessing Google's dominance

The determination of dominant position depends on a number of factors, market share being one of them. CCI's order notes that Google has maintained a high market share in both the relevant markets but does not disclose the exact figures (these are blacked out as confidential information). Publicly available information from StatCounter, however, clearly shows that in the period since 2010, Google has consistently held over 96 percent of the market share among search engines in India.

CCI also looks at other factors beyond market share. On the search side, it refers to Google's head start in crawling and indexing the web and the resulting scale advantages. As explained by Matt Turck, Google benefits from significant "data network effects" -- "the more people search, the more data they provide, enabling Google to constantly refine and improve its core performance, as well as personalize the user experience". As we discuss in this paper on Competition Issues in India's Online Economy, multi-sided platforms like Google are also characterised by strong indirect network effects. CCI uses a similar logic to observe that Google's stronghold in general web search supplements its dominant position in the market for online search advertising, resulting in a situation where advertisers are left with little countervailing power over Google. In summary, CCI notes that this leads to a situation where "[t]he structure of the market is both indicative of and conducive to Google's dominance".

Abuse of dominance

Next, CCI turned to examine the specific allegations relating to the abuse of its dominant position by Google. This is also the point where the Commission significantly diverges from the findings of its investigation unit. Unlike the DG's report, which found Google to be in violation of the Act on almost all the grounds examined, CCI limits its findings to the following three grounds.

  1. Fixed positions for universal results:

    Findings: Google's search results pages often contain certain "universal results" that are sourced from its other search verticals (See Figure 1 for an example). As per Google, these universal results compete with other generic blue links for the most favourable position on the results page based on their relevance. However, it also admitted that when the service was initially introduced, the display of universal results was limited to certain fixed (1st, 4th or 10th) positions as its systems were not advanced enough to determine the relevant position for such results. CCI dismisses this argument to hold that Google's historic practice of adopting a fixed position for such results was unfair and misleading to its customers who were led to believe that the responses were being ranked solely on the basis of their relevance.

    Consequences: Since Google has already discontinued this practice, CCI limits itself to issuing a desist order directing Google not to resort to such position fixing in the future.

  Figure 1: Universal image results in response to search term "Kullu"

  2. Commercial unit for flight results

    Findings: Google also places various "commercial units" in its search results. This refers to a demarcated ad space for displaying sponsored results relating to shopping, hotels and flights. CCI focused in particular on Google's flight unit (See Figure 2 for an example). It noted that Google provides a prominent placement to its flights unit in general search results, with a link that takes the user to Google's own specialised flight search service. It found that this practice pushes down or pushes out other competing vertical search services, thereby misleading users and denying them the opportunity to access those other websites.

    Consequences: CCI directed Google to display a disclaimer in the commercial flight unit box indicating clearly that clicking on the relevant link would lead to Google's flights page and not the results of any other third party service provider.

  Figure 2: Google Flight Unit in response to search term "Delhi to Kullu flights"

  3. Restrictions in syndication agreements

    Findings: In addition to offering search and advertising services on its own website, Google also enters into syndication agreements to offer these services to other websites. These agreements can take the form of either standard online contracts or directly negotiated agreements. CCI found that Google imposes certain unreasonable restrictions on its negotiated search intermediation partners -- websites that enter into negotiated agreements to use Google's services on their websites are restricted from implementing any search technologies that are "same or substantially similar" to those of Google. CCI found that this restricts Google's partners from using the services of competing search engines. It thus "creates conditions for extending and preserving Google's dominance in search intermediation".

    Consequences: CCI directed Google not to enforce the restrictive clauses in its negotiated direct search intermediation agreements with Indian partners.

As noted above, the DG's investigation had found Google to be guilty on many other counts. This included questions about Google's conduct in relation to its AdWords customers; its practice of allowing bidding on trademarks owned by competitors; and its arrangements with distributors like Apple and Mozilla to make Google the default search engine in their products. CCI, however, disagreed with the DG's findings on all these other counts.

Some questions and concerns

CCI's decision raises many important issues regarding Google's conduct in the search and search advertising markets. The design of ranking algorithms to provide preferential positions to certain types of results (without sufficient disclosures) and the imposition of unfair restrictions in commercial contracts are certainly critical issues that can have far reaching implications for online competition in India. Yet, even while agreeing with the principles behind these ends, it is hard to ignore some issues with the means adopted by CCI to reach them. This section focuses on the lack of sufficient evidence-based analysis, the selective focus on flight search and CCI's own uncertainty about the calculation of the penalty, all of which are factors that could expose the order to subsequent scrutiny.

Absence of robust data and evidence

The first issue, which has also been emphasised at length by the two dissenting members of CCI, relates to the absence of robust data and evidence to support the findings against Google. While the discussions in the order are sufficient to develop a strong intuition about Google's anti-competitive conduct, this intuition should ideally have been followed through with supporting data to build a watertight case. Specifically in the context of flight search, the dissenting members point to the absence of actual data about the traffic flows to the Google flight unit or to competing websites, the positions at which these websites actually appear on Google's results page and the impact that this has on consumer behaviour.

We can contrast this with the European Commission's approach in its similar case against Google. In a press release issued in June, 2017, the Commission announced that its order against Google was supported by evidence from various sources, including "(i) significant quantities of real-world data including 5.2 Terabytes of actual search results from Google (around 1.7 billion search queries); and (ii) experiments and surveys, analysing in particular the impact of visibility in search results on consumer behaviour and click-through rates". Based on this evidence, it was able to gauge the precise effects of Google's prominent placement of its comparison shopping service.

  • The traffic to Google's comparison shopping service increased 45-fold in the United Kingdom, 35-fold in Germany, 19-fold in France, 29-fold in the Netherlands, 17-fold in Spain and 14-fold in Italy.
  • There was a sudden drop of traffic to certain rival websites, to the tune of 85% in the United Kingdom, up to 92% in Germany and 80% in France.

In the present case, it is clear that the dissenting members are not disagreeing with the merits of the case against Google but the absence of sufficient data to make those claims. That being the case, the logical course for the Commission would have been to direct further investigation by the DG on these specific grounds or pursue an inquiry on its own. Both these courses were open to the Commission under Section 26(7) of the Act but their adoption would of course have meant a further delay in an already long pending decision.

Questions about the flight search analysis

The next set of issues revolves around CCI's focus on Google flight search and its prominent display on the search results page as a ground for abuse of dominance. The order examines the impact of the prominent real estate given to Google's flight unit vis-a-vis third-party travel sites and its misleading impact on users. However, as noted here, Google's flight service (at least at present) only offers users the option of comparing flight prices and not of directly making bookings through Google. Therefore, the market in question is that of flight fare comparison websites, and it would accordingly have been relevant to consider the impact on competing fare aggregation sites like "skyscanner" and "farecompare" instead of only those that provide flight booking services. Such an analysis would also have enabled CCI to examine a potential violation of Section 4(2)(e) of the Act, which relates to the use of a dominant position in one market to protect a position in another.

Further, the order does not clarify why the flight search functionality is more problematic than similar commercial units displayed by Google in response to shopping or hotel related searches in India. In the case of shopping results, the Commission makes a passing remark that "Google's display of Shopping Unit may not per se affect the ranking of free search results". In the case of hotels, the order states that Google does not offer this feature in India, even though a search for hotels on Google's India site does display a commercial unit with features similar to its flight search function -- it leads to another Google page that contains advertisements from hotels and hotel booking websites.

Penalty imposed by CCI

Relying on the Supreme Court's decision in the Excel Crop Care case, CCI decided to limit its penalty to Google's "relevant turnover" from India. For this purpose, CCI sought information from Google regarding its revenues from different segments of its India operations, which the Commission notes was provided in an unsatisfactory manner. For instance, it is unclear from the order whether the information supplied by Google under the head "Relevant Turnover from Direct Sales in India" was based on (i) the income earned by all Google entities from end-users based in India; (ii) the income earned by Google India Private Limited from advertisers in India; or (iii) the income earned by Google Ireland, Singapore and others from Indian advertisers.

Unfortunately, CCI acknowledges these infirmities and still goes on to determine the final penalty amount based on the unclear information furnished by Google. Given the criticality of this point, it would have been appropriate for CCI to seek more specific information from Google and, if required, provide further time for the same. It could also have used its statutory powers to elicit this information.
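To make the stakes of this ambiguity concrete, here is a minimal sketch of how a turnover-linked penalty moves with the choice of revenue base. All figures are hypothetical and the 5% rate is used purely for illustration; neither is taken from the order.

```python
# Hypothetical illustration: how the choice of "relevant turnover" base
# changes a turnover-linked penalty. All figures (Rs. crore) are invented.
def penalty(avg_relevant_turnover, rate=0.05):
    """Penalty computed as a flat percentage of average relevant turnover."""
    return avg_relevant_turnover * rate

# Three possible readings of "Relevant Turnover from Direct Sales in India":
bases = {
    "all Google entities, India end-users": 9000.0,
    "Google India Pvt Ltd, Indian advertisers": 5000.0,
    "incl. Google Ireland/Singapore, Indian advertisers": 13000.0,
}
for label, base in bases.items():
    print(f"{label}: Rs. {penalty(base):,.0f} crore")
```

Even with a fixed rate, the penalty varies severalfold across the three readings, which is why the lack of clarity about the revenue base matters.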


The Google case is an important development in India's competition jurisprudence on abuse of dominance in multi-sided technology markets. CCI's order acknowledges at the outset that "intervention in technology markets has to be carefully crafted lest it stifles innovation". It also highlights CCI's intent to refrain from interfering in specific product design elements unless the conduct in question is particularly egregious and an intervention becomes necessary to correct certain distortions.

Despite its well-intentioned attempts to balance the interests of innovation, competition and consumer welfare, the decision falls short on some counts. Firstly, in an industry centred on click-through rates, analytics and ranking measurements, the order primarily relies on qualitative and descriptive accounts to establish Google's violations. Secondly, CCI chooses to intervene in certain product design elements (universal results, commercial units) but not in others (like the AdWords ranking mechanism and trademarks bidding policy). While doing so, it fails to offer any broader guidance on the basis for demarcating general product design elements (that could also negatively impact competition) from particularly egregious conduct that merits competition intervention.

Yet, irrespective of the fate of this particular decision, actions such as these serve an important function in taming the conduct of big tech -- the threat of external regulation creates an impetus for better "self" regulation. In the past, Google amended its AdWords terms to make it easier for advertisers to simultaneously manage advertising campaigns on competing ad platforms. This was done through voluntary commitments offered by Google in relation to an inquiry by the Federal Trade Commission. Similarly, a press release by the European Commission notes that in the context of its anti-trust proceedings, Google had modified its direct AdSense contracts (with websites that use Google's services to display ads on their pages) to give its partners more freedom to display competing search ads. Recent moves by Facebook and others to control fake news on their platforms are also grounded in similar concerns.

Finally, the case lays important groundwork for subsequent cases against Google and other dominant players in India's online ecosystem. As CCI's orders get challenged before the appellate tribunal and eventually the Supreme Court, we will see new jurisprudence around issues of competition in the digital economy. This will hopefully create a feedback loop for increased rigour and evidence-based analysis in future cases in this sector.


Smriti Parsheera is a technology policy researcher at the National Institute of Public Finance & Policy. She has previously worked as a researcher with the Competition Commission of India, including on the Google case. The views are personal.

Monday, February 19, 2018

Interesting readings

First, build a capable RBI by Ajay Shah in Business Standard, February 19, 2018.

Lessons from a fraud by Ila Patnaik in Indian Express, February 17, 2018.

Retrograde Gag Order from Stock Exchanges by Somasekhar Sundaresan on Wordpress, February 15, 2018.

Mr Bond makes his voice heard by Niranjan Rajadhyaksha in Mint, February 14, 2018.

Comment - Why BSE, NSE taking on the Singapore exchange is an exercise in futility by Shishir Asthana in Money Control, February 13, 2018.

Why Modi govt should worry about RSS' ability to mobilise by Nitin Pai in The Indian Express, February 13, 2018. Also see chapter 6 from On Tyranny: Twenty Lessons from the Twentieth Century by Timothy Snyder.

The Ranbaxy case and how it shows the system in India does work after all by Mihir Sharma in Business Standard, February 12, 2018.

Will LTCG+STT hollow out India's equity markets? by Mobis Philipose in Mint, February 9, 2018.

Tweaks and mandates: The wrong way to build a corporate debt market in Business Standard, February 7, 2018.

Forward, backward on trading derivatives in The Economic Times, February 6, 2018.

The Autocrat's Achilles' Heel by Alina Polyakova and Torrey Taussig in Foreign Affairs, February 2, 2018.

The Pillars That Hold India's Democracy Up Are Falling Apart by Meghnad S in BuzzFeed, November 9, 2017.

The Agency by Adrian Chen in The New York Times, June 2, 2015.

Ghost Wars ran till 9/10. The new book Directorate S carries the story through to 2016.

Friday, February 09, 2018

Hollowing out of India's financial markets: Banning trading abroad is not a choice

by Ajay Shah.

For a long time, there has been a realisation that India's policy mistakes on capital controls, financial regulation and taxation will induce a hollowing out of Indian financial markets. Here is an example from May 2012. The two most important products are Nifty and the rupee, and both are increasingly dominated by overseas activity. Non-residents have a clear choice about where to send their order flow, and locals are also known to evade capital controls and take their custom to more competitive venues. This week, there has been amplified concern about these problems, e.g. see Mobis Philipose in the Mint and my article in the Business Standard.

These developments are good for the real economy, as superior mechanisms of financial intermediation are displacing the inefficiencies of the onshore financial system. This reduces the cost of doing business for foreign investors.

But at the same time, we in India are losing massive financial service exports as the business is shifting out of India. On the rupee, the estimated loss of revenue for India is around Rs.60,000 crore per year. Similar values are likely to prevail for Nifty.

In many developing countries, the lack of macro/finance policy capabilities has led to a comprehensive hollowing out of domestic financial markets. This is the scenario that now confronts India.

The correct solution to this problem lies in going to the root cause: solving our mistakes of financial regulation, capital controls and taxation. This painstaking work has been analysed by the Standing Council on the International Competitiveness of the Indian financial sector, which was set up by the Department of Economic Affairs in June 2013 in recognition of this problem.

Men and nations will do the right thing after trying every reasonable alternative. What are these `reasonable' alternatives?

Ban the product

I remember a time when RBI requested the UAE central bank to force DGCX to not trade INR futures. Such a ban is not in the interests of either DGCX or the UAE, and this request was not accepted.

Block the participants

RBI has tried to tell international firms operating in India: do not trade in India-related financial markets overseas. But the jurisdiction of RBI is limited. There are concerns about non-rule-of-law methods of harming firms that do not obey. RBI and SEBI periodically try to ban PN trading.

These bans are futile, as India's regulators have no ability to enforce them. In any case, even if a ban is effective, all that will happen is that the business will move from firms that comply with India's grab for extra-territorial jurisdiction to firms that do not care about India's regulators.

Block the information products

Nifty is made by IISL, which is an India-domiciled information company.

IISL is not a financial firm and is not exposed to RBI or SEBI regulation. But perhaps non-rule-of-law techniques of coercion can be applied. Suppose this succeeds, and IISL does not license Nifty to SGX.

SGX has numerous alternatives. SGX can go to a mom-and-pop index provider who makes a Nifty-like index: an index where 49 of the 50 stocks are the same as those in Nifty. SGX can shift to the MSCI India index, and MSCI can gently move closer to the Nifty composition.

If, somehow, SGX is prevented from having an effective exchange-traded Nifty product, the business will just go OTC.


Let's not lose sight of what is going on. There is a trading venue that offers lower costs in investing/trading on Indian assets. We are discussing tools for protectionism through which the cost of participating in the Indian economy is driven up. This is not in India's interests.

Our course of action should lie in solving the Indian policy mistakes of capital controls, financial regulation and taxation.

A Pragmatic Approach to Data Protection

by Suyash Rai.

A Committee constituted by the Central Government is working on a data protection law. The White Paper published by the Committee suggests that the Committee is in favor of a comprehensive law, with a number of rights and protections. Many stakeholders seem to be demanding an even more comprehensive law. In this essay, I present an analysis of certain issues relating to the regulation of data protection, organised around the themes of: regulatory capacity; the economics of data protection regulation; jurisdiction-related issues; the rights-based approach to regulation; the need to distinguish between data protection and broader privacy concerns; the approach to data protection in government organisations; balancing regulatory clarity with flexibility to allow innovation; and regulatory governance and due process requirements. In my view, reflection on these issues will help create an effective law for data protection. Towards the end of this essay, certain specific suggestions on the legislative formulation are given. These suggestions, which flow from the analysis, are very tentative, because much more analysis is required.

Proposal for a data protection law

The Justice K. S. Puttaswamy (Retd.) and Anr. vs Union Of India And Ors. judgment on the right to privacy has brought privacy-related issues to the forefront of policymaking. The Order of the Court, inter alia, said, "The right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution."

Among the privacy-related issues, a full-fledged policy process has been initiated on data protection. In the Puttaswamy Judgment, one of the opinions, which was signed by four of the nine judges, enjoined upon the Central Government to "examine and put into place a robust regime for data protection", which strikes "a careful and sensitive balance between individual interests and legitimate concerns of the state". Some of the other opinions also made references to the issues of informational privacy.

A Committee of Experts, chaired by Justice BN Srikrishna, was constituted while the hearings on the case were underway. The Committee has been asked to study "issues relating to data protection in India," to suggest "principles to be considered for data protection in India," and to suggest "a draft data protection bill". Following are some of the key points discussed in the White Paper published by the Committee.

  1. What is to be protected: Data protection is about protecting personal data. The Committee is deliberating what constitutes personal data. Its provisional view is that data from which an individual is directly or indirectly identified or reasonably identifiable is personal data, and this includes any information, including opinions or assessments, irrespective of their accuracy. The Committee is also considering whether and how to define a sub-category of "sensitive personal data", to be given enhanced protection. Its provisional view is that health information, genetic information, religious beliefs and affiliations, sexual orientation, racial and ethnic origin, caste information, and financial information may be treated as sensitive personal data. Further, the White Paper says that, for other categories such as philosophical or political beliefs, an assessment may be made as to whether a person has an expectation of a high degree of privacy.
  2. Exemptions: The Committee has proposed wide exemptions for household purposes, journalistic/artistic purposes, and literary purposes. They have also proposed exemptions for: data processed for the purpose of academic research, statistics and historical purposes; information collected for investigation of a crime, and apprehension or prosecution of offenders; information collected for maintaining national security and public order. To ensure that exemptions are not misused, the Committee is considering safeguards, such as a review mechanism.
  3. Protections and Rights: The Committee's provisional view is that informed and meaningful consent for processing personal data is central to protecting personal data. Other than under clearly given exemptions, all other processing must follow consent. The Committee is considering what would constitute valid consent. The Committee has proposed certain preventive measures, such as data limitation (only processing the data required for a task), purpose limitation (data must be collected for a specified purpose and used only for that purpose), storage limitation (data to be erased after its purpose is met), and so on. The Committee is also considering certain individual rights. The first set of rights comprises: the right to seek confirmation, the right to access the data, and the right to rectify the data. Given that such rights impose costs, the Committee has proposed allowing fees to be charged for them. The second set of rights comprises: the right to data portability (data of an individual to be made available in a universally machine readable format or ported to another service provider); the right not to be subjected to a decision based solely on automated processing; and the right to object to processing for the purpose of direct marketing. The Committee also seems to endorse a right to be forgotten. The White Paper discusses a general right to object to processing, but provisionally finds it unsuitable for India.
  4. Jurisdiction: The Committee has floated three alternatives: a) Cover cases where processing wholly or partly happens in India irrespective of the status of the entity; b) Regulate entities which offer goods or services in India even though they may not have a presence in India; c) Regulate entities that carry on business (i.e. consistent and regular activity with the aim of profit) in India. The Committee has also raised issues of scope relating to applicability of the law to data relating to juristic persons such as companies, differential application of the law to the private and the public sector, and retrospective application of the law.
  5. Differential obligations: The Committee is also considering greater obligations for entities that create more risks. The additional obligations on such entities may include: registration, data protection impact assessments, data audits; and a designated Data Protection Officer.
  6. Institutional mechanism: The Committee has proposed a Data Protection Authority (DPA) for implementation of the law. It would set standards, monitor compliance, and take enforcement actions. It would also work towards generating awareness. The Committee seems to be in favor of a co-regulation model involving close industry participation. The Committee's tentative view is that accountability should not only be enforced for breach of data protection obligations, but also, in certain circumstances, it could be extended to hold data controllers liable for the harms that they cause to individuals even without violation of any other obligation.

    For redress, the Committee has proposed a multi-tier system, wherein an individual would first approach the data controller, and if the data controller fails to resolve the complaint, the individual may file a complaint with the data protection authority. DPA may also initiate action against a data controller on a suo motu basis. The Appellate Tribunal under the IT Act may be the appellate forum for any decision of DPA. DPA may be given the power to impose civil penalties as well as order the defaulting party to pay compensation up to a threshold. Appeals against an order granting or rejecting such compensation, and compensation claims above the threshold, may lie with the National Consumer Disputes Redressal Commission.
  7. Principles: The Committee has endorsed seven principles to underpin the law: a) Technology agnosticism (flexibility to take into account changing technologies and standards of compliance); b) Holistic application (cover both private sector entities and government, with differential obligations for legitimate State aims); c) Informed consent (consent must be informed and meaningful); d) Data minimisation (data that is processed ought to be minimal and necessary for the purposes for which it is sought and other compatible purposes beneficial for the data subject); e) Controller accountability (the data controller to be held accountable for any processing of data); f) Structured enforcement (enforcement by a high-powered statutory authority with sufficient capacity); g) Deterrent penalties (penalties to ensure deterrence).

This suggests that a comprehensive regulatory regime, with a wide range of protections enforced by a powerful regulator, is in the offing. Going by the minutes of the consultations held by the Committee and some of the submissions to the Committee that are publicly available, the thrust of a plurality of stakeholder comments is to further expand the scope of the proposed law. More and more rights are being recommended. More preventive measures are being proposed. And more powers are being demanded for the proposed regulator.

It should be obvious that enacting a data protection law by itself will not ensure data protection. A regulatory law works through the regulatory system established by it. Actual data protection outcomes will depend on how effectively and efficiently the regulatory system regulates the market processes. I would go so far as to argue that for good outcomes, implementation matters more than what the law promises. In this journey from law to outcomes, regulatory capacity is the key.

The law and regulatory capacity

The proposed DPA will make regulations, monitor compliance, and take enforcement actions. On building regulatory capacity in DPA, I would like to make three India-specific points: first, this regulator will come up in the backdrop of relatively low regulatory capacity in India, compared to countries presently implementing advanced data protection laws; second, given the nature of data protection regulation, the regulator will find it very difficult to build capacity; third, the mismatch between the capacity and the mandate of the regulator can create poor outcomes, such that giving a broad mandate may produce worse outcomes than giving it a narrow mandate.

Relatively low regulatory capacity in India

The following chart shows percentile ranks (0 = lowest; 100 = highest) on "regulatory quality" for India and the countries whose laws the Committee has most frequently cited in the White Paper. The ranking is from the Worldwide Governance Indicators (WGI) published by the World Bank. For "regulatory quality" in India, the data sources used were: Bertelsmann Transformation Index; Economist Intelligence Unit; Global Insight Business Conditions and Risk Indicators; Heritage Foundation Index of Economic Freedom; IFAD Rural Sector Performance Assessments; Institute for Management and Development World Competitiveness; Institutional Profiles Database; Political Risk Services International Country Risk Guide; World Economic Forum Global Competitiveness Report; World Justice Project.

Figure 1: Percentile rank on Regulatory Quality (Source: Worldwide Governance Indicators, World Bank)

India ranks much lower than the countries cited in the White Paper. Almost all these countries are close to the top rank (Singapore ranks number 1). Usually, on most indices of state capacity India ranks close to the median. Such rankings and indices are not precise, scientific measurements of capacity, but they are useful indicators of relative capacity. It is safe to say that regulatory capacity in India is much lower than that in other countries with advanced data protection laws. Why is this so? A variety of factors may determine State capacity in a country: organisation design and management; political system design; basis of legitimisation; and cultural and structural factors. Many of these factors are shaped by contingent social and political processes over long periods of time (see Francis Fukuyama's work on this: "State building: Governance and world order in the 21st century" and "The Origins of Political Order: From Prehuman Times to the French Revolution"). Further, many of the factors are not within the control of any one organisation.

The question, then, is: how should this fact of relatively low regulatory capacity inform the formulation of a data protection law? To answer this question, we need to first move from this general observation on regulatory capacity in India, to the specific nature of activities involved in data protection regulation, and the kind of capacity required to perform them. This may help us understand the specific challenges of building capacity in the proposed DPA.

Challenges of building regulatory capacity for data protection

Lant Pritchett and Michael Woolcock, in their paper "Solutions when the Solution is the Problem: Arraying the Disarray in Development", provide a framework to understand the challenges of capacity building. They analyse activities in terms of how discretionary (i.e. to what extent decisions will be made on the basis of information that is important but inherently imperfectly specified and incomplete) and transaction-intensive (i.e. the number of decisions required) they are. They find that it is most difficult to build real capacity for activities that are highly discretionary and transaction-intensive.

Many activities involved in data protection regulation score high on both dimensions: they will be highly discretionary and transaction-intensive. The level of discretion may vary from one activity to another. Here are a few examples:

  • Data breach is a kind of problem that may require relatively less discretion to identify when it occurs, as there is a limited space for disagreement on whether there was a breach. However, the same problem can require more discretion if "preventive" actions are to be specified to avoid breaches. Experts can have wide disagreements on the best ways to manage the risk of data breach in different contexts.
  • Regulating to ensure informed consent can require a considerable amount of discretion, because, to be implemented effectively, it will require complex assessments about whether the consent was truly informed and meaningful.
  • Data minimisation is ensuring that no more data should be processed than is required for a task. This may require complex assessments to be made about the data required for a given task. Even if the regulator relies on self-assessments or third party audits, these assessments and audits will still need to be evaluated in a variety of contexts. Such judgments require sophistication and knowledge. A wealth manager, for instance, often collects and processes a large amount of personal data. There will always be considerable discretion in assessing whether data minimisation is being achieved.
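As a toy illustration of why data minimisation assessments involve discretion, consider a naive field-level check. The purposes and field lists below are hypothetical, and the point is precisely that real assessments cannot be reduced to set arithmetic of this kind.

```python
# Naive data-minimisation check: flag fields collected beyond those needed
# for a declared purpose. Purposes and field lists are hypothetical.
REQUIRED_FIELDS = {
    "kyc": {"name", "address", "id_number"},
    "newsletter": {"email"},
}

def excess_fields(purpose, collected):
    """Return fields collected beyond what the declared purpose requires."""
    return collected - REQUIRED_FIELDS.get(purpose, set())

# A controller collecting phone and address for a newsletter would be flagged:
print(sorted(excess_fields("newsletter", {"email", "phone", "address"})))
```

A mechanical check like this presupposes that someone has already judged which fields a purpose "requires" -- and that judgment, made across thousands of controllers and contexts, is where the discretion lies.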

Transaction intensity in data protection regulation arises out of its monitoring and enforcement functions, which will require directly or indirectly monitoring numerous events across a large number of data controllers and processors in many sectors, and taking decisions about them. The transaction intensity is also shaped by a unique type of moral hazard problem seen in this domain. This problem, discussed in detail later in this note, arises out of the fact that "personal data" is not a finite resource to be protected. Users can, by sharing data and by creating more personal data through online activities, change the scale of the problem for the data protection regulator.

The combination of these two characteristics (highly discretionary and transaction-intensive) makes it more difficult to build capacity, because the task is neither one of appointing a few capable individuals exercising discretion (e.g., monetary policy) nor one of managing a large number of persons performing mechanised tasks (e.g., Aadhaar enrolment, immunisation). Discretion is easy to abuse, and it also means that mistakes are not immediately seen as mistakes. Transaction intensity poses the challenge of achieving good performance in a large number and variety of situations.
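The two-axis framing can be sketched as a small classification. The activities and their scores below are illustrative readings of the examples discussed in the text, not taken from the Pritchett-Woolcock paper.

```python
# Illustrative classification of regulatory activities on the two axes
# (discretionary, transaction-intensive). Scores are illustrative readings
# of the examples in the text, not from the Pritchett-Woolcock paper.
ACTIVITIES = {
    "identify that a data breach occurred": (False, True),
    "specify breach-prevention measures":   (True, False),
    "assess whether consent was informed":  (True, True),
    "evaluate data minimisation claims":    (True, True),
}

def hardest_to_build_capacity_for(activities):
    """Capacity is hardest to build where an activity scores high on both axes."""
    return sorted(name
                  for name, (discretionary, transaction_intensive) in activities.items()
                  if discretionary and transaction_intensive)

print(hardest_to_build_capacity_for(ACTIVITIES))
```

On this reading, the core day-to-day functions of a DPA land in the hardest quadrant, which is the argument for modest expectations about its early capacity.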

DPA will have to evolve the organisation form suited for performing these functions in India's context. For instance, DPA may choose to devolve many activities to self-regulatory organisations or recognised aggregators, leaving to itself only some of the activities, such as standard-setting, exception-handling, etc. But such strategies only change the type of capacity needed to be built. For instance, regulating aggregators is a different kind of challenge. The substantive responsibilities will remain with the DPA. So, it is advisable to be modest about expected capacity in the DPA during the initial years.

The mandate given to the authority may affect its ability to build capacity

In the initial years, the DPA will have low capacity. It is important to avoid mistakes that impede the process of building real capacity over time. The most common mistake is to give a regulator a broad mandate (a combination of expansive jurisdiction and a large number of varied responsibilities) and draconian powers in its early days, when its capacity is low. The possibilities can be depicted in the following matrix.

The Capacity-Mandate Matrix

                  Narrow Mandate    Broad Mandate
High Capacity     Quadrant II       Quadrant I
Low Capacity      Quadrant III      Quadrant IV

Certain clarifications regarding the matrix are worth stating. First, although only four possibilities are shown, it is obvious that there is a continuum along both variables. Second, capacity is not a static phenomenon -- some organisations perform well under stress, while others perform well during normal circumstances but collapse in situations of stress. Third, different types of capacities are required for different kinds of functions and responsibilities. The limited objective of the matrix is to highlight the choice to be made with regard to the initial mandate given to the DPA.

To produce good outcomes, there needs to be some correspondence between capacity (the type of capacity and its performance under stress) and mandate (jurisdiction and responsibilities, and the possibilities of stress). Since a new regulator will have low capacity during the initial years, the choice that the Committee has to make in its recommendation is between Quadrants III and IV. Beginning in Quadrant IV (low capacity and broad mandate) may lead to implementation failures:

  • Capacity collapse under stress: Government agencies differ from private firms in a number of ways. They do not have profit as a key indicator of performance, and must develop complex ways of measuring success and holding the staff accountable. They are usually not able to raise and allocate financial resources freely. They are not able to hire and fire easily. They are not able to procure goods and services without going through complicated processes. They need to be responsive to demands and interests of a variety of stakeholders in the society. They must constantly build and maintain political legitimacy, or they may be rendered irrelevant.

    In the context of these constraints, if a regulator begins in Quadrant IV, the huge mismatch between the mandate and the capacity, the overly optimistic expectations of the pace of improvements in outcomes, and unrealistic expectations about improvement of capacity would lead to stresses and demands on systems that will affect capacity-building in the regulator (for a discussion on failures due to "premature load bearing", see: Pritchett, Lant, Michael Woolcock, and Matthew Andrews. "Capability traps? The mechanisms of persistent implementation failure." (2010)). It is difficult enough to build capacity to deliver on a narrow mandate. With a broad mandate from day one, the regulator may never get a chance to carefully build capacity to perform its functions. It may always remain in coping mode, in the face of expectations it cannot really fulfill. This may open the space for two pathways of implementation failure: preferring form over function and/or misuse of powers.

  • Preference for form over function: To maintain legitimacy, the regulator may simply imitate the forms of modern institutions without actual functionality. Regulators, like any government institution in a political society, need to gain and maintain legitimacy in the society. In the face of expectations that are impossible to meet, a regulatory organisation may "mimic" forms of organisation and procedures, without functionally performing its role and producing the desired outcomes (for a discussion on how institutions in contexts of high expectations and low capacity often choose to neglect actual performance of functions, and focus on mimicking forms of well-performing institutions, see: Pritchett, Lant, Michael Woolcock, and Matthew Andrews. "Capability traps? The mechanisms of persistent implementation failure." (2010)). This is a natural response when legitimacy is to be achieved in a context of low capacity, great expectations and conflicting interests. The alternative is to achieve legitimacy through actual performance, but this is very difficult if the mandate is broad. So, the staff of the regulator may respond by following rules and procedures but not truly concern themselves with the outcomes. This does not yield actual outcomes. At best, it only creates a perception of performance.

  • Misuse of powers: A regulator with a broad mandate is usually also given draconian powers. As the organisation starts deriving more of its legitimacy from form and posturing, rather than from actual performance in delivering outcomes, this decline in integrity may also lead to inefficient and/or unfair use of powers. For instance, when faced with violations, it may be tempted to deploy a heavy-handed approach, using outright bans and disproportionate penalties, just to gain political legitimacy. To some extent, this problem can be overcome by placing due process requirements on the regulatory authority (discussed later). However, in situations of capacity collapse and decline in integrity, these checks and balances may have limited efficacy. It is, after all, difficult to hold an organisation accountable for doing the impossible.

    The risks that may emerge from a DPA misusing its powers are enormous. DPA would have considerable powers to intervene in private transactions. It could construct large-scale data surveillance mechanisms in the name of monitoring compliance with regulations. Such an Authority, if it starts abusing its powers, can do a lot of damage. Its employees will potentially intrude into many kinds of transactions to try to decide questions of consent and standards of conduct. In doing so, they will access personal data to an extent that no other regulator currently does. This can make the dream of data protection go sour.

A regulatory organisation beginning in Quadrant IV risks being stuck in low capacity. Worse, it may lose integrity, and end up focusing more on appearance than on performance, preferring form over function. Worse still, it could start misusing its powers. So, moving from Quadrant IV to Quadrant I would be difficult. Further, even if the political leadership sees the problems and seeks to map expectations to actual capacity, moving from Quadrant IV to Quadrant III is not politically feasible, given the politics of reducing protections, especially in the face of the fierce activism that surrounds such issues. It would, therefore, be a mistake to place a new regulatory agency in Quadrant IV, i.e. to hobble it with a broad mandate when it has little capacity. This will almost certainly produce poor outcomes.

It would be better if the DPA begins in Quadrant III (with a clear and narrow mandate), moves to Quadrant II by building capacity to deliver on its narrow mandate, and then, over time, moves to Quadrant I. As the regulatory system demonstrates ability to solve problems, its mandate may be broadened. We must resist the temptations of Quadrant IV. The law should be closer to Quadrant III, and lay the foundation for an effective regulatory regime for data protection. This raises the question: what is a "narrow" regulatory mandate? This is a difficult question to answer, but one that must be answered. Some of the analysis in this note may help identify the basis for narrowing the mandate, but much more work and discussion is required to come up with a suitable Quadrant III formulation.

The Economics of Data Protection Regulation

Economic analysis can inform the design of the data protection law by pointing at: how incentives may be shaped by the law; how the economics of purpose and risk may help prioritise allocation of regulatory resources; and how mandating economic analysis may help avoid wrong regulatory choices.

A unique moral hazard problem

What is to be protected under a data protection regime is "personal data". This data is to be protected from breaches, unapproved processing, etc. However, unlike, say, money, there isn't a finite amount of personal data to be protected. Users can share the same personal data with many data controllers. Users can also create more personal data through online activities. Each instance of sharing or creating personal data adds to the data protection risks for the user, and thereby to the scale of the problem for the data protection regime. This ability of users to significantly expand the very field of regulation makes data protection a unique regulatory challenge. Therefore, prudence exercised by users in sharing and creating personal data is critical for data protection, much more so than in any other field of regulation. The law should not give users an incentive to be imprudent, especially in decisions that they are well placed to take.

The data protection regime could shape the behaviour of users. If the regulatory framework puts greater responsibility on the regulator to assure protections by taking preventive measures and to give quick redress based on individual grievances, users would have less incentive to be prudent while sharing and creating personal data. This is a moral hazard problem: when someone else provides protection against the risks, one is likely to take more risks. On the other hand, if the regulatory approach is sharply based on user responsibility and consent, and lets users incur the costs of their imprudence, we can expect more prudence from users.

Take the example of data minimisation. One construct could be consent-based data minimisation, wherein it is the responsibility of the user to determine whether data minimisation is being achieved at the time consent is sought. Such regulation would focus on ensuring that users get the necessary information about the data to be processed for a given task, and on monitoring by the regulator to ensure that processing is consistent with the consent. Another approach to data minimisation could be to empower the regulator to assess whether processors are processing more data than is required for a task, irrespective of whether consent has been given for such data to be processed. The latter construct would intensify the problem of moral hazard.
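To make the consent-based construct concrete, here is a minimal, hypothetical sketch of what checking "processing consistent with consent" could look like. The `ConsentRecord` structure and all field names are illustrative assumptions, not drawn from the White Paper or any proposed law:

```python
# Hypothetical sketch of the consent-based construct: compliance means the
# fields actually processed stay within the fields the user consented to.
# ConsentRecord and all field names are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class ConsentRecord:
    user_id: str
    purpose: str
    consented_fields: frozenset  # data fields the user agreed to share


def excess_fields(consent: ConsentRecord, processed_fields: set) -> set:
    """Return the fields processed beyond what the user consented to.

    An empty result means processing stayed within the consent; a
    non-empty result is the kind of deviation the regulator would look
    for when monitoring consistency with consent.
    """
    return set(processed_fields) - consent.consented_fields


consent = ConsentRecord("u1", "loan-eligibility", frozenset({"name", "income"}))
print(excess_fields(consent, {"name", "income"}))              # set()
print(excess_fields(consent, {"name", "income", "location"}))  # {'location'}
```

Under this construct, the regulator's monitoring question is narrow and mechanical (did processing exceed the consent?), whereas the second construct would require it to judge whether the consented fields were themselves necessary, a far more capacity-intensive task.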

However, in certain areas, preventive measures by the regulator are required. For instance, preventive measures may be required to maintain minimum standards of data security, because users will typically not be in a position to assess this at all, and the harms caused by a breach may be significant. Such preventive powers should be given only where they are necessary.

One could argue that moral hazard is not unique to data protection regulation. In banking regulation, for instance, the State promises to make efforts to keep banks reasonably safe, and takes preventive measures to keep this promise. This gives depositors a certain level of comfort, which makes them less likely to be careful while choosing the bank to put their money in. However, this effect works within a limited, defined space of banking, which consists exclusively of licensed banks. The regulator controls entry into and exit from that space. Contrast this with, for instance, mobile applications: the moral hazard would encourage behaviour that expands the scale of the problem in a manner that cannot be controlled by the regulator. In data-based applications (online or in the real world), it is infeasible to ensure an exclusive, licensed field of protected activities. So, users would assume that the regulator will protect them, and this may lead them to be more indiscriminate in sharing and creating personal data.

It might be tempting to point at the problems of achieving informed consent and to advocate regulator-led measures of data protection that limit the role of consent and focus more on ex-ante, preventive measures monitored and enforced by the regulator. However, this is a road to less prudence by users and ever-increasing responsibilities and powers of the regulator. Acknowledgment of this interplay between the prudence of users and the responsibilities of the data protection regime should inform the nature, scope and extent of protections promised by the law.

One could argue that protections ensured by the regulator allow us to participate more freely, and that not giving extensive protections may create a chilling effect. However, there can also be a good kind of chilling effect: one that makes us careful about sharing and creating personal data. Focus on user responsibility is essential for achieving the good chilling effect while avoiding the bad one.

Purpose vs. Risk

There can be disagreements on the specifics, but it should be easy to see that not all purposes that require processing of personal data are equally important. For example, it can be argued that certain recreational applications, such as mobile games, are not as important as healthcare services. If we acknowledge such distinctions, we could argue that pragmatism demands that regulatory emphasis be placed on providing greater protections for personal data in more important services, where there is relatively less user discretion. This also ties in with the importance of user responsibility. Users who freely share data with applications that are generally considered less important (eg. games that require a lot of personal data) should deal with the consequences of their choices. It is not a good use of limited regulatory capacity to ensure data protection in such situations. Similarly, greater emphasis may be required for sensitive personal data, as has been discussed in the White Paper.

Another distinction that can be useful is between processing of personal data for personal benefits, and processing that is beneficial for society. Personal data is, in most instances, a private good, and the person whose data is protected gets most of the benefits of the protection. In some instances, however, there are positive externalities of data sharing: a person shares data in a way that benefits others (eg. sharing data about blood group and contact details). At the margin, regulatory resources may be better used in protecting personal data with large externalities. Economic theory suggests that consent for such processing will be under-supplied. By augmenting protections, such activities can be encouraged.

Market failures, real problems, and effective regulation

The primary reason for regulatory intervention in markets is to address problems created by market failures. The market failures relevant for data protection are: market power (a controller/processor enjoys dominant market power that it can abuse), asymmetric information (the user does not have the information required to take the right decision), and externalities (the costs of mistakes by a controller/processor are inflicted upon users). Problems relating to abuse of dominant market position are usually addressed in competition law, and should ideally not be included in a data protection law.

Market failures only create the potential for harm. Often, there is no incentive for the controller/processor to take advantage of market failures, because other incentives are stronger. For instance, the market may reward more privacy-friendly providers, leading them to voluntarily protect users' data. So, any regulation must be in response to a clearly identified and significant problem arising out of a market failure. Some of the protections being envisaged do not appear to be based on problems significant enough to justify creating a general right in a law. In my view, these are: the right to not be subject to a decision based solely on automated processing; the right to object to processing for direct marketing; and the right to be forgotten.

The problems of automated processing discussed in the White Paper seem to arise out of genuine mistakes, and may be resolved without creating a general right to not be subjected to automated decisions. There seems to be no intent to cause harm. The right to be forgotten is not a response to a market failure; it derives from an extreme interpretation of privacy which, as the White Paper discusses, allows costs to be imposed on society so that a person can be forgotten. There can be other grounds to support such a right, but the case is weak on economic grounds. On direct marketing, there is a market failure in the form of a negative externality imposed on those not seeking the good or service being sold. However, it is not clear that the problem is so grave that the State's coercive powers are required to uphold a general right against it.

Finally, even if there are significant existing or emerging problems due to market failures, it is important to demonstrate that the proposed interventions will be effective in addressing the problems. This calls for analysis of regulatory impact before regulations are made, and analysis conducted periodically to measure continued effectiveness. Before making a regulation, such analysis usually includes projections for several years into the future. This can help focus regulatory resources on significant problems that are already there or are likely to arise. Giving a general right in the law presumes such analysis has been conducted for all the problems that the right is supposed to be addressing. Perhaps some of the protections need not be formulated as "rights".

On the rights-based approach to data protection

Following the example of other countries, the Committee seems to have used rights-based language for most of the protections it seeks to recommend in the law. This issue requires a careful rethink, because it has consequences for the way the regulatory system will evolve.

One way to think about this is to distinguish between protections that are required for market processes to function well, and protections that are outcomes of market processes. Informed consent is a precondition for the market to produce good outcomes, because such consent is the input through which market processes learn what consumers want. Informed consent signals what consumers see as a useful trade-off between protecting their privacy and using their data productively. On the other hand, the extent to which a person is subject to a wrong decision based solely on automated processing is, for instance, an outcome of market processes.

The word "right" gives a sense that each individual can invoke the State's coercive powers to claim what is being called a right, without regard to the costs, and irrespective of the scale of the problem. In data protection, the right to informed consent is perhaps the only such right. It entails a few participation rights, such as confirmation, access, and rectification, which are required to give effect to a proper right to informed consent. Even these basic participation rights may be exercisable only at a cost, and therefore fees should be allowed for confirmation, access and rectification. These fees should be regulated, so that they are not prohibitive.

The remaining protections, if any, may be given to the regulator as objectives to be achieved at the aggregate level, but not given as rights to individual users. For such protections, the focus of the regulator would be on achieving good outcomes in the aggregate, rather than on upholding the exercise of these rights by individuals. For instance, in a rights-based framework, once a person exercises a right to object to a decision based solely on automated processing, the regulator would be required to ensure compliance in every instance of the exercise of this right. Similarly, for issues like direct marketing, regulatory capacity would be misapplied in trying to secure the exercise of rights by a large number of individuals. Instead, it would be better for the regulator to specify regulations that reduce the instances of excessive harm caused to users by automated processing or direct marketing. The individual rights-based approach should only be used for the basic rights required to give each user reasonable control over her personal data.

Take the example of direct marketing. Firms conduct direct marketing because it connects them to persons who become their consumers, which also means that many consumers gain from the process. So, society on the whole is better off because of direct marketing. The problem, however, is that an externality is imposed on those who receive calls they are not interested in. Since many consumers value being let alone at times, there are market-based solutions to this problem: call filters (eg. Truecaller), email filters, etc. With a little effort, consumers can minimise the problem by blocking calls and unsubscribing from emails and messages from particular sources. Framing this issue as an "individual right", and bringing the State's "monopoly of coercion" into this situation, may be excessive, and would discourage users from solving this problem through market-based solutions. If a generic right for this purpose is created, it would be included in the redress and enforcement mechanisms, and these mechanisms may be burdened by what is essentially a small problem. Doing so would also favour those who are better placed to pursue the redress and adjudication route. Instead, the regulator may be given an objective to improve the system of processing for direct marketing, so that situations where excessive costs are imposed on certain individuals are minimised. This can be done by setting standards for processing for direct marketing, which help minimise the "mismatch" problem in the aggregate.

Jurisdiction-related issues

Jurisdiction issues could be territorial, sectoral or based on type or size of organisations to be regulated.

Territorial jurisdiction issues

The online world is truly global. Most of the applications that Indians use are hosted abroad, and offered by organisations with limited or no physical presence in India. For instance, Facebook does not have a data centre in India, and most of its software development is also done abroad. This poses difficulties for monitoring and enforcement by the proposed DPA. Establishing actual jurisdiction for the purposes of regulation and supervision requires having an identifier for the organisation (eg. registration), a line of communication with the organisation, the ability to inspect databases and software, and an entity on whom penalties and other enforcement orders can be served. While this is relatively easy to achieve for organisations whose processing already happens in India, it is difficult and expensive to establish jurisdiction over organisations that conduct processing abroad. The costs of establishing jurisdiction may vary depending on the type of entity. The question is: why would anyone agree to be regulated by a DPA in India? Whether a foreign organisation providing an online service will submit to regulations in India will depend on the disincentive of not doing so.

In finance, too, there is a jurisdiction problem. It is potentially easy to get a financial service from a service provider abroad. It has been considered important to establish jurisdiction over any firm offering financial services to consumers in India. So, across sectors, there are prohibitions on offering financial services without authorisation from a regulator in India. In 2013, when recommending wide-ranging financial sector reforms, the Financial Sector Legislative Reforms Commission also recommended that no person should be allowed to offer financial services in India without authorisation by a regulator.

China seems to have taken a similar approach for the internet, and has ended up creating a parallel internet, wherein a large number of websites and applications are banned simply because they do not play by the rules made by the country. One could argue that this is reasonable, as each country has the right to define what kind of internet access its citizens should have. However, the costs of exercising this right are considerable, as it may lead to a large number of bans, and cut India off from large parts of the global flow of online services. So, establishing jurisdiction over foreign firms collecting data from Indians would require creating a strong disincentive, such as a ban, for any controller/processor that does not submit to the jurisdiction of the DPA. Even if we limit this to, say, "important" or "sensitive" personal data, it can create problems. For example, many patients from India send their medical information for second opinions from medical establishments abroad. This is usually done through some hospital in India. If the DPA insists that each such foreign establishment must register with it, or else such data cannot be shared, this would deny an important service to patients.

In my view, it would be better to begin by regulating entities that are already processing data in India. This itself will require considerable discretion to be exercised, as has been seen in the controversies around "permanent establishment" in tax cases. At the margins, there will be differences of opinion about when an entity can be said to be based in India. However, giving a regulator powers to take draconian measures to actively establish jurisdiction over entities based overseas may lead to excessive bans, especially when the regulator has low capacity, because capacity is required to determine suitable regulatory strategies for establishing jurisdiction by other means.

Sectoral jurisdiction

As the data protection authority will pursue its objectives across all sectors, this can create conflicts with sectoral regulators. For example, in banking, securities markets, payments, etc., data security issues are regulated by the respective regulators, because this is essential to these services. For instance, the RBI has recently established a subsidiary to work on data security issues. These services are largely operated through online systems, and a large part of prudential regulation is about ensuring the security of these systems. If a payment system is breached, there are direct financial consequences. The personal data in this case is mainly financial data. When it comes to enforcement actions, it would be difficult to disentangle data protection concerns from sectoral concerns. For instance, should the DPA be given the power to ban an RBI-licensed payment service provider because of data protection concerns? Will the DPA be in a position to consider the wider ramifications of such an action? Similar concerns can be raised for other sectors as well. Another issue in this context is that, even though the present data protections in these sectors are probably inadequate, there is existing regulatory capacity in some of them.

Perhaps a solution is to require the DPA to make regulations/standards in consultation with the respective regulators; once the regulations/standards have been specified, the sectoral regulators could supervise and enforce the law and the regulations. The respective regulators could do so in the course of their routine supervision of their sectors. I think this could be done for financial firms, telecom service providers, internet service providers, etc. This may not appear to be a "clean" solution, but such aesthetic concerns should be weighed against the benefit of freeing up capacity at the DPA to focus on other sectors, and of avoiding unnecessary conflicts. Also, since these regulators are in any case supervising their sectors, the additional capacity required to monitor and enforce data protection standards would probably be less than that required to build capacity for these sectors in the DPA.

Jurisdiction over small organisations

Given the scale of our country, it would be impractical to seek implementation of this law in every retail store and small firm. This is not to say that there are no data protection risks arising from small enterprises. But to begin with, the system should focus on achieving good outcomes with larger organisations. Small organisations should be exempt from the law. Largeness here should ideally be measured in terms of the amount of personal data controlled or processed, but proxy indicators, such as the number of consumers, may be used to define a threshold.
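As a rough illustration of how such a threshold rule might operate, consider the following sketch. The particular proxies and numbers are placeholder assumptions, not recommendations:

```python
# Hypothetical sketch of a proxy-based coverage test for the law. The note
# argues the ideal measure is the amount of personal data controlled or
# processed; the proxies and thresholds below are placeholders only.
def covered_by_law(num_consumers: int, records_processed: int,
                   consumer_threshold: int = 100_000,
                   record_threshold: int = 1_000_000) -> bool:
    """An entity is covered if it crosses either proxy threshold;
    smaller organisations are exempt."""
    return (num_consumers >= consumer_threshold
            or records_processed >= record_threshold)


print(covered_by_law(num_consumers=500, records_processed=2_000))        # False: small store exempt
print(covered_by_law(num_consumers=5_000_000, records_processed=10**8))  # True: large platform covered
```

The policy choice hidden in such a rule is where to set the thresholds and which proxies track "amount of personal data" well enough; that calibration would itself need analysis and periodic review.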

Need to distinguish between data protection and broader privacy concerns:

The Committee is mandated to "study various issues relating to data protection in India". In my view, issues such as data portability, protection against being subject to a decision based solely on automated processing, and the right to be forgotten are not strictly data protection issues. They are data-related issues, but they have little to do with protection of personal data.

  • Data portability is not necessary for data protection, even though it may be good for users to be able to shift from one controller/processor to another. This is a competition issue, as lack of portability hampers competition in a market. Arguably, denial of portability at a reasonable charge is an example of anti-competitive behaviour. Further, the cost to the economy of securing a general "right" to data portability may be enormous, and a careful analysis of costs and benefits is required.
  • Being subject to a decision based solely on automated processing can sometimes become a problem, if it leads to a wrong decision. However, this is not related to the protection of personal data. Automated processing can have benefits as well as costs. The examples given in the White Paper (a person wrongly identified as an IRA leader; loss of jobs, car licences or voting rights because of wrong identification) are situations where automated processing led to a mistake. In such situations, there is no incentive for the processor to penalise the person. Since these are mistakes, is State intervention by creating a general right really required? In any case, this has little to do with data protection, and if it is being considered, this is the kind of protection that must be subjected to cost-benefit analysis.
  • The right to be forgotten: The White Paper seems to suggest that this right was endorsed by the Puttaswamy judgment. In the judgment, only one opinion discussed this right, and it cannot reasonably be considered the majority's opinion on the matter. The EU came to this big shift in the conceptualisation of the relationship between a person and society after a long process. Even in the EU, the right to be forgotten was replaced by a more limited right to erasure in the version of the GDPR adopted by the European Parliament in March 2014. We in India should not rush into such conceptions of privacy. In any case, this is not strictly a data protection issue.

Approach to data protection in government organisations:

In a way, data protection in government organisations is more important than in private organisations, because a lot of the personal data that government organisations hold was obtained under the implied threat of the coercive power of the State. However, experience from other sectors (eg. banking) suggests that enforcement measures that are usually effective on private entities become less effective on government organisations. The penalties that regulators use to coerce regulated entities into following regulations work less effectively on government organisations. Monetary penalties ultimately impose a loss on taxpayers. Criminal cases are often difficult to initiate against civil servants, and in India, because of the way jurisprudence has developed, a large number of persons working in government organisations are considered civil servants.

In principle, neutral application of the law to both the private and public sectors is good, and this should be a principle underpinning the proposed data protection law as well. However, there is also a need to think about other ways of ensuring data protection in government organisations. Once the DPA is established and has built capacity, it could become an advisor on, and reviewer of, data protection policies in government organisations, so that its expertise is used to prevent mistakes. It could also serve on the boards of government organisations that process a lot of personal data (eg. UIDAI). The law should contain an enabling provision allowing the government to appoint the DPA to periodically review the data protection policies of government organisations, and to have audits of their implementation conducted under the DPA's supervision.

Regulations, Flexibility and Innovation

Regulatory systems work well when there are clear regulations that need to be followed, and employees of the regulator, the regulated entities, and the consumers have clarity about them. It is good to have clarity and certainty in regulations. However, this rules-based system comes at the cost of less flexibility. Once a regulator specifies a regulation, there can be little room for innovation that violates the regulation in word, even if it follows it in spirit. This is a perennial tension, but in data protection regulation, there is probably a deeper tension.

At the heart of a consent-driven data protection system is a trade-off between valuing one's privacy and valuing beneficial uses of one's personal data. Technology has multiplied the ways in which a person can use her personal data for deriving economic and social benefits. The use, of course, needs to be based on consent of the user. When a user is giving consent, she is supposedly making some calculation about how she may benefit from that consent. However, often, it is not obvious beforehand what kinds and scale of benefits can be gained by sharing certain kind of data. The users may be able to make a better choice if they see examples and demonstrations. However, a robust data protection regime may limit possibilities of innovation without explicit consent. So, there can be a logjam - users may not give consent without seeing demonstration of benefits, and processors may not be able to innovate without access to a critical mass of data. The logjam is for a good reason - both data protection and innovation matter. This is just one example, and there can be many situations where regulation may restrict innovation that could have led to better solutions for both data protection and beneficial use. For instance, what kind of a notice and consent process will work is an issue over which innovative solutions can be found.

One way to overcome such problems is to create a space within the regulatory system that allows limited-scale innovations, where some regulatory exemptions are given. This "regulatory sandbox" needs to be provided for in the law itself. Typically, a regulatory sandbox involves giving the regulator the power to oversee a closely supervised cohort of innovations for which certain regulatory exemptions are given. Once their lessons are documented, they may lead to modifications in regulations to allow innovations. This is a participatory approach, where the regulator and private participants work closely to help innovation happen. However, for this to happen, the law needs to empower the regulator to create these "safe spaces" for innovation that achieve the objective of data protection while enhancing productive uses of data.

Need for sound regulatory governance and due process to be required by law

As a regulator, the DPA will take three types of actions: drafting regulations/standards; the executive functions of inspection, investigation, and recommending penalties or compounding violations; and the quasi-judicial function of adjudicating disputes. Regulators are mini-States that perform all three functions. This creates the potential for abuse of powers. The law should provide checks and balances to ensure that these powers are used properly. This requires two types of provisions: regulatory governance of the DPA, and due process to be followed by the DPA.

The law should provide for a good design of the Board of the DPA. It should also lay down the processes and grounds for appointing and removing board members. This is important for maintaining the independence of the DPA. For its independence, it is also important that the funding process for the DPA is given in the law. Further, for accountability, it is important that the DPA be mandated to make annual plans, and to publish annual reports that include its performance against the previous year's plan. Each type of regulatory action should be taken only after following due process, which should be laid down in the law. Independent authorities, such as the proposed DPA, have the power to be a judge in their own cause, i.e. their own officers adjudicate violations that have been investigated by officers of the same authority. This conflict needs to be managed through checks built into the law itself.

Recommendations for the proposed data protection law

Based on the analysis presented in this note, I would like to make the following tentative suggestions on the proposed data protection law:

  1. Protections and powers: Achieving informed consent should be the main focus of the law. To this end, certain individual rights need to be included: the right to seek confirmation, the right to access the data, and the right to rectify the data. However, these rights should be exercisable only after paying reasonable fees, as they impose costs on data controllers/processors. The DPA should focus on building systems of regulation that ensure that the foundational requirement of informed consent is met in all circumstances, except where exemptions are given. This in itself is a difficult challenge in India's context. It would be great if the DPA is able to build capacity around solving this problem.

    The rights that should not be included at this stage are: the right to object to processing, the right to object to processing for the purpose of direct marketing, the right not to be subject to a decision based solely on automated processing, the right to data portability, the right to restrict processing, and the right to be forgotten. Among these, some could be given as objectives to the DPA, with limited powers to nudge processors towards better protection. On direct marketing and automated processing, the DPA may be given powers to work towards improving outcomes, so that some persons do not pay very high prices for these otherwise beneficial activities. Once the DPA gets this power, it may define thresholds above which it could intervene, while refraining from using its coercive powers below the threshold. This is a very different formulation from one based on a general individual right.

    Similarly, "data minimisation", "purpose limitation" and "storage limitation" should only be included as aspects of consent, and not as general preventive measures to be enforced by the DPA. The DPA should focus on ensuring that if a user has given consent for certain data to be processed for a specific purpose, and has allowed storage for a certain period of time, the terms of this consent are actually adhered to. Beyond this, the DPA should not have powers and responsibilities to make substantive judgments about these issues; that would push the Authority into Quadrant IV. However, the DPA should be given the mandate to ensure minimum data security standards to avoid instances of breach.

    We should first get the basics right. Setting aside the debates about whether additional protections and preventive powers should ever be included in a data protection law, I am only suggesting that they not be included in the law in the first instance. In a few years, if the DPA is able to build capacity and deliver on the protections promised, additional protections may be debated and introduced. Let us not forget that there is always a chance that the DPA could become an ineffective, inefficient or even venal agency. Entrusting a new regulator with an expansive mandate on day one could be a recipe for failure.

  2. Tiered system: The law should create three categories of "services and applications" based on their importance to an average person: tier I (necessary services, such as healthcare, education and financial services, plus processing with positive benefits to society), tier II (important but not necessary services, such as social media), and tier III (optional services, such as games). The law should mandate the DPA to put the most resources into ensuring data protection for personal data shared with tier I services, followed by tier II. The DPA should not focus on tier III usage; there, users should make their own choices and face the consequences. Reasonable persons can disagree on which services belong in which tier, but this is not an argument against the need for a tiered system. Further, the DPA should be mandated to focus on protecting sensitive personal data, and this category should be defined in the law.
  3. Jurisdiction: The law's jurisdiction should be limited to entities processing data in India. The DPA should not be given powers to "pursue" foreign entities in order to establish jurisdiction over them or to compel them to process data in India. Further, in sectors that already have regulators conducting regular supervision, the responsibility for monitoring compliance and taking enforcement actions may be given to the respective regulators. Small organisations should be exempt from the law.
  4. Enable the DPA to be an advisor/reviewer/auditor/board member for data protection in government organisations: The law should include an enabling provision for the government to appoint the DPA to advise government organisations on data protection policies and practices, review those policies and practices, and audit their implementation. DPA representatives could also serve on the boards of organisations that handle large volumes of personal data (e.g., UIDAI). The DPA should, over time, develop into an organisation that can help the government take preventive measures to protect data, because ex-post measures are not likely to be effective with government organisations.
  5. Allow space for innovation, without compromising on the objective of the law: The law should empower the DPA to establish and oversee a regulatory sandbox to allow limited period trials of innovations that can be exempt from certain regulations. After these limited period pilots are documented, their experience may be used to modify the regulations.
  6. Board composition: The DPA Board should have a majority of independent members, who may be experts, retired civil servants, consumer advocates, and others. The process of appointment, as well as the grounds and process for removal of members, should be laid down in the law. The Board should be required to make annual plans and to publish annual reports that include performance against those plans.
  7. Due process requirements in the law: While making regulations, the DPA must publish draft regulations along with a statement of the legal authority to make them, a statement of the problems to be solved, and an analysis of the expected impact of the proposed regulation. After the consultation, the DPA must be required to publish all the comments received, provide a reasoned response to them, get the draft regulations formally approved by the Board, and then publish the final regulations. In the case of emergency regulation-making, the requirements of consultation and regulatory impact analysis may be relaxed, but such regulations should lapse after six months.

    The DPA will perform a variety of executive functions under this law, including inspections, investigations, and recommending penalties or compounding violations. Investigations should be carried out according to written terms of investigation; by an appointed investigator; within a predetermined period, extendable by a quasi-judicial officer through a reasoned order; and with the least possible disruption to the business. Similarly, in recommending penalties or compounding violations, the DPA should be guided by detailed regulations requiring it to show proportionality and fairness. There must be a separate wing within the DPA that adjudicates violations, and members of this wing should not report to, or interact with, persons carrying out or overseeing investigations.

Conclusion: the importance of being pragmatic

It is interesting how a matter that was not even on the radar of policymakers has suddenly become an absolute necessity. Because of the heightened sensitivity around this issue, and the opportunity it has created to get a law passed, it is tempting to demand a comprehensive law that envisages a wide range of protections and powers, as well as an expansive jurisdiction. This carpe diem temptation is a trap that must be avoided. In my view, we should take a pragmatic approach to the law, and consider what kind of law will help produce actual data protection outcomes.

A proposed law should be judged on the basis of its expected practical consequences, because ultimately we care about outcomes and not just expressions of good intent. In India, we have had many ambitious laws that did not lead to the expected outcomes, and some have actually made us worse off. The implementation of a law depends on a variety of context-specific factors: regulatory capacity, resource availability, the scale of the country, the capacity of the adjudication system, and so on. The same law may therefore have very different practical consequences in India than it would in, say, the UK.

Pragmatism demands careful thinking about the nature of the problem and the context in which it is to be addressed. The law should be designed to ensure good outcomes in the long run, even if it disappoints some in the short run. Many of the features of a data protection law that have been provisionally endorsed by the Committee, or are being demanded by various stakeholders, have only recently found their way into the laws of developed countries. Simply including them in a law will not create protections, as the outcomes of a regulatory law depend on the effectiveness of the regulatory system. The focus should be on ensuring that the law allows the regulator to build the capacity to deliver on its mandate. Burdening a new data protection regulator with a broad mandate would likely set it up for failure.

We should begin with a law that gives the regulator a narrow mandate, allow the regulator to build the capacity to deliver on that mandate, and only then expand the mandate. In my view, this narrowing could entail: focusing specifically on achieving informed consent; giving the regulator limited preventive powers on issues such as data security; limiting jurisdiction to entities processing data in India, without powers to assert jurisdiction over foreign entities; relying on sectoral regulators for monitoring and enforcement; applying the law only to entities above a size threshold; and prioritising data protection for certain important purposes. In addition to narrowing the mandate, it is also important to lay down detailed checks and balances in the law, to minimise the chances of abuse of power.

The founding conditions cast a long shadow on the evolution of an organisation. If the law that establishes the data protection regime is rooted in careful consideration of the factors that are likely to shape its implementation, we will have a better chance of achieving good data protection outcomes.


Suyash Rai is a researcher at National Institute of Public Finance and Policy. The author would like to thank Renuka Sane, Anirudh Burman, Milan Vaishnav, and Ajay Shah for useful discussions. This essay is based on the comments the author has submitted to the Committee of Experts.