Friday, February 09, 2018

A Pragmatic Approach to Data Protection

by Suyash Rai.

A Committee constituted by the Central Government is working on a data protection law. The White Paper published by the Committee suggests that the Committee favours a comprehensive law, with a number of rights and protections. Many stakeholders seem to be demanding an even more comprehensive law. In this essay, I present an analysis of certain issues relating to the regulation of data protection, organised around the following themes: regulatory capacity; the economics of data protection regulation; jurisdiction-related issues; the rights-based approach to regulation; the need to distinguish between data protection and broader privacy concerns; the approach to data protection in government organisations; balancing regulatory clarity with the flexibility to allow innovation; and regulatory governance and due process requirements. In my view, reflection on these issues will help create an effective law for data protection. Towards the end of this essay, certain specific suggestions on the legislative formulation are given. These suggestions, which flow from the analysis, are very tentative, because much more analysis is required.

Proposal for a data protection law

The Justice K. S. Puttaswamy (Retd.) and Anr. vs Union Of India And Ors. judgment on the right to privacy has brought privacy-related issues to the forefront of policymaking. The Order of the Court, inter alia, said, "The right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution."

Among the privacy-related issues, a full-fledged policy process has been initiated on data protection. In the Puttaswamy Judgment, one of the opinions, which was signed by four of the nine judges, enjoined upon the Central Government to "examine and put into place a robust regime for data protection", which strikes "a careful and sensitive balance between individual interests and legitimate concerns of the state". Some of the other opinions also made references to the issues of informational privacy.

A Committee of Experts, chaired by Justice BN Srikrishna, was constituted while the hearings on the case were underway. The Committee has been asked to study "issues relating to data protection in India," to suggest "principles to be considered for data protection in India," and to suggest "a draft data protection bill." The following are some of the key points discussed in a White Paper published by the Committee.

  1. What is to be protected: Data protection is about protecting personal data. The Committee is deliberating what constitutes personal data. Its provisional view is that data from which an individual is directly or indirectly identified or reasonably identifiable is personal data, and this includes any information, including opinions or assessments, irrespective of their accuracy. The Committee is also considering whether and how to define a sub-category of "sensitive personal data", to be given enhanced protection. Its provisional view is that health information, genetic information, religious beliefs and affiliations, sexual orientation, racial and ethnic origin, caste information, and financial information may be treated as sensitive personal data. Further, the White Paper says that, for other categories such as philosophical or political beliefs, an assessment may be made of whether a person has an expectation of a high degree of privacy.
  2. Exemptions: The Committee has proposed wide exemptions for household purposes, journalistic/artistic purposes, and literary purposes. They have also proposed exemptions for: data processed for the purpose of academic research, statistics and historical purposes; information collected for investigation of a crime, and apprehension or prosecution of offenders; information collected for maintaining national security and public order. To ensure that exemptions are not misused, the Committee is considering safeguards, such as a review mechanism.
  3. Protections and Rights: The Committee's provisional view is that informed and meaningful consent for processing personal data is central to protecting personal data. Apart from clearly specified exemptions, all processing must be based on consent. The Committee is considering what would constitute valid consent. The Committee has proposed certain preventive measures, such as data limitation (only processing the data required for a task), purpose limitation (data must be collected for a specified purpose and used only for that purpose), storage limitation (data to be erased after its purpose is met), and so on. The Committee is also considering certain individual rights. The first set of rights comprises the right to seek confirmation, the right to access the data, and the right to rectify the data. Given that such rights impose costs, the Committee has proposed allowing fees to be charged for them. The second set of rights comprises: the right to data portability (data of an individual to be made available in a universally machine-readable format or ported to another service provider); the right not to be subjected to a decision based solely on automated processing; and the right to object to processing for the purpose of direct marketing. The Committee also seems to endorse a right to be forgotten. The White Paper discusses a general right to object to processing, but provisionally finds it unsuitable for India.
  4. Jurisdiction: The Committee has floated three alternatives: a) Cover cases where processing wholly or partly happens in India irrespective of the status of the entity; b) Regulate entities which offer goods or services in India even though they may not have a presence in India; c) Regulate entities that carry on business (i.e. consistent and regular activity with the aim of profit) in India. The Committee has also raised issues of scope relating to applicability of the law to data relating to juristic persons such as companies, differential application of the law to the private and the public sector, and retrospective application of the law.
  5. Differential obligations: The Committee is also considering greater obligations for entities that create more risks. The additional obligations on such entities may include: registration, data protection impact assessments, data audits; and a designated Data Protection Officer.
  6. Institutional mechanism: The Committee has proposed a Data Protection Authority (DPA) for implementation of the law. It would set standards, monitor compliance, and take enforcement actions. It would also work towards generating awareness. The Committee seems to be in favor of a co-regulation model involving close industry participation. The Committee's tentative view is that accountability should not only be enforced for breach of data protection obligations, but also, in certain circumstances, it could be extended to hold data controllers liable for the harms that they cause to individuals even without violation of any other obligation.

    For redress, the Committee has proposed a multi-tier system, wherein an individual would first approach the data controller, and if the data controller fails to resolve the complaint, the individual may file a complaint with the data protection authority. The DPA may also initiate action against a data controller on a suo motu basis. The Appellate Tribunal under the IT Act may be the appellate forum for any decision of the DPA. The DPA may be given the power to impose civil penalties as well as order the defaulting party to pay compensation up to a threshold. Appeals against an order granting or rejecting such compensation, and compensation claims above the threshold, may lie with the National Consumer Disputes Redressal Commission.
  7. Principles: The Committee has endorsed seven principles to underpin the law: a) Technology agnosticism (flexibility to take into account changing technologies and standards of compliance); b) Holistic application (cover both private sector entities and government, with differential obligations for legitimate State aims); c) Informed consent (consent must be informed and meaningful); d) Data minimisation (data that is processed ought to be minimal and necessary for the purposes for which it is sought and other compatible purposes beneficial for the data subject); e) Controller accountability (the data controller to be held accountable for any processing of data); f) Structured enforcement (enforcement by a high-powered statutory authority with sufficient capacity); g) Deterrent penalties (penalties to ensure deterrence).

This suggests that a comprehensive regulatory regime, with a wide range of protections enforced by a powerful regulator, is in the offing. Going by the minutes of the consultations held by the Committee and some of the submissions to the Committee that are publicly available, the thrust of a plurality of stakeholder comments is to further expand the scope of the proposed law. More and more rights are being recommended. More preventive measures are being proposed. And more powers are being demanded for the proposed regulator.

It should be obvious that enacting a data protection law by itself will not ensure data protection. A regulatory law works through the regulatory system established by it. Actual data protection outcomes will depend on how effectively and efficiently the regulatory system regulates the market processes. I would go so far as to argue that for good outcomes, implementation matters more than what the law promises. In this journey from law to outcomes, regulatory capacity is the key.

The law and regulatory capacity

The proposed DPA will make regulations, monitor compliance, and take enforcement actions. On building regulatory capacity in DPA, I would like to make three India-specific points: first, this regulator will come up in the backdrop of relatively low regulatory capacity in India, compared to countries presently implementing advanced data protection laws; second, given the nature of data protection regulation, the regulator will find it very difficult to build capacity; third, the mismatch between the capacity and the mandate of the regulator can create poor outcomes, such that giving a broad mandate may produce worse outcomes than giving it a narrow mandate.

Relatively low regulatory capacity in India

The following chart shows percentile ranks (0 = lowest; 100 = highest) on "regulatory quality" for India and the countries whose laws the Committee has most frequently cited in the White Paper. This ranking is from the Worldwide Governance Indicators (WGI) published by the World Bank. For "regulatory quality" in India, the data sources used were: Bertelsmann Transformation Index; Economist Intelligence Unit; Global Insight Business Conditions and Risk Indicators; Heritage Foundation Index of Economic Freedom; IFAD Rural Sector Performance Assessments; Institute for Management and Development World Competitiveness; Institutional Profiles Database; Political Risk Services International Country Risk Guide; World Economic Forum Global Competitiveness Report; World Justice Project.

Figure 1: Percentile rank on Regulatory Quality (Source: Worldwide Governance Indicators, World Bank)

India ranks much lower than the countries cited in the White Paper. Almost all these countries are close to the top rank (Singapore ranks number 1). On most indices of state capacity, India usually ranks close to the median. Such rankings and indices are not precise, scientific measurements of capacity, but they are useful indicators of relative capacity. It is safe to say that regulatory capacity in India is much lower than that in other countries with advanced data protection laws. Why is this so? A variety of factors may determine State capacity in a country: organisation design and management; political system design; basis of legitimisation; and cultural and structural factors. Many of these factors are shaped by contingent social and political processes over long periods of time (see Francis Fukuyama's work on this: "State building: Governance and world order in the 21st century" and "The Origins of Political Order: From Prehuman Times to the French Revolution"). Further, many of these factors are not within the control of any one organisation.

The question, then, is: how should this fact of relatively low regulatory capacity inform the formulation of a data protection law? To answer this question, we need to first move from this general observation on regulatory capacity in India, to the specific nature of activities involved in data protection regulation, and the kind of capacity required to perform them. This may help us understand the specific challenges of building capacity in the proposed DPA.

Challenges of building regulatory capacity for data protection

Lant Pritchett and Michael Woolcock, in their paper "Solutions when the Solution is the Problem: Arraying the Disarray in Development", provide a framework to understand the challenges of capacity building. They analyse activities in terms of how discretionary (i.e. to what extent decisions will be made on the basis of information that is important but inherently imperfectly specified and incomplete) and transaction-intensive (i.e. the number of decisions required) they are. They find that it is most difficult to build real capacity for activities that are highly discretionary and transaction-intensive.

Many activities involved in Data Protection Regulation score high on both: they will be highly discretionary and transaction-intensive. The level of discretion may vary from one activity to another. Here are a few examples:

  • Data breach is a kind of problem that may require relatively less discretion to identify when it occurs, as there is a limited space for disagreement on whether there was a breach. However, the same problem can require more discretion if "preventive" actions are to be specified to avoid breaches. Experts can have wide disagreements on the best ways to manage the risk of data breach in different contexts.
  • Regulating to ensure informed consent can require a considerable amount of discretion, because, to be implemented effectively, it will require complex assessments about whether the consent was truly informed and meaningful.
  • Data minimisation means ensuring that no more data is processed than is required for a task. This may require complex assessments to be made about the data required for a given task. Even if the regulator relies on self-assessments or third-party audits, these assessments and audits will still need to be evaluated in a variety of contexts. Such judgments require sophistication and knowledge. A wealth manager, for instance, often collects and processes a large amount of personal data. There will always be considerable discretion in assessing whether data minimisation is being achieved.

Transaction intensity in data protection regulation arises out of its monitoring and enforcement functions, which will require directly or indirectly monitoring numerous events across a large number of data controllers and processors in a number of sectors, and taking decisions about them. The transaction intensity is also shaped by a unique type of moral hazard problem seen in this domain. This problem, which is discussed in detail later in this note, arises out of the fact that "personal data" is not a finite resource to be protected. Users can, by sharing data and creating more personal data through online activities, change the scale of the problem for the data protection regulator.

The combination of these two characteristics (highly discretionary and transaction-intensive) makes it more difficult to build capacity, because it is neither about appointing a few capable individuals exercising discretion (e.g., monetary policy) nor about managing a large number of persons performing mechanised tasks (e.g., Aadhaar enrolment, immunisation). Discretion is easy to abuse, and it also means that mistakes are not immediately seen as mistakes. Transaction-intensity poses the challenge of achieving good performance in a large number and variety of situations.

The DPA will have to evolve an organisational form suited to performing these functions in India’s context. For instance, the DPA may choose to devolve many activities to self-regulatory organisations or recognised aggregators, leaving to itself only some activities, such as standard-setting and exception-handling. But such strategies only change the type of capacity that needs to be built. For instance, regulating aggregators is a different kind of challenge. The substantive responsibilities will remain with the DPA. So, it is advisable to be modest about the expected capacity of the DPA during the initial years.

The mandate given to the authority may affect its ability to build capacity

In the initial years, the DPA will have low capacity. It is important to avoid mistakes that impede the process of building real capacity over time. The most common mistake is to give a regulator a broad mandate (a combination of expansive jurisdiction and a large number of varied responsibilities) and draconian powers in its early days, when its capacity is low. The possibilities can be depicted in the following matrix.

The Capacity-Mandate Matrix

                 Narrow Mandate    Broad Mandate
High Capacity    Quadrant II       Quadrant I
Low Capacity     Quadrant III      Quadrant IV

Certain clarifications regarding the matrix are worth stating. First, although only four possibilities are shown, it is obvious that there is a continuum along both variables. Second, capacity is not a static phenomenon - some organisations perform well under stress, while others perform well during normal circumstances but collapse in situations of stress. Third, different types of capacities are required for different kinds of functions and responsibilities. The limited objective of the matrix is to highlight the choice to be made with regard to the initial mandate given to the DPA.

To produce good outcomes, there needs to be some correspondence between capacity (the type of capacity and its performance under stress) and mandate (jurisdiction and responsibilities, and the possibilities of stress). Since a new regulator will have low capacity during the initial years, the choice that the Committee has to make in its recommendation is between Quadrants III and IV. Beginning in Quadrant IV (low capacity and broad mandate) may lead to implementation failures:

  • Capacity collapse under stress: Government agencies differ from private firms in a number of ways. They do not have profit as a key indicator of performance, and must develop complex ways of measuring success and holding the staff accountable. They are usually not able to raise and allocate financial resources freely. They are not able to hire and fire easily. They are not able to procure goods and services without going through complicated processes. They need to be responsive to demands and interests of a variety of stakeholders in the society. They must constantly build and maintain political legitimacy, or they may be rendered irrelevant.

    In the context of these constraints, if a regulator begins in Quadrant IV, the huge mismatch between the mandate and the capacity, the overly optimistic expectations about the pace of improvement in outcomes, and unrealistic expectations about the growth of capacity would place stresses and demands on systems that affect capacity-building in the regulator (for a discussion of failures due to "premature load bearing", see: Pritchett, Lant, Michael Woolcock, and Matthew Andrews. "Capability traps? The mechanisms of persistent implementation failure." (2010)). It is difficult enough to build capacity to deliver on a narrow mandate. With a broad mandate from day one, the regulator may never get a chance to carefully build capacity to perform its functions. It may always remain in coping mode, in the face of expectations it cannot really fulfill. This may open the space for two pathways of implementation failure: preferring form over function and/or misuse of powers.

  • Preference for form over function: To maintain legitimacy, the regulator may simply imitate the forms of modern institutions without actual functionality. Regulators, like any government institution in a political society, need to gain and maintain legitimacy in society. In the face of expectations that are impossible to meet, a regulatory organisation may "mimic" the forms of organisation and procedure, without functionally performing its role and producing the desired outcomes (for a discussion of how institutions in contexts of high expectations and low capacity often neglect the actual performance of functions, and focus on mimicking the forms of well-performing institutions, see: Pritchett, Lant, Michael Woolcock, and Matthew Andrews. "Capability traps? The mechanisms of persistent implementation failure." (2010)). This is a natural response when legitimacy is to be achieved in a context of low capacity, great expectations and conflicting interests. The alternative is to achieve legitimacy through actual performance, but this is very difficult if the mandate is broad. So, the staff of the regulator may respond by following rules and procedures without truly concerning themselves with the outcomes. This does not yield actual outcomes. At best, it only creates a perception of performance.

  • Misuse of powers: a regulator with a broad mandate is usually also given draconian powers. As the organisation starts deriving more of its legitimacy by form and posturing, rather than by actual performance in delivering outcomes, this decline in integrity may also lead to inefficient and/or unfair use of powers. For instance, when faced with violations, it may be tempted to deploy a heavy-handed approach, using outright bans and disproportionate penalties, just to get political legitimacy. To some extent, this problem can be overcome by placing due process requirements on the regulatory authority (discussed later). However, in situations of capacity collapse and decline in integrity, these checks and balances may have limited efficacy. It is, after all, difficult to hold an organisation accountable to do the impossible.

    The risks that may emerge from a DPA misusing its powers are enormous. The DPA would have considerable powers to intervene in private transactions. It could construct large-scale data surveillance mechanisms in the name of monitoring compliance with regulations. Such an Authority, if it starts abusing its powers, can do a lot of damage. Its employees will potentially intrude into many kinds of transactions to try and decide questions of consent and standards of conduct. In doing so, they will access personal data to an extent that no other regulator currently does. This can make the dream of data protection go sour.

A regulatory organisation beginning in Quadrant IV risks being stuck in low capacity. Worse, it may lose integrity, and end up focusing more on appearance than on performance, preferring form over function. Worse still, it could start misusing its powers. So, moving from Quadrant IV to Quadrant I would be difficult. Further, even if the political leadership sees the problems and seeks to map expectations to actual capacity, moving from Quadrant IV to Quadrant III is not politically feasible, given the politics of reducing protections, especially in the face of the fierce activism that surrounds such issues. It would, therefore, be a mistake to place a new regulatory agency in Quadrant IV, i.e. to hobble it with a broad mandate when it has little capacity. This will almost certainly produce poor outcomes.

It would be better if the DPA begins in Quadrant III (with a clear and narrow mandate), moves to Quadrant II by building capacity to deliver on its narrow mandate, and then, over time, moves to Quadrant I. As the regulatory system demonstrates ability to solve problems, its mandate may be broadened. We must resist the temptations of Quadrant IV. The law should be closer to Quadrant III, and lay the foundation for an effective regulatory regime for data protection. This raises the question: what is a "narrow" regulatory mandate? This is a difficult question to answer, but one that must be answered. Some of the analysis in this note may help identify the basis for narrowing the mandate, but much more work and discussion is required to come up with a suitable Quadrant III formulation.

The Economics of Data Protection Regulation

Economic analysis can inform the design of the data protection law by pointing at: how incentives may be shaped by the law; how the economics of purpose and risk may help prioritise allocation of regulatory resources; and how mandating economic analysis may help avoid wrong regulatory choices.

A unique moral hazard problem

What is to be protected under a data protection regime is "personal data". This data is to be protected from breaches, unapproved processing, etc. However, unlike, say, money, there isn't a finite amount of personal data to be protected. Users can share the same personal data with many data controllers. Users can also create more personal data by online activities. Each instance of sharing or creating personal data adds to the risks of data protection for the user, and thereby to the scale of the problem for the data protection regime. This ability of the users to significantly expand the very field of regulation makes data protection a unique regulatory challenge. Therefore, prudence exercised by users in sharing and creating personal data is critical for data protection, much more so than it is in any other field of regulation. The law should not give the users incentive to be imprudent, especially in decisions that they are well-placed to take.

The data protection regime could shape the behaviour of users. If the regulatory framework puts greater responsibility on the regulator to assure protections by taking preventive measures and to give quick redress based on individual grievances, users would have less incentive to be prudent while sharing and creating personal data. This is a moral hazard problem - just because someone else is providing protection against the risks, one is likely to take more risks. On the other hand, if the regulatory approach is sharply based on user responsibility and consent, and lets users incur the costs of their imprudence, we can expect more prudence from users.

Take the example of data minimisation. One construct could be consent-based data minimisation, wherein it is the responsibility of the user to determine whether data minimisation is being achieved at the time consent is sought. Such regulation would focus on ensuring that users get the necessary information about the data to be processed for a given task, with monitoring by the regulator to ensure that processing is consistent with the consent. Another approach could be to empower the regulator to assess whether processors are processing more data than is required for a task, irrespective of whether consent has been given for such data to be processed. The latter construct would intensify the problem of moral hazard.
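The difference between the two constructs can be illustrated with a toy sketch. Everything here is hypothetical: the field names, the task definition, and the two check functions are invented purely to show where the locus of judgment sits in each construct, not to suggest an actual compliance mechanism.

```python
# Illustrative sketch of the two data-minimisation constructs discussed above.
# All field names and the task definition are hypothetical.

# Fields a hypothetical wealth-management task plausibly requires.
TASK_REQUIRED_FIELDS = {"name", "pan", "income", "risk_profile"}

def consent_based_check(consented, processed):
    """Construct 1: the regulator only verifies that processing
    stays within what the user consented to; the user judges
    minimisation when giving consent."""
    return processed <= consented

def regulator_assessed_check(consented, processed):
    """Construct 2: the regulator additionally judges whether the
    data processed exceeds what the task itself requires,
    irrespective of consent."""
    return processed <= consented and processed <= TASK_REQUIRED_FIELDS

consented = {"name", "pan", "income", "risk_profile", "location_history"}
processed = {"name", "pan", "income", "location_history"}

# Under construct 1, this processing is compliant: it stays within consent.
print(consent_based_check(consented, processed))       # True
# Under construct 2, the same processing fails: location_history is not
# required for the task, even though the user consented to sharing it.
print(regulator_assessed_check(consented, processed))  # False
```

The sketch makes the moral hazard point concrete: under the second construct, the substantive judgment (what does the task require?) shifts from the user to the regulator, so the user has less reason to scrutinise the consent request.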

However, in certain areas, preventive measures by the regulator are required. For instance, preventive measures may be required to maintain minimum standards of data security, because users will typically not be in a position to assess this at all, and the harms caused by a breach may be significant. Such preventive powers should be given only where they are necessary.

One could argue that moral hazard is not unique to data protection regulation. In banking regulation, for instance, the State promises to make efforts to keep banks reasonably safe, and takes preventive measures to keep this promise. This gives depositors a certain level of comfort, which makes them less likely to be careful while choosing the bank to put their money in. However, this effect works within a limited, defined space of banking, which consists exclusively of licensed banks. The regulator controls entry into and exit from that space. Contrast this with, for instance, mobile applications - the moral hazard would encourage behaviour that expands the scale of the problem in a manner that cannot be controlled by the regulator. In data-based applications (online or real world), it is infeasible to ensure an exclusive, licensed field of protected activities. So, users would assume that the regulator will protect them, and this may lead them to be more indiscriminate in sharing and creating personal data.

It might be tempting to point at the problems of achieving informed consent and to advocate regulator-led measures of data protection that limit the role of consent and focus more on ex-ante, preventive measures monitored and enforced by the regulator, but this is a road to less prudence by users and ever-increasing responsibilities and powers for the regulator. Acknowledging this interplay between the prudence of users and the responsibilities of the data protection regime should inform the nature, scope and extent of the protections promised by the law.

One could argue that protections ensured by the regulator allow us to participate more freely, and that not giving extensive protections may create a chilling effect. But there can also be a good kind of chilling effect, one that makes us careful about sharing and creating personal data. A focus on user responsibility is essential to achieving the good chilling effect and avoiding the bad one.

Purpose vs. Risk

There can be disagreements on the specifics, but it should be easy to see that not all purposes that require processing of personal data are equally important. For example, it can be argued that certain recreational applications, such as mobile games, are not as important as healthcare services. If we acknowledge such distinctions, we could argue that pragmatism demands that regulatory emphasis be given to providing greater protections for personal data in more important services, where there is relatively less user discretion. This also ties in with the importance of user responsibility. Users who freely share data with applications that are generally considered less important (e.g., games that require a lot of personal data) should deal with the consequences of their choices. It is not a good use of limited regulatory capacity to ensure data protection in such situations. Similarly, greater emphasis may be required for sensitive personal data, as discussed in the White Paper.

Another distinction that can be useful is that between processing of personal data for personal benefits, and processing that is beneficial for society. Personal data is, in most instances, a private good, and the person whose data is protected gets most of the benefits of the protection. In some instances, however, there are positive externalities of data sharing: a person shares data that benefits others (e.g., sharing data about blood group and contact details). On the margin, regulatory resources may be better used in protecting personal data with large externalities. Economic theory suggests that consent for such processing will be undersupplied. By augmenting protections, such activities can be encouraged.

Market failures, real problems, and effective regulation

The primary reason for regulatory intervention in markets is to address problems created by market failures. The market failures relevant for data protection are: market power (a controller/processor enjoys dominant market power that it can abuse), asymmetric information (the user does not have the information required to take the right decision), and externalities (the costs of mistakes by a controller/processor are inflicted upon users). Problems relating to the abuse of a dominant market position are usually addressed in competition laws, and should ideally not be included in a data protection law.

Market failures only create the potential for harm. Often, there is no incentive for the controller/processor to take advantage of market failures, because other incentives are stronger. For instance, the market may reward more privacy-friendly providers, leading them to voluntarily protect users' data. So, any regulation must be a response to a clearly identified and significant problem arising out of a market failure. Some of the protections being envisaged do not appear to be based on problems significant enough to justify creating a general right in a law. In my view, these are: the right to not be subject to a decision based solely on automated processing; the right to object to processing for direct marketing; and the right to be forgotten.

The problems of automated processing discussed in the White Paper seem to arise out of genuine mistakes, and may be resolved without creating a general right to not be subjected to automated decisions. There seems to be no intent to cause harm. The right to be forgotten is not a response to a market failure, but is coming from an extreme interpretation of privacy, which, as the White Paper discusses, allows costs to be imposed on the society so that a person can be forgotten. There can be other grounds to support such a right, but the case is weak on economic grounds. On direct marketing, there is a market failure in the form of negative externality imposed on those not seeking the good or service being sold. However, it is not clear that the problem is so grave that the State's coercive powers are required to uphold a general right against it.

Finally, even if there are significant existing or emerging problems due to market failures, it is important to demonstrate that the proposed interventions will be effective in addressing the problems. This calls for analysis of regulatory impact before regulations are made, and analysis conducted periodically to measure continued effectiveness. Before making a regulation, such analysis usually includes projections for several years into the future. This can help focus regulatory resources on significant problems that are already there or are likely to arise. Giving a general right in the law presumes such analysis has been conducted for all the problems that the right is supposed to be addressing. Perhaps some of the protections need not be formulated as "rights".

On the rights-based approach to data protection

Following the example of other countries, the Committee seems to have used the rights-based language for most of the protections it seeks to recommend in the law. This issue requires a careful rethink, because this has consequences for the way the regulatory system will evolve.

One way to think about this is to distinguish between protections that are required for market processes to function well, and protections that are outcomes of market processes. Informed consent is a precondition for the market to produce good outcomes, because such consent is the input through which market processes learn what consumers want. Informed consent signals what consumers see as a useful trade-off between protecting their privacy and using their data productively. On the other hand, the extent to which a person is subject to a wrong decision based solely on automated processing is, for instance, an outcome of market processes.

The word "right" gives a sense that each individual can invoke the State's coercive powers to claim what is being called a right, without regard to the costs, and irrespective of the scale of the problem. In data protection, the right to informed consent is perhaps the only such right, and it can be said to entail a few participation rights, such as confirmation, access, and rectification, which are required to give effect to a proper right to informed consent. Even the basic participation rights may be exercisable only at a cost, and therefore fees should be allowed for confirmation, access and rectification. These fees should be regulated, so that they are not prohibitive.

The remaining protections, if any, may be given to the regulator as objectives to be achieved at aggregate level, but not given as rights to individual users. So, for such protections the focus of the regulator would be on achieving good outcomes in the aggregate, rather than upholding exercise of these rights by individuals. For instance, in a rights-based framework once a person exercises a right to object to a decision based solely on automated processing, the regulator would be required to ensure that this is done in all instances of exercise of this right. Similarly, for issues like direct marketing, the regulatory capacity would be misapplied in trying to secure exercise of rights by a number of individuals. Instead, it would be better for the regulator to specify regulations that would reduce the instances of excessive harms caused to users by automated processing or digital marketing. Individual rights-based approach should only be used for the basic rights required to give each user a reasonable control over her personal data.

Take the example of direct marketing. Firms conduct direct marketing because it connects them to persons who become their consumers, which also means that many consumers gain from the process. So, society on the whole is better off because of direct marketing. The problem, however, is that an externality is imposed on those who receive calls they are not interested in. Since many consumers value being let alone, there are market-based solutions to this problem: call filters (e.g., Truecaller), email filters, and so on. With a little effort, consumers can minimise the problem by blocking calls and unsubscribing from emails and messages from particular sources. Framing this issue as an "individual right", and bringing the State's "monopoly of coercion" into this situation, may be excessive, and would discourage users from solving this problem through market-based solutions. If a generic right for this purpose is created, it would be included in the redress and enforcement mechanisms, which may then be burdened by what is essentially a small problem. Doing so would also favor those who are better placed to pursue the redress and adjudication route. Instead, the regulator may be given an objective to improve the system of processing for direct marketing, so that situations where excessive costs are imposed on certain individuals are minimised. This can be done by setting standards for processing for direct marketing that help minimise the "mismatch" problem in the aggregate.

Jurisdiction-related issues

Jurisdiction issues could be territorial, sectoral or based on type or size of organisations to be regulated.

Territorial jurisdiction issues

The online world is truly global. Most of the applications that Indians use are hosted abroad, and offered by organisations with limited or no physical presence in India. For instance, Facebook does not have a data centre in India, and most of its software development is also done abroad. This poses difficulties for monitoring and enforcement by the proposed DPA. Establishing actual jurisdiction for the purposes of regulation and supervision requires having an identifier for the organisation (e.g., registration), a line of communication with the organisation, the ability to inspect its databases and software, and an entity on whom penalties and other enforcement orders can be served. While this is relatively easy to achieve for organisations whose processing already happens in India, it is difficult and expensive to establish jurisdiction over organisations that conduct processing abroad. The costs of establishing jurisdiction may vary depending on the type of entity. The question is: why would anyone agree to be regulated by a DPA in India? Whether a foreign organisation providing an online service will submit to regulation in India will depend on the disincentive of not doing so.

In finance also, for instance, there is a jurisdiction problem. It is potentially easy to get a financial service from a service provider abroad. It has been considered important to establish jurisdiction over any firm offering financial services for consumers in India. So, across sectors, there are prohibitions on offering financial services without authorisation from a regulator in India. In 2013, when recommending wide-ranging financial sector reforms, the Financial Sector Legislative Reforms Commission had also recommended that no person should be allowed to offer financial services in India without authorisation by a regulator.

China seems to have taken a similar approach for the internet, and has ended up creating a parallel internet, wherein a large number of websites and applications are banned simply because they do not play by the country's rules. One could argue that this is reasonable, as each country has the right to define what kind of internet access its citizens should have. However, the costs of exercising this right are considerable, as it may lead to a large number of bans, and cut India off from large parts of the global flow of online services. So, if we want to establish jurisdiction over foreign firms collecting data from Indians, it would require creating a strong disincentive, such as a ban, for any controller/processor that does not submit to the DPA's jurisdiction. Even if we limit this to, say, "important" or "sensitive" personal data, it can create problems. For example, many patients from India send their medical information for second opinions to medical establishments abroad, usually through some hospital in India. If the DPA insists that each such foreign establishment must register with it or such data cannot be shared, this would deny an important service to the patients.

In my view, it would be better to begin with regulating entities that are already processing data in India. This itself will need considerable discretion to be exercised, as has been seen in controversies around “permanent establishment” in tax cases. At the margins, there will be differences of opinion about when an entity can be said to be based in India. However, giving a regulator powers to take draconian measures to actively establish jurisdiction over entities based overseas may lead to excessive bans, especially when the regulator has low capacity, because capacity is required to determine suitable regulatory strategies for establishing jurisdictions by other means.

Sectoral jurisdiction

As the data protection authority will pursue its objectives across all sectors, this can create conflicts with sectoral regulators. For example, in banking, securities markets, payments, etc., data security issues are regulated by the respective regulators, because security is essential to these services. For instance, the RBI has recently established a subsidiary to work on data security issues. These services are largely operated through online systems, and a large part of prudential regulation is about ensuring the security of these systems. If a payment system is breached, there are direct financial consequences. The personal data in this case is mainly financial data. When it comes to enforcement actions, it would be difficult to disentangle data protection concerns from sectoral concerns. For instance, should the DPA be given the power to ban an RBI-licensed payment service provider because of data protection concerns? Will the DPA be in a position to consider the wider ramifications of such an action? Similar concerns can be raised for other sectors as well. Another issue in this context is that, even though the present data protections in those sectors are probably inadequate, there is existing regulatory capacity in some of the sectors.

Perhaps a solution is to require the DPA to make regulations/standards in consultation with the respective regulators; once the regulations/standards have been specified, the sectoral regulators could supervise and enforce the law and the regulations in the course of their routine supervision of their sectors. I think this could be done for financial firms, telecom service providers, internet service providers, etc. This may not appear to be a "clean" solution, but such aesthetic concerns should be weighed against the benefit of freeing up capacity at the DPA to focus on other sectors, and of avoiding unnecessary conflicts. Also, since these regulators are in any case supervising their sectors, the additional capacity they would require to monitor and enforce data protection standards would probably be less than what building capacity for these sectors in the DPA would require.

Jurisdiction over small organisations

Given the scale of our country, it would be impractical to seek implementation of this law in every retail store and small firm. This is not to say that there are no data protection risks arising from small enterprises. But to begin with, the system should focus on achieving good outcomes with larger organisations. Small organisations should be exempt from the law. Largeness here should ideally be in terms of the amount of personal data controlled or processed, but proxy indicators, such as number of consumers, may be used to define a threshold.

Need to distinguish between data protection and broader privacy concerns

The Committee is mandated to "study various issues relating to data protection in India". In my view, issues such as data portability, protection against being subject to a decision based solely on automated processing, and the right to be forgotten are not strictly data protection issues. They are data-related issues, but they have little to do with protection of personal data.

  • Data portability is not necessary for data protection, even though it may be good for the users to be able to shift from one controller/processor to another. This is a competition issue, as lack of portability hampers competition in a market. Arguably, denial of portability at a reasonable charge is an example of anti-competitive behavior. Further, the cost to the economy of securing a general "right" to data portability may be enormous, and a careful analysis of costs and benefits is required.
  • Being subject to a decision based solely on automated processing can sometimes become a problem, if it leads to a wrong decision. However, this is not related to protection of personal data. Automated processing can have benefits as well as costs. The examples given in the White Paper (person wrongly identified as IRA leader; loss of jobs, car licenses or voting rights because of wrong identification) are of situations where the automated processing led to a mistake. In such situations, there is no incentive for the processor to penalise the person. Since these are mistakes, is State intervention by creating a general right really required? In any case, this has little to do with data protection, and if it is being considered, this is the kind of protection that must be subjected to cost-benefit analysis.
  • The right to be forgotten: The White Paper seems to suggest that this right was endorsed by the Puttaswamy judgment. In the judgment, only one opinion discussed this right, and it cannot reasonably be considered the majority's opinion on the matter. The EU arrived at this big shift in the conceptualisation of the relationship between a person and society after a long process. Even in the EU, the right to be forgotten was replaced by a more limited right to erasure in the version of the GDPR adopted by the European Parliament in March 2014. We in India should not rush into such conceptions of privacy. In any case, this is not strictly a data protection issue.

Approach to data protection in government organisations

In a way, data protection in government organisations is more important than in private organisations, because a lot of the personal data that government organisations have was obtained under the implied threat of the coercive power of the State. However, experience from other sectors (eg. banking) suggests that enforcement measures that are usually effective on private entities become less effective on government organisations. The penalties that are used by regulators to coerce the regulated entities to follow the regulations work less effectively with government organisations. Monetary penalties ultimately impose a loss on the taxpayers. Criminal cases are often difficult to initiate against civil servants, and in India, because of the way jurisprudence has developed, a larger number of persons working in government organisations are considered to be civil servants.

In principle, neutral application of law to both private and public sector is good, and this should be a principle underpinning the proposed data protection law also. However, there is also a need to think about other ways of ensuring data protection in the context of government organisations. Once the DPA is established and it builds capacity, it could become an advisor and reviewer of data protection policies in government organisations, so that its expertise is used to prevent mistakes from being made. It could also serve on the boards of government organisations processing a lot of personal data (eg. UIDAI). The law should contain an enabling provision to allow government to appoint the DPA to periodically review the data protection-related policies of government organisations, and have audits of their implementation conducted under DPA's supervision.

Regulations, Flexibility and Innovation

Regulatory systems work well when there are clear regulations that need to be followed, and employees of the regulator, the regulated entities, and the consumers have clarity about them. It is good to have clarity and certainty in regulations. However, this rules-based system comes at the cost of less flexibility. Once a regulator specifies a regulation, there can be little room for innovation that violates the regulation in word, even if it follows it in spirit. This is a perennial tension, but in data protection regulation, there is probably a deeper tension.

At the heart of a consent-driven data protection system is a trade-off between valuing one's privacy and valuing beneficial uses of one's personal data. Technology has multiplied the ways in which a person can use her personal data for deriving economic and social benefits. The use, of course, needs to be based on consent of the user. When a user is giving consent, she is supposedly making some calculation about how she may benefit from that consent. However, often, it is not obvious beforehand what kinds and scale of benefits can be gained by sharing certain kind of data. The users may be able to make a better choice if they see examples and demonstrations. However, a robust data protection regime may limit possibilities of innovation without explicit consent. So, there can be a logjam - users may not give consent without seeing demonstration of benefits, and processors may not be able to innovate without access to a critical mass of data. The logjam is for a good reason - both data protection and innovation matter. This is just one example, and there can be many situations where regulation may restrict innovation that could have led to better solutions for both data protection and beneficial use. For instance, what kind of a notice and consent process will work is an issue over which innovative solutions can be found.

One way to overcome such problems is to create a space within the regulatory system for limited-scale innovations, where certain regulatory exemptions are given. This "regulatory sandbox" needs to be provided for in the law itself. Typically, a regulatory sandbox involves giving the regulator the power to oversee a closely supervised cohort of innovations for which certain regulatory exemptions are given. Once the lessons are documented, they may lead to modifications in regulations to allow such innovations. This is a participatory approach where the regulator and private participants work closely to help innovation happen. However, for this to happen, the law needs to empower the regulator to create these "safe spaces for innovation" that achieve the objective of data protection while enhancing productive uses of data.

Need for sound regulatory governance and due process to be required by law

As a regulator, there are three types of actions that the DPA will take: drafting of regulations/standards; executive functions of inspection, investigation, and recommending penalties or compounding violations; and the quasi-judicial function of adjudication of disputes. Regulators are mini-States that perform all three functions. This creates potential for abuse of powers. The law should provide checks and balances to ensure that these powers are used properly. This requires two types of provisions: regulatory governance of DPA, and due process to be followed by DPA.

The law should provide for a good design of the Board of the DPA. The law should also lay down the processes and grounds for appointing and removing board members. This is important for maintaining the independence of the DPA. For its independence, it is also important that the funding process for the DPA is given in the law. Further, for accountability, it is important that the DPA be mandated to make annual plans, and publish annual reports that include performance against the previous year's plan. Each type of regulatory action should be taken only after following due process, which should be laid down in the law. Independent authorities, such as the proposed DPA, have the power to be a judge in their own cases, i.e. their own officers adjudicate violations that have been investigated by officers of the same authority. This conflict needs to be managed through checks built into the law itself.

Recommendations for the proposed data protection law

Based on the analysis presented in this note, I would like to make the following tentative suggestions on the proposed data protection law:

  1. Protections and powers: Achieving informed consent should be the main focus of the law. To this end, certain individual rights need to be included. These include: right to seek confirmation, right to access the data, and right to rectify the data. However, these rights should be exercisable after paying reasonable fees, as they impose costs on data controllers/processors. The DPA should focus on building systems of regulation that ensure that the foundational requirement of informed consent is met in all circumstances, except where exemptions are given. This in itself is a difficult challenge in India's context. It would be great if the DPA is able to build capacity around solving this problem.

    The rights that should not be included at this stage are: Right to Object to Processing, Right to Object to processing for purpose of Direct Marketing, Right to not be subject to a decision based solely on automated processing, Right to Data Portability, Right to restrict processing, Right to be Forgotten. Among these, some could be given as objectives to the DPA with limited powers to nudge the processors towards better protection. On direct marketing and automated processing, the DPA may be given powers to work towards improving outcomes, so that some persons are not paying very high prices for these otherwise beneficial activities. Once the DPA gets this power, it may define certain thresholds above which it could intervene, but not use its coercive powers in situations below the threshold. This is a very different formulation from a formulation based on a general individual right.

    Similarly, "data minimisation","purpose limitation" and "storage limitation" should only be included as aspects of consent, and not included as general preventive measures to be enforced by the DPA. The DPA should focus on ensuring that if a user has given consent for certain data to be processed for a specific purpose, and has allowed storage for a certain period of time, the terms of this consent are actually being adhered to. Beyond this, the DPA should not have powers and responsibilities to make substantive judgments about these issues. That would be pushing the Authority into Quadrant IV. However, the DPA should be given the mandate to ensure minimum data security standards to avoid instances of breach.

    We should first get the basics right. Setting aside the debates about whether additional protections and preventive powers should ever be included in a data protection law, I am only suggesting not including them in the law in the first instance. In a few years, if the DPA is able to build capacity, and is able to deliver on the protections promised, additional protections may be debated, and introduced. Let us not forget that there is always a chance that it could become an ineffective, inefficient or even venal agency. Entrusting a new regulator with an expansive mandate on day one could be a recipe for failure.

  2. Tiered system: The law should create three categories of "services and applications" based on their importance for an average person: tier I (necessary services, such as healthcare, education, and financial services; plus processing with positive benefits to society), tier II (important but not necessary services, such as social media), and tier III (optional services, such as games). The law should mandate the DPA to put more resources into ensuring data protection for personal data shared for tier I, followed by tier II. The DPA should not focus on tier III usage; there, users should make their own choices and face the consequences. Reasonable persons can disagree on which services belong in which tier, but this is not an argument against the need for a tiered system. Further, the DPA should be mandated to focus on protecting sensitive personal data, and this category should be defined in the law.
  3. Jurisdiction: The jurisdiction should be limited to those entities that are processing in India. The DPA should not be given powers to "pursue" foreign entities to establish its jurisdiction over them, to bring them to process in India. Further, in sectors where regulators conducting regular supervision are already there, the responsibility for monitoring compliance and taking enforcement actions may be given to the respective regulators. Small organisations should be exempt from the law.
  4. Enable DPA to be the advisor/reviewer/auditor/board member for data protection in government organisations: The law should include an enabling provision for the government to appoint the DPA for advising government organisations on data protection policies and practices, reviewing their data protection policies and practices, and auditing implementation. DPA representatives could also serve on the board of organisations that handle a lot of personal data (eg. UIDAI). The DPA should, over time, develop into an organisation that can help the government take preventive measures to protect data, because ex-post measures are not likely to be effective with government organisations.
  5. Allow space for innovation, without compromising on the objective of the law: The law should empower the DPA to establish and oversee a regulatory sandbox to allow limited period trials of innovations that can be exempt from certain regulations. After these limited period pilots are documented, their experience may be used to modify the regulations.
  6. Board Composition: The DPA Board should have a majority of independent members, who may be experts, retired civil servants, consumer advocates, and others. The process of appointment as well as the grounds and process for removal of members should be laid down in the law. The Board should be required to make annual plans, and publish performance reports with annual reports every year.
  7. Due process requirements in the law: While making regulations, the DPA must publish draft regulations along with a statement on the legal authority to make the regulations, a statement of the problems to be solved, and an analysis of expected impact of the proposed regulation. After comments have been received, the DPA must be required to publish all the comments received, provide a reasoned response to the comments received, get the draft regulations formally approved by the board, and then publish the regulations. In case of emergency regulation-making, the requirements of consultation and analysis of regulatory impact may be relaxed, but such regulation should lapse after six months.

    The DPA will perform a variety of executive functions under this law, including inspections, investigations, and recommending penalties or compounding violations. Investigations should be carried out according to written terms of investigation; carried out by an appointed investigator; finished within a predetermined period, which may be extended by a quasi-judicial officer through a reasoned order; and carried out with the least disruption to business. Similarly, for recommending penalties or compounding violations, the DPA should be guided by detailed regulations requiring the authority to show proportionality and fairness. There must be a separate wing within the DPA that adjudicates violations. Members of this wing should not interact with, or report to, persons carrying out or overseeing the investigation functions.

Conclusion: the importance of being pragmatic

It is interesting how a matter that was not even on the radar of policymakers has suddenly become an absolute necessity. Because of the heightened sensitivity around this issue, and the opportunity that this has created to get a law passed, it is tempting to demand a comprehensive law that envisages a wide range of protections and powers, as well as an expansive jurisdiction. This carpe diem temptation is a trap that must be avoided. In my view, we should take a pragmatic approach towards the law. We should consider what kind of a law will help produce actual data protection outcomes.

A proposed law should be judged on the basis of its expected practical consequences, because ultimately we care about outcomes and not just expressions of good intent. In India, we have had many ambitious laws that did not lead to the expected outcomes, and some have actually made us worse off. The implementation of a law depends on a variety of context-specific factors, such as regulatory capacity, resource availability, the scale of the country, the capacity of the adjudication system, and so on. So, the same law may have very different practical consequences in India than it would have in, say, the UK.

Pragmatism demands careful thinking about the nature of the problem and the context in which it is to be addressed. The law should be such that it ensures good outcomes in the long run, even if it disappoints some folks in the short run. Many of the features of a data protection law that have been provisionally endorsed by the Committee and/or are being demanded by many stakeholders have only recently found their way into the laws of developed countries. Simply including them in a law will not create protections, as the outcomes of a regulatory law depend on the effectiveness of the regulatory system. The focus should be on ensuring that the law is such that the regulator is able to build the capacity to deliver on its mandate. Burdening a new data protection regulator with a broad mandate would likely set it up for failure.

We should begin with a law that gives a narrow mandate to the regulator, allow the regulator to build capacity to deliver on that mandate, and then expand its mandate. In my view, this narrowing of the mandate could entail: focusing specifically on achieving informed consent; giving limited preventive powers to the regulator on issues such as data security; limiting jurisdiction to entities processing in India, and not giving powers to demand jurisdiction on foreign entities; relying on sectoral regulators for monitoring and enforcement; applying the law only on entities that are above a threshold; and prioritising data protection for certain important purposes. In addition to narrowing the mandate, it is also important to mandate detailed checks and balances in the law, to minimise chances of abuse of powers.

The founding conditions cast a long shadow on the evolution of an organisation. If the law that establishes the data protection regime is rooted in careful consideration of the factors that are likely to shape its implementation, we will have a better chance of achieving good data protection outcomes.

 

Suyash Rai is a researcher at National Institute of Public Finance and Policy. The author would like to thank Renuka Sane, Anirudh Burman, Milan Vaishnav, and Ajay Shah for useful discussions. This essay is based on the comments the author has submitted to the Committee of Experts.
