## Monday, May 10, 2021

### Backdoors to Encryption: Analysing an Intermediary's Duty to Provide 'Technical Assistance'

by Rishab Bailey, Vrinda Bhandari, and Faiza Rahman.

The rising use of encryption is often said to be problematic for law enforcement agencies (LEAs) in that it directly impacts their ability to collect data required to prosecute online offences. While certainly not a novel issue, the matter has risen to global prominence over the last four or five years, possibly due to the increased usage of privacy enhancing technologies across the digital ecosystem.

While there have been a number of policy proposals that seek to address this perceived impasse, no globally accepted best practice or standard has emerged thus far. In India (as in many other jurisdictions), the government has increasingly sought to regulate the use of encryption. For instance, the recently announced Intermediary Guidelines under the Information Technology Act, 2000, seek to extend the "technical assistance" mandate of certain intermediaries to ensure traceability, by enabling identification of the first originator of information on a computer resource. The scope of the term "technical assistance" has not been clearly defined. However, the provision appears to go well beyond existing mandates in the law, which require holders of encryption keys to provide decryption assistance when called upon to do so, in accordance with due process, and to the extent they are capable of decrypting the information. Courts have also weighed in on this debate, with the Madras High Court and the Supreme Court hearing petitions that seek to create mechanisms whereby LEAs could gain access to content protected by end-to-end encryption (E2E), thereby enabling access to user conversations on popular platforms such as WhatsApp. A Rajya Sabha Ad-hoc Committee Report released in 2020 has also recommended that LEAs be permitted to break or weaken E2E to trace distributors of child sexual abuse content.

Against this background, our recently released paper examines the scope of the obligations that ought to be imposed on intermediaries to provide "technical assistance" to LEAs, and whether these should extend to weakening encryption standards, for instance through the creation of backdoors. Broadly speaking, the term "backdoors" refers to covert methods of circumventing encryption systems without the consent of the owner or the user. The paper also briefly evaluates proposed alternatives, such as the use of escrow mechanisms and ghost protocols.

We argue that the government should not impose a general mandate requiring intermediaries either to weaken encryption standards or to create backdoors in their products/platforms. Such a mandate would significantly affect the privacy of individuals and constitute a disproportionate infringement of the right to privacy. It is also likely to fail a cost-benefit analysis, not least in view of the possible effects on network security, as well as broader considerations such as the growth of the Indian market in security products, geopolitical factors, etc. This does not, however, mean that LEAs have no options when faced with the prospect of having to access encrypted digital data. A first step would be to implement rights-respecting processes that enable law enforcement to access data collected by intermediaries in a timely manner. In addition, there should be greater focus on enhancing government and law enforcement capacities, including by developing hacking capabilities (subject to sufficient oversight and due process checks) and by increasing funding for research and development in the cybersecurity and cryptography spaces.

This post seeks to throw light on the key issues around the encryption debate, and summarises our main arguments and suggestions on how India should address them.

### Understanding the encryption debate

Encryption is the process of using a mathematical algorithm to render plain, understandable text into unreadable letters and numbers (Gill, 2018). Typically, an encryption key is used to carry out this conversion, and converting the encrypted text back to plain text likewise requires a key. Depending on the scheme, the same key may be used both to encrypt and to decrypt information (symmetric encryption), or different encryption and decryption keys may be required (asymmetric encryption). Encryption therefore ensures that a message can only be read by a person holding the appropriate decryption key, particularly as modern forms of encryption make it computationally infeasible to reverse the process without the key (Gill, 2018).
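The mechanics described above can be illustrated with a deliberately simple symmetric cipher in Python: the same key both encrypts and decrypts, and without it the ciphertext is unreadable. This XOR construction is a toy for illustration only and is not secure; real systems use vetted ciphers such as AES.

```python
# Toy illustration of symmetric encryption: the SAME key encrypts and
# decrypts. NOT secure -- shown only to make the mechanics concrete.
import hashlib
from itertools import count

def keystream(key: bytes):
    # Derive an endless stream of pseudo-random bytes from the key.
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR-ing with the same keystream twice returns the original data,
    # so one function serves as both "encrypt" and "decrypt".
    ks = keystream(key)
    return bytes(b ^ next(ks) for b in data)

key = b"shared secret"
ciphertext = xor_cipher(key, b"meet at noon")
assert ciphertext != b"meet at noon"                    # unreadable as-is
assert xor_cipher(key, ciphertext) == b"meet at noon"   # same key recovers it
assert xor_cipher(b"wrong key", ciphertext) != b"meet at noon"
```

In an asymmetric scheme, by contrast, the encryption key can be published while only the holder of the separate decryption key can read the message.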

Encryption essentially improves the security of information. It secures information against unwarranted access and ensures the confidentiality and integrity of data, thereby fostering trust in the digital ecosystem and protecting the private information of citizens and businesses alike.

However, the use of encryption can also enable criminals to "go dark", making it difficult for LEAs to carry out their functions. For instance, it is estimated that upwards of 22 percent of global communication traffic uses end-to-end encryption (Lewis et al., 2017). This puts over a fifth of communications virtually out of reach for LEAs, not least because modern encryption systems make it harder for LEAs to use traditional "brute force" methods to access encrypted data (Haunts, 2019). LEAs have therefore increasingly called for limitations on the use of encryption, so as to have access to the information they require to pursue their law enforcement functions. They point to the need to ensure accountability for online harms, and accordingly argue that intermediaries must provide them with all data relevant to an investigation.
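To see why brute force fails against modern encryption, consider how the keyspace grows with key length. The back-of-the-envelope sketch below assumes an optimistic (and hypothetical) rate of a billion key checks per second; the qualitative conclusion does not depend on the exact figure.

```python
# Back-of-the-envelope: exhaustive key search time vs. key length.
# The check rate is an assumed, optimistic single-machine figure.
checks_per_second = 10**9
for bits in (20, 56, 128, 256):
    keys = 2**bits
    years = keys / checks_per_second / (365 * 24 * 3600)
    print(f"{bits:>3}-bit key: {keys:.3e} keys, ~{years:.3e} years to exhaust")
```

A 20-bit toy key falls in about a millisecond, and the 56-bit keys of the old DES standard are within reach of dedicated hardware, but a 128-bit key would take on the order of 10^22 years at this rate, which is why LEAs cannot simply compute their way past well-implemented modern ciphers.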

The concerns with the use of encryption are driven by a number of factors, such as the growing incidence of cybercrime, the use of data minimisation practices such as disappearing messages, and the use of encryption by default in various technology products. For instance, WhatsApp and Signal automatically encrypt communications in transit and also give users the option of automatically deleting their messages. Similarly, Apple uses encryption-based authentication on its iPhones, which renders content accessible only when the correct passcode is provided, and can even wipe the device after a certain number of failed attempts (Lewis et al., 2017).

These concerns have led to calls for Internet intermediaries to weaken encryption standards or create backdoors in their products/services. Such demands are not new. Notably, the 1990s saw the issue debated in the United States, with the FBI proposing the "Clipper Chip", a mechanism whereby decryption keys would be copied from users' devices and sent to a trusted third party, from whom LEAs could obtain them on appropriate authorisation. More recently, the FBI has faced off with technology companies such as Apple, which refused to provide exceptional access to an iPhone linked to a terrorist. In India too, the government has encountered similar issues, notably forcing BlackBerry's makers to relocate their servers to India and hand over plain text of communications. The government also circulated a draft National Encryption Policy in 2015, which sought to impose obligations involving registration of encryption software vendors and storage by intermediaries of plain text of user data. The draft was, however, withdrawn after much criticism.

In response to such proposals, security researchers, cryptographers and service providers have been near-unanimous in pointing out that the creation of backdoors is likely to impose significant costs on the entire digital ecosystem, especially as it exposes the entire population to vulnerabilities and security threats. Indeed, the need for stronger encryption and other security standards to protect user data is only heightened by the numerous and frequent data breaches reported in India. Interestingly, even the Telecom Regulatory Authority of India has adopted a similar position in its Recommendations on Regulatory Framework for OTT Communication Services of 2020.

Even the two commonly discussed "balanced solutions" to the problem - the use of escrow mechanisms and ghost protocols - have faced significant criticism. For instance, escrow mechanisms (which, as with the Clipper Chip system described above, involve storage of the decryption key with a trusted third party, who can then provide it to LEAs when called upon to do so) are likely to create significant vulnerabilities in computer systems. Not only would such a system require faith in the integrity of the entity holding the decryption key, that entity would constitute a single point of failure, which is poor system design (Kaye, 2015). Deploying complex key recovery infrastructure is also likely to impose huge costs on the ecosystem (Abelson et al., 1997). Similarly, suggestions for using ghost protocols (which would require service providers to secretly add an extra LEA participant to private communications) have faced significant criticism (Levy and Robinson, 2018). Given that this system would essentially require service providers to convert a private conversation between two individuals into a group chat with a hidden third participant, critics have argued that it is just another form of backdoor. It would erode trust between consumers and service providers, and place a "dormant wiretap in every user's pocket" that can be activated at will (Cardozo, 2019). It would also require fundamental changes in system architecture, thereby introducing vulnerabilities that create threats for all users on these platforms (Access Now et al., 2019).

Thus, while the use of such methods could enable LEAs to access user data more quickly than is currently possible, there are numerous concerns from civil liberties, economic and technical perspectives. We outline the key concerns below.

### Concerns with mandating backdoors

• Privacy: In view of the recognition of privacy as a fundamental right, private thoughts and communications are protected from government intrusion subject to satisfaction of tests of necessity and proportionality. Mass surveillance can be considered to be per se disproportionate. It is recognised that government surveillance can lead to unwanted behavioural changes, and create a chilling effect. Encryption therefore serves as a method to protect individual privacy, particularly from government excesses.
• Security: Creating backdoors can weaken network security as a whole since it can be exploited by governments and hackers alike (Abelson et al., 2015). Backdoors can also lead to increased complexity in systems, which can make them more vulnerable to attack (Abelson et al., 2015).
• Right against self-incrimination: Mandating decryption of data can arguably also be seen as violating an individual's right against self-incrimination (Gripman, 1999; ACLU and EFF, 2015).
• Due process requirements: Criminal investigation in general, and surveillance in particular, is not meant to be a frictionless process; such friction is part of what separates a democracy from a police state (Richards, 2013; Hartzog and Selinger, 2015). Like due process requirements, encryption creates procedural hurdles, ensuring some checks and balances on the functioning of LEAs and guarding against the possibility of mass surveillance. It therefore helps re-balance the asymmetric distribution of power between the State and the citizen.

### Scope of "technical assistance": Should it extend to creating backdoors?

Given the aforementioned concerns, the question arises: should the duty of "technical assistance" that intermediaries are required to provide to LEAs extend to creating backdoors or otherwise weakening encryption systems?

We argue that as far as recoverable encryption is concerned - i.e. encryption where the service provider already holds a decryption key in the normal course of service provision - there is no requirement for such a mandate. Indian law already requires service providers to decrypt data in such cases, in addition to providing various other forms of assistance. Here, the need is to implement proper oversight and procedural frameworks to ensure that LEAs exercise their powers of surveillance and decryption appropriately. We find, however, that the Indian framework is lacking in this regard: there is no judicial oversight of decryption requests, no proportionality requirement in the law, and no meaningful checks and balances over decryption processes. We therefore propose various changes to improve the transparency and accountability of the system. Further, research indicates that the primary problem for LEAs in India may relate to the old and slow processes they must use when accessing data held by intermediaries, particularly those based outside India. This points more to the need for LEA data access processes to be revised and streamlined in accordance with modern needs.

As far as unrecoverable encryption is concerned - i.e. encryption where even the service provider cannot access the content (as with E2E), since the decryption key is retained by the user - the situation is undoubtedly more complex. However, even in such instances, for the reasons elaborated above, we believe that mandating backdoors or weakening encryption is not an appropriate solution.

Moreover, LEAs already have multiple alternatives for collecting information, including accessing metadata and unencrypted backups of encrypted communications. They can also use targeted surveillance methods to conduct investigations (National Academies of Sciences, Engineering, and Medicine, 2018). Indeed, the current Indian framework - governing telecom service providers in particular, but also other intermediaries - already gives significant and arguably excessive powers to the State. It should also be noted that LEAs in India are already using spyware, as seen in the Pegasus case, and have other covert methods of gathering data, from keystroke-logging programmes to exploiting weaknesses in the implementation of encryption systems. While one cannot argue against the use of such tools in appropriate cases, such powers must only be exercised through institutionalised processes and, importantly, subject to appropriate regulatory oversight. There is therefore a case for formulating a legal framework in India, along the lines of the US vulnerabilities equities process, to ensure due process even when the government resorts to exploiting vulnerabilities in information systems for national security and law enforcement purposes.

Accordingly, we point to the need to carry out a more detailed cost-benefit analysis before deciding whether to implement such a mandate (which, unfortunately, was not done in the case of the recent Intermediary Guidelines Rules). Such an analysis should consider:

• Whether the use of unrecoverable encryption is indeed a significant hurdle for LEAs in collecting relevant information. While no data is available for India, US data for the period 2012-2015 indicates that of the roughly 14,500 wiretaps ordered under the Communications Assistance for Law Enforcement Act, only about 0.2 percent encountered unrecoverable encryption (Lewis et al., 2017). While this share has likely increased given the greater use of unrecoverable encryption in the ecosystem, a similar empirical analysis must be conducted in India to understand the impact of such encryption.
• The costs to intermediaries of changing their platform architecture, which are unlikely to be insignificant. It is also worth keeping in mind that intermediaries will often avoid using certain types of encryption purely to stay in the good books of LEAs, in a form of "weakness by design". Notably, companies such as Apple and WhatsApp have dropped plans to encrypt user back-ups stored in the cloud; such data can therefore be accessed by LEAs without compromising encryption.
• The risk of such laws getting caught up in global geopolitics. This has been the case for example, with Huawei and ZTE, who have faced significant international pressure in view of the Chinese government's purported ability to access data flowing through their networks.
• The likely effectiveness of such laws, considering that many criminals may use open source encryption or encryption from platforms not amenable to Indian jurisdiction. Further, the pace of technical development is difficult to keep up with from a regulatory perspective. Notably, institutions such as Europol and Interpol are increasingly concerned about the use of steganography (the technique of hiding the very existence of a message) and open source encryption by international criminals and terrorist groups. Therefore, even if there is a bar on using strong encryption, those who want to break this law will continue to do so.
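The scale of the US wiretap figure in the first point above is worth making explicit; the arithmetic below is a simple illustration of the cited statistics, not new data.

```python
# Illustrative arithmetic for the Lewis et al. (2017) figures:
# ~14,500 wiretaps in 2012-2015, of which ~0.2% hit unrecoverable encryption.
wiretaps = 14_500
share_unrecoverable = 0.002  # 0.2 percent
affected = round(wiretaps * share_unrecoverable)
print(affected)  # roughly 29 wiretaps over four years
```

In other words, on the cited data, unrecoverable encryption frustrated only a few dozen wiretaps over a four-year period, which is why an India-specific empirical baseline matters before legislating.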

We therefore argue that while a mandate for targeted decryption or technical assistance may be constitutional if backed by a law with sufficient safeguards, a general mandate for the creation of backdoors (or an interpretation of the Intermediary Guidelines' "technical assistance" requirement as extending to such generic obligations) is unlikely to pass constitutional muster, assuming a high intensity of proportionality review is applied. Such a review would have to look not just at whether the proposed intervention would substantially improve national security, but would also need to engage with the facts that (a) it would compromise the privacy and security of individuals at all times, regardless of whether there is any evidence of illegal activity on their part, and (b) alternative means are available to LEAs to carry out their investigations. Thus, we believe that a general mandate for creating backdoors would not be the least restrictive measure available.

### Conclusions and Recommendations

We argue that a general mandate that requires Internet intermediaries to break encryption, use poor quality encryption, or create backdoors in encryption is not a proportionate policy response given the significant privacy and security concerns, and the relatively less harmful alternatives available to LEAs. Instead, the Indian government should support the development and use of strong encryption systems.

Rather than limiting the use of certain technologies, or mandating significant changes to intermediaries' platform/network architecture that compromise encryption, the government ought to take a more rights-preserving and long-term view of the issue. This would enable a more holistic consideration of the interests involved, avoid unintended consequences, and limit the costs that come with excessive government interference in the technology space. The government's focus must be on achieving optimal policy results while reducing costs to the ecosystem as a whole (including privacy and security costs). A substantive mandate limiting the use of strong encryption would increase costs for the entire ecosystem, without commensurate benefits as far as state security is concerned.

The tussle between LEAs and criminal actors has always been an arms race. Rather than adopting steps that may have significant negative effects on the digital ecosystem, the government could learn from the policies adopted by countries such as Germany, Israel and the USA. This would involve interventions along two axes - legal changes and measures to enhance state capacity.

Legal changes that the government must consider implementing include:

• Reforming surveillance and decryption processes, to clarify the powers of LEAs, and ensure appropriate transparency, oversight and review. It is also essential to standardise and improve current methods of information access by LEAs at both domestic and international levels. There must be greater transparency in the entire surveillance and information access apparatus, including by casting obligations on intermediaries and the State to make relevant disclosures to the public.
• Adoption of a Vulnerabilities Equities Process, such as that adopted in the United States, which could enable reasoned decisions to be made by the government about the disclosure of software/network vulnerabilities (thereby allowing these to be patched, in circumstances where this would not significantly affect security interests of the State). Such a process, while not without critics, does chart a path forward and must become central to the Indian conversation around due process in LEA access to personal data.
• Amending telecom licenses, which currently give excessive leeway for the exercise of executive authority without sufficient checks or safeguards.

Rather than implementing ill-thought-out policy solutions that would significantly harm the digital ecosystem and user rights, the government could also focus on enhancing its own capacities. This can include measures such as:

• Developing and enhancing covert hacking capacities (though these must be implemented only subject to appropriate oversight and review processes). To this end, there must be appropriate funding of LEAs, including by hiring security and technical researchers.
• Investing in academic and industry research into cryptography and allied areas. The government should also aid the development of domestic entities who can participate in the global market for data security related products. Enhancing coordination between industry, academia and the State is essential.
• Increasing participation in international standard setting and technical development processes.

To conclude, the crux of this issue can be understood through an analogy. Would it be prudent for a government, engaged in a fight against black money, to require all banks to deposit with it a key to their customers' safe deposit boxes? One would venture that this would be an unworkable proposition in a democracy. It would lead people to seek alternatives to safe deposit boxes, given the lack of trust such a system would create, and innocent people would be exposed to increased risks. A preferable solution may be for the government to develop the ability to break into a specific safe deposit box, upon learning of its illegal contents and subsequent to following due process. This would enable more targeted interventions, while preserving the broader privacy interests of innocent customers and protecting banks from increased costs (or loss of business).

### References

Gill, 2018: L Gill, Law, Metaphor and the Encrypted Machine, Osgoode Hall L.J. 55(2) 2018, 440-477.

Lewis et al., 2017: James Lewis, Denise Zheng and William Carter, The Effect of Encryption on Lawful Access to Communications and Data, Center for Strategic and International Studies, February 2017.

Haunts, 2019: Stephen Haunts, Applied Cryptography in .Net and Azure Key Vault: A Practical Guide to Encryption in .Net and .Net Core, APress, February 2019.

Kaye, 2015: David Kaye, Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, United Nations, Human Rights Council, May 2015.

Abelson et al., 1997: Hal Abelson, Ross Anderson, Steven Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter Neumann, Ronald Rivest, Jeffrey Schiller, and Bruce Schneier, The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption, May 27, 1997.

Abelson et al., 2015: Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter, and Daniel J. Weitzner, Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications, Journal of Cybersecurity 1(1), 2015.

Levy and Robinson, 2018: Ian Levy and Crispin Robinson, Principles for a More Informed Exceptional Access Debate, LawFare Blog, November 29, 2018.

Cardozo, 2019: Nate Cardozo, Give Up the Ghost: A Backdoor by Another Name, Electronic Frontier Foundation, January 7, 2019.

Access Now et al., 2019: Access Now, Big Brother Watch, Center for Democracy and Technology, et al., Open Letter to GCHQ, May 22, 2019.

Gripman, 1999: David Gripman, Electronic Document Certification: A Primer on the Technology Behind Digital Signatures, 17 J. Marshall J. Computer and Info. L. 769 (1999).

ACLU and EFF, 2015: American Civil Liberties Foundation of Massachusetts, the American Civil Liberties Union Foundation, and Electronic Frontier Foundation, Brief for Amici Curiae in Support of the Defendant-Appellee in Commonwealth of Massachusetts v. Leon Gelfgatt, 2015.

Richards, 2013: Neil Richards, Don't Let US Government Read Your E-Mail, CNN, August 18, 2013.

Hartzog and Selinger, 2015: Woodrow Hartzog and Evan Selinger, Surveillance as Loss of Obscurity, Washington and Lee L.R. 72(3), 2015.

National Academies of Sciences, Engineering, and Medicine, 2018: Decrypting the Encryption Debate: A Framework for Decision Makers, National Academies Press, Washington DC, 2018.

Rishab Bailey is a researcher at NIPFP. Vrinda Bhandari is a practising advocate. Faiza Rahman is a PhD candidate at the University of Melbourne.
