Sunday, November 30, 2014
Saturday, November 29, 2014
Consequences of reversing the controls on gold imports
Impact on the Current Account Deficit?
Restrictions were put on gold imports as part of the currency defence of 2013. One important step towards removing these restrictions was taken yesterday. Some people are thinking "How much will the current account deficit go up by, now that more gold will be imported?"
For this to happen, you have to believe that the restrictions actually made a difference to gold imports. I feel that there was absolutely no decline in the gold purchases by Indian households when the restrictions came in. All that happened was that instead of going to a formal sector firm, the purchases were made from smugglers, which required a whole chain of `black money' transactions.
If you feel that the bulk of gold smuggling is paid for using hawala transactions i.e. misinvoicing, it seems that shifting between smuggling and legal imports makes no difference to the current account deficit. Ergo, removing the restriction will make no difference to the current account deficit.
Scarcity value
I suspect that when we made it difficult to buy gold, there was a greater perception of scarcity, and people became even more keen to stock up on gold. It takes years of sustained good behaviour by the government for people to get comfortable. One slip, and you unleash the psychology of scarcity all over again.
With gold, I think that after the liberalisation of the early 1990s, it took 5-10 years for people to get used to the idea that the government was not going to interfere with their gold purchases. These gains were lost in the 2013 rupee defence.
A similar story holds with outbound capital flows. Capital account restrictions came in 1942 and turned into a rigorous system of control in the 1950s. From that point onwards, Indian households have believed that there is value in holding assets outside the country. One big advance took place in the mid 1990s, when it became easy to travel abroad and use an Indian credit card. This solved the problem of needing hard currency to travel. But portfolio diversification remained a problem.
When the liberalisation of outbound flows began, this sense of scarcity started easing. Then came the capital controls on outbound flows as part of the currency defence of 2013. These controls have still not been reversed. This sent out the message that you cannot trust the Indian government, so it's wise to hold assets outside India as much as possible. The attraction of gold went up: buying gold in India is actually a capital outflow. The value of gold is safe from Indian monetary policy or capital controls.
After the actions of the currency defence of 2013 are undone, it will take years before people get back to the pre-2013 modes of behaviour. MOF will probably manage to get rid of the restrictions by 2014 or 2015, and perhaps by 2020 people will start shedding the scarcity value.
Wednesday, November 19, 2014
Reducing the cost of doing business in India: One example (ADR/GDR regulations)
by Pratik Datta and Ajay Shah.
Last week, the Ministry of Finance notified the Depository Receipts Scheme, 2014 (`2014 Scheme'). This replaces the 1993 Scheme with respect to depository receipts. The key materials on this subject are:
- The Sahoo Committee Report, which provided the intellectual framework, and drafted the 2014 Scheme.
- An article about this Report
- A brief video summarising the Report.
Improved economic thinking
The 1993 Scheme was a haphazard set of interventions by the government in the working of the economy, without a clear rationale. The 2014 Scheme is logical and clear in the role of the government. Every policy intervention is viewed from the standpoint of market failures. If there are demonstrable concerns about consumer protection, micro-prudential regulation, systemic risk or resolution, they motivate interventions. Where there are no market failures, there is no case for intervention by the government.
As an example of the improved economic thinking underlying the 2014 Scheme: the 1993 Scheme embeds industrial policy with names of many industries. The 2014 Scheme eschews industrial policy.
Improved drafting of law
Everyone interested in law and finance should print the 1993 Scheme and the 2014 Scheme and compare them, side-by-side.
Occam's razor can be adapted to the field of law with the idea that when there are multiple ways of drafting a particular policy choice into the law, the simplest should be preferred. The most sophisticated law is that which is the simplest. Simplicity, clarity and reduced legal risk have been achieved in the 2014 Scheme through many strands of thought.
Word count. The first test of simplicity is the word count. The 1993 Scheme (paragraphs 1 to 11) had 2984 words, while the new 2014 Scheme (paragraphs 1 to 11) has 1659 words - a 44.4% reduction. However, this difference is overstated, as the previous Scheme also dealt with FCCBs while the new Scheme does not.
Readability. When we apply the Flesch-Kincaid readability test, the 2014 Scheme wins clearly, with a score of 30 when compared with the 1993 Scheme, which stands at 21.5.
Structured document. For a given word limit, a well structured document is more comprehensible. The 2014 Scheme follows the logical sequence of a depository receipt transaction. The 1993 Scheme was a haphazard mess.
Use of examples. When drafting the Indian Penal Code, Thomas Babington Macaulay intentionally included many terse, exemplary cases to illustrate the application of a provision. The Justinian Code and writings of Roman jurists persuaded him to make clear the legislative intent: `they are cases decided not by the judges but by the legislature, by those who make the law, and who must know more certainly than any judge can know what the law is which they mean to make'. This unusual innovative feature of the Code earned the praise of John Stuart Mill, who wrote: `besides the greater certainty and distinctness given to the legislator's meaning, [it] solves the difficult problem of making the body of the laws a popular book, at once intelligible and interesting to the general reader'. The 2014 Scheme includes examples which clarify the law.
Minimal set of defined terms. The 2014 Scheme uses a standardised set of defined terms. For example, it consistently uses a single defined term - `international exchange' - for the institutions on which depository receipts can be listed. In contrast, the 1993 Scheme used three different terms for these institutions - `overseas stock exchanges', `over the counter exchanges' and `book entry transfer systems'. None of these terms was defined. This created legal risk.
Principles-based definitions. With the rapid development of technology, the concept of an `exchange' itself has evolved substantially. To make the law neutral to such evolution, the 2014 Scheme is technology-neutral and principles-based in its definition of `international exchange'.
Rationale statement. Finally, there is the backdrop of the drafting intent and rationale of the Scheme. The 1993 Scheme was not backed by a document articulating what was sought to be done. The 2014 Scheme is accompanied by the Sahoo Committee Report which performs this function. When faced with litigation in the future, practitioners and judges will be able to use this document to reduce legal risk.
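Both the word-count and readability checks above are easy to reproduce. The sketch below is illustrative: the syllable counter is a rough vowel-group heuristic, so the Flesch reading ease score it produces will differ somewhat from dedicated tools, and the reader must paste in the Scheme texts themselves.

```python
import re

def flesch_reading_ease(text):
    """Flesch reading ease: higher scores mean easier text.
    Syllables are approximated by counting vowel groups, so this is
    a rough estimate, not a reference implementation."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = text.split()
    n_words = len(words)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

# Paste paragraphs 1 to 11 of each Scheme here to reproduce the comparison.
scheme_1993 = "..."
scheme_2014 = "..."
for name, text in [("1993 Scheme", scheme_1993), ("2014 Scheme", scheme_2014)]:
    print(name, "words:", len(text.split()),
          "reading ease:", round(flesch_reading_ease(text), 1))
```

A falling word count together with a rising reading ease score operationalises the simplicity test described above.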
You will recognise the IFC way here. Also see Umakanth Varottil on the 2014 Scheme.
Clearing the thicket of India's capital controls
The Indian system of capital controls comprises the FEM Act, regulations under FEMA, RBI circulars, etc. The 1993 Scheme is representative of the mess that is the Indian system of capital controls. The cost of doing business in India is being greatly raised by the badly thought out and badly implemented capital controls that are all over the landscape.
Capital controls on ADR/GDR issuance by Indian companies are the only element, out of the overall system of capital controls, where the Ministry of Finance is able to initiate reforms. In all other areas, RBI has veto power, and has blocked all progress.
As the years go by, it is increasingly difficult to justify these failures. RBI needs to find the intellectual capabilities to clean up the mess. There is a lot to learn from the clarity of thinking, and the implementation strategies used, in fixing the capital controls on ADR/GDRs.
Sunday, November 16, 2014
The imprecision of volatility indexes
by Rohini Grover and Ajay Shah.
A remarkable feature of options trading is that it reveals a forward-looking measure of the market's view of future volatility. This was first done by CBOE in 1993 with the S&P 500 index options, with an information product named `VIX' which reveals the market's view of future volatility of the US stock market index. CBOE computes and disseminates VIX every 15 seconds. VIX is often termed a `fear index' as it conveys the fears of the market. It has found numerous applications:
- Time-series econometrics processes historical data to help us make statements about the future. VIX brings a unique forward-looking perspective into time-series analysis.
- VIX measures uncertainty in the economy e.g. when examining the effect of macroeconomic shocks (Bloom, 2009).
- The international finance literature has emphasised the role of VIX in shaping capital flows to emerging markets [example]. E.g. it is interesting to look at what happened in India on the days in which a very large rise or fall of the VIX took place.
- Trading strategies can be constructed which employ VIX as a tool for making decisions for switching between positions.
- VIX based derivatives offer methods to directly trade on VIX. In the US, CBOE introduced VIX futures and options on March 26, 2004 and February 24, 2006 respectively. In 2012, the open interest for these contracts was at 326,066 contracts and 6.3 million respectively. In India, futures on VIX have been launched at NSE, but the contract has not taken off.
In this field, we treat VIX as a hard number -- we talk about a value of VIX such as 24.53 as if it is known precisely. But VIX is computed from a set of option prices. These option prices suffer from microstructure noise and from the limits of arbitrage. Each option price is only an imprecise reflection of the thinking of the market. This raises concerns about the extent to which imprecision in option prices spills over into imprecision of VIX.
In a recent paper, The imprecision of volatility indexes, we offer a method for measuring the imprecision of VIX, and find that the measurement noise is economically significant.
VIX is a statistical estimator working on a dataset of option prices. Different estimators exist (e.g. the old VIX vs. the new VIX). Regardless of the estimator used, the foundation -- the option price -- suffers from microstructure noise and is shaped by the limits of arbitrage. Noise in option prices will induce noise in VIX. The only question is how imprecise VIX is.
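To make the estimator concrete, here is a sketch of the "new VIX" computation, in the spirit of the CBOE 2003 model-free methodology: a discretised sum over out-of-the-money option quotes. The numbers fed in below are toy values for illustration, not market data, and a production implementation would additionally handle strike selection and interpolation across two expiries.

```python
import math

def model_free_vix(strikes, quotes, f, k0, t, r):
    """Model-free variance estimator in the spirit of the 'new' (2003) VIX:
    sigma^2 = (2/T) * sum_i (dK_i / K_i^2) * e^(rT) * Q(K_i)
              - (1/T) * (F/K0 - 1)^2
    strikes: sorted OTM strikes; quotes: their mid-quotes Q(K);
    f: forward index level; k0: first strike below f;
    t: time to expiry in years; r: risk-free rate."""
    total = 0.0
    for i, (k, q) in enumerate(zip(strikes, quotes)):
        if i == 0:                      # one-sided spacing at the edges
            dk = strikes[1] - strikes[0]
        elif i == len(strikes) - 1:
            dk = strikes[-1] - strikes[-2]
        else:                           # half the distance between neighbours
            dk = (strikes[i + 1] - strikes[i - 1]) / 2.0
        total += (dk / k ** 2) * math.exp(r * t) * q
    var = (2.0 / t) * total - (1.0 / t) * (f / k0 - 1.0) ** 2
    return 100.0 * math.sqrt(var)

# Toy single-expiry example (illustrative numbers only)
strikes = [90.0, 95.0, 100.0, 105.0, 110.0]
quotes = [0.50, 1.50, 4.00, 1.60, 0.60]
print(round(model_free_vix(strikes, quotes, f=100.5, k0=100.0,
                           t=30 / 365, r=0.0), 2))
```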
For an analogy, consider estimation of LIBOR. Each dealer reports a reading of LIBOR. We recognise that each value obtained is a noisy estimator of the true LIBOR. This is aggregated using a simple robust statistics procedure to obtain an estimate of LIBOR. Bootstrap inference is used to obtain a confidence interval about how imprecise our estimator of LIBOR is. [Cita and Lien, 1992, Berkowitz, 1999, Shah, 2000].
A similar strategy can be applied to the measurement of VIX. Each option should be seen as a noisy estimator of the implied volatility. Bootstrap inference can then be used to create a distribution of the vega-weighted VIX (VVIX). This yields a confidence interval for VVIX.
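A minimal sketch of this procedure, treating VVIX as a vega-weighted average of implied volatilities as described above: resample the option set with replacement, recompute the weighted average each time, and read the confidence interval off the percentiles. The implied volatilities and vegas below are made-up illustrative values.

```python
import random

def vega_weighted_vix(ivs, vegas):
    """Point estimate: vega-weighted average of option implied volatilities."""
    return sum(iv * v for iv, v in zip(ivs, vegas)) / sum(vegas)

def bootstrap_ci(ivs, vegas, n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval: resample the option set
    with replacement, recompute VVIX each time, take the quantiles."""
    rng = random.Random(seed)
    n = len(ivs)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(vega_weighted_vix([ivs[i] for i in idx],
                                       [vegas[i] for i in idx]))
    stats.sort()
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Illustrative (not real) data: implied vols in percent, and vegas
ivs = [15.2, 16.8, 17.5, 18.1, 18.9, 19.4, 20.2, 21.0]
vegas = [0.8, 1.1, 1.4, 1.5, 1.5, 1.3, 1.0, 0.7]
print(vega_weighted_vix(ivs, vegas))
print(bootstrap_ci(ivs, vegas))
```

Resampling whole options (an implied vol together with its vega) preserves the dependence between the two, which is what makes the percentile interval a valid statement about the weighted average.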
As an example, consider a sample of end-of-day S&P 500 options on 17th September, 2010. The implied volatilities of the at-the-money (ATM) options with 29 days to expiry show significant variation, taking values between 15% and 21%. Our methods yield a distribution of the estimated VVIX:
As the graph above shows, the point estimate for the VVIX is 21.53% and this is what all of us are used to talking about. But the noise in option prices has induced an economically significant imprecision in our estimated VVIX. The 95% confidence band, which runs from 20.8% to 22.32%, is 1.5 percentage points wide. This turns out to be an economically significant number: the one-day change in VVIX is smaller than 1.5 percentage points on 62% of the days. This suggests that on 62% of the days, we know little about whether VVIX went up or down when compared with the previous day.
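The comparison in the paragraph above can be checked mechanically: given a daily series of the index, count how often the absolute one-day change is smaller than the width of the confidence band. The series below is synthetic, for illustration only.

```python
def frac_changes_within(series, band_width):
    """Fraction of days whose absolute one-day change in the index
    is smaller than the width of the confidence band."""
    changes = [abs(b - a) for a, b in zip(series, series[1:])]
    return sum(c < band_width for c in changes) / len(changes)

# Synthetic daily index values (illustrative, not market data)
series = [21.5, 22.1, 21.8, 23.9, 23.2, 22.9, 21.0, 21.4, 21.3, 22.0]
print(frac_changes_within(series, 1.5))
```

On days counted by this fraction, the day-on-day move is lost inside the confidence band, so one cannot say with confidence whether the index rose or fell.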
To conclude, we took a huge step forward when options trading improved our knowledge of the universe by giving us a forward-looking estimator of future uncertainty: a quantitative peek into the structure of expectations of traders. However, market microstructure noise and the limits of arbitrage hamper this. VIX is not a hard number; there is economically significant imprecision in our observation of VIX. This insight may make a difference to many applications of VIX.
Once we start using the term `estimation' about VIX, we must pursue improvements in the estimation of VIX in order to address these issues of microstructure noise and the limits of arbitrage. One first step in this direction is Grover & Thomas, Journal of Futures Markets, 2012.