2016-10-22

Did the US Treasury read my blog?

In September 2014, in my blog Change of benchmark overnight index is a difficult task, I described why I thought that changing the benchmark for the rate paid on collateral, or used as a reference for swaps, is a difficult task. That blog was a reaction to comments from senior derivative figures indicating the opposite.

Fast forward two years, and it seems that my message has been heard by the US Treasury. Yesterday, in a Risk article titled US Treasury: prepare for a post-Libor world (subscription required), one could read:

Daleep Singh, US department of the Treasury:
Any transition away from a dominant benchmark will surely be complex and lengthy[…]

To compare with my blog:
Change of benchmark overnight index is a difficult task.
Run, up to the maturity of the longest trade existing in the world linked to fed funds, two parallel markets.

Risk magazine
Concern among market participants […] A similar move could result in disputes between counterparties about the value of legacy portfolios.

To compare to my blog:
[...]but in the same way all the parties have to agree on the valuation (as an up-front fee or a fee over the life of the trade) impact of the transfer.

My task is not over; I still have to get my rant about the term "OIS discounting" (see the blog referenced above and my multi-curve book) heard more widely. There is still too much confusion between the index referenced in swaps, the fixed rate of a swap, the index referenced in CSAs and the rate used for discounting.

My regular readers will have noticed that this is not the first time a subject discussed in my blog appears in the press a couple of months or years later. Recently it has been the case for FRTB / AD (see Good news for AD?) and coupon payments in IM (see Continuous dividend v discrete cash flows).

2016-10-16

Writing!

Picture taken in New York on the “Library Way” situated on East 41st Street.


It summarizes my feeling about writing, except that I have not reached the final stage, the stage of trouble. At least I don’t think so.

Writing your name can lead to writing sentences. And the next thing you’ll be doing is writing paragraphs, and then books. And then you’ll be in as much trouble as I am!
Jerome Lawrence and Robert E. Lee, The Night Thoreau Spent in Jail.

2016-10-03

Public analysis

I have opened a new public repository on GitHub.

There was already a repository with the code associated with my forthcoming book Algorithmic Differentiation in Finance Explained. That repository is:

https://github.com/marc-henrard/algorithmic-differentiation-book

I have made public a second repository, called analysis. The idea of the repository is to share analyses of financial situations using quantitative methods. Instead of simply reporting a description of the results through my blog, I will also make the code of the analysis publicly available when it makes sense.

The new repository is available at

https://github.com/marc-henrard/analysis


The first code is the one used in the Imaginary Butterfly blog post.

As usual, comments are welcome.

2016-10-02

Book on its way!

After stalling for a couple of months, my book “Algorithmic Differentiation in Finance Explained” is moving again!

I sent the final draft to the editor this weekend. A couple of months of copy-editing and other technical processing are still required.

Maybe the book will be available by the end of the year. If you are looking for a useful, timeless (and cheap) Christmas gift, look no further!

More details as soon as I receive them from the editor!

2016-08-09

Le franc restera-t-il le franc?

Going through boxes of old books inherited from my grandparents, I found a small book from 1945 titled “Le franc restera-t-il le franc?” [1]. In English and in today’s vocabulary, it could be translated as “Will the Euro stay the Euro?” This is a good question!

Reading through the introduction and the first chapters, I found it very current. The last sentence of the introduction reads (my translation):
Why always bury our heads in the sand, close our eyes, use subterfuges, adopt monetary palliatives instead of taking a dispassionate view of certain truths that are not nice to say but that are inexorably revealed by the mathematical deduction of the laws of economics.

Yes, a very current question indeed.

[1] Charles Périn, 1945, “Le franc restera-t-il le franc?”, Larcier, Bruxelles.

2016-06-26

Book review

I have recently found that Massimo Morini wrote a review of my "Multi-curve framework" book in Quantitative Finance. The text of the review can be found on the Quantitative Finance web-site (subscription required). I'm honored by the nice words of Massimo.

The only minor point on which I disagree with him is when he says that "I do not think that Marc possessed any prophetic powers." As explained in a previous entry of my blog (see the last part of When fiction becomes reality), the multi-curve framework is not the only proof of my prophetic powers … or of my luck. But consistent above-average luck is even rarer than consistent forward thinking, so I'm fine with either!

2016-05-21

Good news for AD?

Maybe I did not lose as much as I thought in Sometimes you win, sometimes you lose. Maybe Algorithmic Differentiation (AD) is not forbidden by regulators as much as I feared.

A couple of months ago I attended and spoke at the 5th xVA conference. I was also part of the “GPU vs AAD” panel discussion. Among the questions opened for the debate was "Can AD be used for regulatory/FRTB purposes?"

I explained the reason behind that question in my previous blog and in my comments on the consultative document by the BCBS. My comments can be found, among others, on the site of the Bank for International Settlements. The regulatory documents give a precise (but, in my opinion, incorrect) description of the meaning of sensitivity. The sensitivity for interest rates is the forward difference quotient computed with exactly a one basis point shift.
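To see why the distinction matters, here is a sketch with a toy valuation function (my own illustration, not the regulatory formula): the mandated forward difference quotient with a fixed one basis point shift differs from the exact derivative that an AD implementation returns.

```python
import math

def pv_zero_coupon(r, t=10.0, notional=1e6):
    """Toy valuation: a single cash flow discounted at a flat zero rate r."""
    return notional * math.exp(-r * t)

r, h = 0.02, 1e-4  # flat rate and the mandated one basis point shift

# "Regulatory" sensitivity: forward difference quotient with exactly a 1bp shift
s_fd = (pv_zero_coupon(r + h) - pv_zero_coupon(r)) / h

# AD-style sensitivity: the exact derivative dV/dr = -t * V
s_ad = -10.0 * pv_zero_coupon(r)

print(s_fd, s_ad)  # close, but the quotient carries an O(h) bias
```

The two numbers agree to a few basis points of relative error, which is why a general "description of scale" reading would allow AD, while a literal reading of the difference quotient would not.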

The good news, if confirmed, is that the previous bad news may not be true. A staff member of the Federal Reserve Board, who has been involved in the discussions around FRTB, indicated that the wording should not be understood as preventing AD. The "definition" is supposed to be a general description of what the sensitivity should be and of its scale, not a numerical procedure. In some sense the letter of the law is not the letter of the law anymore.

It is still questionable that the text, which is precise in its wording, does not mean exactly what it says. Also this is not a first-draft imprecision; it is a much-commented text for which this precise question was asked publicly and officially, including by me in my comments published on the web page related to the issue by the BCBS. A simple and clear alternative wording was proposed.

This was a simple one-line change with respect to the wording used in the draft. Why was this not clarified? Did the BCBS read all the comments? Are the drafting staff and the staff reading the comments collectively well versed and experienced in all the quantitative finance and numerical analysis aspects described in the rules and their comments? I don't know the answers to those questions; I can only guess from the final outcome. This document will impact the management of trillions and trillions in balance sheets around the world; one could expect a document quality in line with what is expected as documentation for model validation purposes in a tier 3 bank for models used on a couple of millions in derivative notional.

At this stage, I hope that the remark by the Fed staff member will be clarified officially by the BCBS or local regulators. This should come at least as an official interpretation of the document and preferably as a footnote in the original document.

2016-04-24

A double MAC, please.

Recently concerns related to MAC swaps and CME swap futures based on those swaps have resurfaced in the press. See the letter from ISDA/CME/SIFMA/FIA and the article in Risk Magazine (subscription required): MAC swaps, swap futures face new tax threat.
Similar issues have been discussed in the past: Tax questions cloud prospects for CME swap future and MAC swaps. 

MAC what?


MAC stands for Market Agreed Coupon. “Market Agreed” is probably a misnomer, as nobody other than the trade participants has to agree on the coupon. The real meaning is a swap with a fixed coupon at a rounded figure, usually a multiple of 25 bps, where the trade is done not on the fixed rate but on the price. For example, a quote can be “a 10Y receiver, 2.25% coupon swap at a price of 97 bps” or something similar. The price is paid as a one-off fee on the settlement or effective date of the swap. In the sequel I will call “continuous coupon swaps” the swaps where the coupon is not necessarily one of the discrete coupons; such a swap is traded in terms of the fixed rate paid and no up-front fee is paid.

Problems


The concerns related to those instruments are from a tax perspective and also from a risk perspective. Those two very different types of concerns have their roots in the same feature of the swaps, which is the up-front (non-periodic) payment of a ‘fee’.

Tax: One of the issues with this type of swap is its accounting treatment. Under the accounting rules of some countries, the up-front payment may lead to the classification of the transaction as a loan instead of a derivative for tax purposes.

Risk: The hedging of interest rate risk with this type of swap is quite different from the one with continuous coupon swaps. If you use a standard curve calibration mechanism with the MAC swaps quoted on price, you obtain a delta risk figure which is fundamentally different from the one of a continuous coupon swap. Roughly speaking, the risk figure for a 10Y swap will be 10 times smaller, as the price has roughly a sensitivity of 10 with respect to the rate. It is similar to viewing the bond world in terms of price changes as opposed to yield changes. For example, a “parallel” movement of the rate curve becomes a movement of the price curve roughly proportional to the maturity.

Why introduce those problems?


Before trying to solve those problems in one way or another, maybe we can try to understand why the feature was introduced in the first place. A fixed coupon is attractive because different trades can be compressed into a unique trade. The standard MAC swaps often come not only with standard coupons but also with standard effective dates, usually the quarterly IMM dates. The swaps can be traded over a quarter and accumulated into a single position with full offset between payers and receivers.

With the standard (and soon mandatory) collateral (variation margin), the upfront payment is not really a payment anymore. The amount paid is immediately received back as VM (see my previous post Continuous dividend v discrete cash flows for a discussion of the “immediately”). It is not a cash play but a curve play. The difference between a MAC swap and a standard swap is a little bit of curve play between the overnight amount (upfront payment) and the regular payments of the coupons.

Is this the best solution?


Before proceeding further in the analysis, we have to note that when we say “fixed” or “standard” (or “agreed”) coupon, it has to be understood as standard for a given period. If the market moves enough, the new standard will be different. If the continuous coupon swaps trade at 2.15%, the MAC coupon will be 2.25%; if the continuous coupon swaps trade at 2.10%, the MAC coupon will move to 2.00%. What we described above as a unique-coupon portfolio is actually a multiple-coupon portfolio with the coupons at discrete values.

It would be possible to achieve the same results while still quoting the swaps in term of coupon and have no up-front fee. One method is to deliver not one swap with the quoted coupon but a portfolio of two swaps with discrete coupons (the MAC-like coupons) and notional of those swaps such that the weighted average of the coupons is the quoted coupon. Suppose that the 10Y swap is quoted at 2.10% for a notional of 100m. The delivery would be two swaps, one with a coupon of 2.00% and a notional of 60m and one with a coupon of 2.25% and a notional of 40m. The cash flows generated by those 2 swaps are identical to the cash flows of the unique swap with coupon 2.10%.

In general, the equations to solve for the notionals n0 and n1 related to a swap of notional n with coupon c between s0 and s1 are
n0 + n1 = n
and   n0 * s0 + n1 * s1 = n * c
A system of two equations in two unknowns, easily solved as
n1 = n * (c - s0) / (s1 - s0)
and   n0 = n * (s1 - c) / (s1 - s0)
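As a sanity check, the solution above can be coded directly (a small sketch; the function name is mine):

```python
def double_mac_notionals(n, c, s0, s1):
    """Split a swap of notional n and coupon c into two MAC swaps with
    standard coupons s0 < c < s1 so that the cash flows match exactly."""
    n1 = n * (c - s0) / (s1 - s0)
    n0 = n * (s1 - c) / (s1 - s0)
    return n0, n1

# The 10Y example from the text: 100m at 2.10% between the 2.00% and 2.25% MACs
n0, n1 = double_mac_notionals(100e6, 0.0210, 0.0200, 0.0225)
print(n0, n1)  # 60m at 2.00% and 40m at 2.25%, reproducing the example above
```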

This process has (almost) all the properties of the original approach and none of the drawbacks. There is no up-front fee and the risk management is done in terms of coupon/yield. When several trades with the same dates are traded, they can be netted easily. The only drawback is that if you trade only one swap, you will have two swaps in your books. But as soon as you trade many of them with the same counterparties, like in the cleared IRS case, the netting comes into play automatically.

We need a name for this new approach. My vote goes to “double MAC”, hence the title of this blog.

And swap futures?


Swap futures can also be designed with a similar idea in mind. The natural dimension in the swap market is the rate. One can create a future that is quoted in terms of rate and has a daily margin based on the rate multiplied by the notional and by a fixed amount representing the PVBP. For a true swap, the PVBP changes with the level of the market. We could design the swap future with a fixed PVBP, let's say 10 for a 10Y underlying, and work from there. This type of swap future requires a convexity adjustment, but you already need one due to the daily margin feature of futures anyway. The valuation of this product would require a mixture of short term CMS and futures pricing. The quotation mechanism would make it the best instrument for hedging. Suppose that the notional for the swap future is 100,000. You see a PV01 of 123,400 per basis point on your screen for the 10Y bucket; the DV01 of one contract is 10 * 100,000 / 10,000 = 100. You need 1,234 contracts to hedge that risk. Easy, isn't it? At expiry of the future, two cleared swaps would be delivered: a double MAC with coupons and notionals selected to reproduce the last futures price. The notionals can be selected to reproduce the future's theoretical notional or to conserve the total PVBP at delivery. The latter makes sense for those who view the futures as hedges and want a smooth risk transition at expiry.
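The hedge arithmetic in the paragraph above, spelled out (numbers from the text):

```python
# Hedge-ratio arithmetic for the rate-quoted swap future sketched above.
notional = 100_000        # contract notional
fixed_pvbp = 10           # fixed PVBP chosen for the 10Y contract
contract_dv01 = fixed_pvbp * notional / 10_000  # value change per 1bp move

portfolio_pv01 = 123_400  # risk seen on the screen, per basis point
contracts = portfolio_pv01 / contract_dv01
print(contract_dv01, contracts)  # 100.0 per contract, 1234.0 contracts
```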

I proposed that type of swap futures some years ago, but it was never quoted on any exchange. Maybe it is the right moment to propose the idea again. This can be viewed as a new episode of my series on “financial fiction” that started some time ago.

Note also that those rate-quoted swap futures lead naturally to swaption futures. Swaptions trade in price, on an underlying with a given coupon/strike, with no upfront payment on the swap. The swaption futures would have a quoted price (and daily margin) and a strike rate with the same meaning as in the swaption case. At expiry of the swaption future, the parties enter into a (cleared) swap or not, at the choice of the party long the swaption future.

I presented my proposal at “The 4th Interest Rate Conference” in March 2015. I also have numerous (non-public) documents with analyses of the convexity impact, the hedging mechanism, the detailed settlement and delivery process, and so on.

If you have an interest in those mechanisms, to implement them in the OTC market or in the futures market, don’t hesitate to contact me for more details. You may also be interested in my analysis of the pricing of swap futures quoted in price, which is available in an SSRN working paper: Deliverable Interest Rate Swap Futures: Pricing in Gaussian HJM Model.

2016-03-30

Continuous dividend v discrete cash flows

Variation margin


With the generalization of variation margin collateral, the derivative world is no longer driven by discrete cash flows but by a continuous dividend. This can be explained with the following two graphs.
Figure 1: Derivative value

Suppose that you entered into a derivative in the past. In the graphs that date was 20 days ago and the X axis represents time. You entered into the trade at the fair price, so the initial value was 0. Time has passed and the value has gone up and down. The Y axis of the same graph is the value. The current value is positive. As the parties to the trade are uncertain that their counterparty will honor its derivative obligations, it is now standard to ask for variation margin related to the derivative. In this context variation margin is the exchange, on a daily basis, of collateral to guarantee the obligation. The party out-of-the-money (for which the value is negative) posts financial instruments with the same value as the trade as a guarantee. In the interbank market, the legal agreement around those margins is usually described in the CSA (Credit Support Annex) between the financial institutions. The standard approach is to pay the variation margin in cash. In the graph, the multiple lines for the dates after today (0) represent the uncertainty about the future. For the future, another concept, Initial Margin, enters into account. But that is a story for another day.

Due to the legal link between the derivative and the CSA, the representation in the first graph is only partial, and one could argue it is incorrect. The derivative does not exist on its own anymore. You cannot look at it and compute its “value” as if the rest of the agreement did not exist. If you combine the derivative with the obligation to pay back the collateral received, you obtain the second graph. The total value is always zero, or more exactly is reset to 0 every day. There is a continuous, or more exactly daily, "dividend" payment.
Figure 2: Derivative cash flows
What is the amount of this dividend? It is the exchange of collateral, itself linked to the value of the derivative. The dividend is the change of value of the instrument minus the interest paid on the existing collateral. In continuous time, the equation is written as
dDt = dVt - ct Vt dt
(see Equation 8.1 in the Multi-curve framework book).

Margin payment in practice


In practice, today, continuous time does not exist and everything is done on a discrete basis, with one day being the standard atomic amount of time. The dividend paid is thus (with time written in days)
(Vt - Vt-1) - ct-1 Vt-1
This is the amount computed at date t, but when is it paid? In theory it should be at t, but in practice you need some time to do the actual computation, agree the amount with your counterparty, send the payment message, etc. Without taking potential disputes into account, the payment is done at best at t+1. I say at best, as when institutions are not in the same time zone, t+1 becomes relative and can become t+2 in practice. To simplify the explanation, suppose that the margin is always paid at t+1.

The derivatives also have actual coupon payments; interest rate swaps have regular payments of fixed or floating coupons. How do the coupon flows interfere with the daily margin payments? Let's start with the theoretical description and introduce the t+1 payment later. As an example, suppose that the next coupon payment, at time t, has an amount of 100 and that the value of the rest of the derivative is small. Suppose the values are

Time    t-3    t-2    t-1    t     t+1
Value   99.0   99.5   99.8   0.0   0.1

Note that for the valuation at t in this table, the coupon to be paid at t is not included in the valuation.
The payments are:

Date   Formula                 Amounts             Cash flow
t-2    Vt-2 - Vt-3             99.5 - 99.0         0.5
t-1    Vt-1 - Vt-2             99.8 - 99.5         0.3
t      Vt - Vt-1 + coupon      0.0 - 99.8 + 100    0.2
t+1    Vt+1 - Vt               0.1 - 0.0           0.1
Very smooth, with small payments every day. Exactly what we wanted/expected.

Now introduce the t+1 payment for the margin:

Date   Formula                 Amounts             Cash flow
t-1    Vt-2 - Vt-3             99.5 - 99.0         0.5
t      Vt-1 - Vt-2 + coupon    99.8 - 99.5 + 100   100.3
t+1    Vt - Vt-1               0.0 - 99.8          -99.8
t+2    Vt+1 - Vt               0.1 - 0.0           0.1
Very smooth, with small payments every day, except around the coupon, where there are large spikes, positive then negative. Exactly what we don't want. We have recreated the credit risk we wanted to avoid by introducing the variation margin.

This risk related to spiked exposure is for example described in a recent paper by L. Andersen, M. Pykhtin and A. Sokol: Rethinking Margin Period of Risk.

A different approach?


To avoid this problem, the CCPs have introduced a different way to proceed: they pay the coupons also at t+1. The cash flows are then

Date   Formula                 Amounts             Cash flow
t-1    Vt-2 - Vt-3             99.5 - 99.0         0.5
t      Vt-1 - Vt-2             99.8 - 99.5         0.3
t+1    Vt - Vt-1 + coupon      0.0 - 99.8 + 100    0.2
t+2    Vt+1 - Vt               0.1 - 0.0           0.1
Very smooth, with small payments every day. Only the coupon is paid on the wrong date.
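The three payment schedules above can be reproduced with a short script (a sketch; the interest on the collateral, the c*V term, is ignored to keep the flows readable):

```python
def daily_flows(values, coupon, coupon_date, margin_lag=0, coupon_lag=0):
    """Net daily cash flows: variation margin (daily value changes) paid with
    a margin_lag delay, the coupon paid with a coupon_lag delay."""
    flows = {}
    for i in range(1, len(values)):
        d = i + margin_lag
        flows[d] = flows.get(d, 0.0) + values[i] - values[i - 1]
    d = coupon_date + coupon_lag
    flows[d] = flows.get(d, 0.0) + coupon
    return flows

V = [99.0, 99.5, 99.8, 0.0, 0.1]  # values at dates t-3 .. t+1, coupon excluded at t

print(daily_flows(V, 100.0, 3))                              # theory: small flows only
print(daily_flows(V, 100.0, 3, margin_lag=1))                # practice: +100.3 / -99.8 spike
print(daily_flows(V, 100.0, 3, margin_lag=1, coupon_lag=1))  # CCP style: smooth again
```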

Can we do better? Certainly! Obviously you cannot forecast the market perfectly and know at t-1 what the market will be at t, but you can certainly compute a forward value. The exact meaning of this forward value and how the forward market is estimated is not very important; the most important part is to include the cash flows in that valuation. The new valuations are

Computation date   t-4    t-3    t-2    t-1    t
Value date         t-3    t-2    t-1    t      t+1
Value              99.1   99.6   99.9   0.0    0.1

The cash flows in this approach are

Date   Formula                 Amounts             Cash flow
t-2    Vt-2 - Vt-3             99.6 - 99.1         0.5
t-1    Vt-1 - Vt-2             99.9 - 99.6         0.3
t      Vt - Vt-1 + coupon      0.0 - 99.9 + 100    0.1
t+1    Vt+1 - Vt               0.1 - 0.0           0.1
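The forward-valuation variant can be checked the same way (forward values taken from the table; a sketch):

```python
# Margin computed at d-1 from a forward value for date d and paid at d,
# netted with any coupon due at d.
F = [99.1, 99.6, 99.9, 0.0, 0.1]  # forward values for value dates t-3 .. t+1
coupon_date, coupon = 3, 100.0
flows = [F[i] - F[i - 1] + (coupon if i == coupon_date else 0.0)
         for i in range(1, len(F))]
print(flows)  # small flows on every date; the coupon is netted with the margin
```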

For this approach to work, it is important that at t the two parts of the cash flow, the one related to the variation margin and the one related to the coupon, are paid on a netted basis. If not, you introduce a "Herstatt"-like settlement risk even in a single currency. To be netted, the payments of the two parts have to be in the same currency. But this is the case for most of the market (in traded notional): the variation margin and the instrument are denominated in the same currency.

Implementation


The requirements to obtain this clean margining process are at the same time easy and difficult. The CSAs have to be rewritten to incorporate this forward approach to margin-related valuation, and the payment processes for coupons and collateral have to be merged into a unique process. This requires rewriting all CSA agreements. A huge task, but it needs to be done anyway in the coming months to take into account the new regulation related to bilateral margining. A good time to introduce the changes. The second part requires changes in back-office processes and a more global approach to risk management processes. This is maybe where the resistance will be the largest.

Note that the new regulation on bilateral margin imposes the computation of initial margin with respect to a 10-day margin period of risk or close-out period. If no mechanism is introduced to smooth out the exposure spikes, this amount at risk will need to be included in the IM computed. That would introduce a new spike, an asymmetric one, in IM this time, to protect against the spike in the VM/coupon payment. That spike will be worse as the IM has to be in segregated accounts and no netting or reuse of funds is allowed. It will be "interesting" to see if the regulators take that type of exposure into account in their validation of bilateral internal models. The mandatory bilateral margin takes effect in September 2016 for the largest derivative users; their models will need to be validated (by regulators in Europe, the US, etc.) by that date. A lot of validation activity will take place in the coming months.

In summary


Timing differences between variation margin and coupon payments reintroduce a temporary credit exposure that is supposed to be removed by the variation margin procedure. This residual risk can be removed by relatively small changes to the margin procedure. Those small changes involve changes in CSA wording and changes to payment processes, which should be unified inside a financial institution. This is probably the right moment to do it, as CSAs and procedures have to be fundamentally reviewed to cope with regulatory changes taking place in less than 6 months.

As usual, don't hesitate to contact me for advisory work around market infrastructure changes, derivatives valuation or risk management.

2016-03-28

Workshop on Margin

In recent years, I have reviewed the methodologies of the world's largest OTC swap CCPs, read most of the recent related regulatory changes, researched the impact of those changes on valuation and implemented most of them in libraries. Those exercises, combined with my background in quantitative analysis and trading, give me a unique perspective on the changes in the derivative market infrastructure.

As part of recent advisory engagements related to the above subjects, I have run several workshops on margin in Europe and in the US. Those workshops have been offered as one-day or two-day programs.

Below are a short summary and the agenda of a typical workshop. The workshops are always tailored to the audience. Don't hesitate to contact me for more information or to request a similar workshop in-house.

Summary


One impact of the crisis has been the increase of the spreads between different reference rates. Another impact has been the regulatory effort to reduce systemic credit risk in the inter-dealer market. One direction has been to push vanilla instruments to central counterparties (CCPs) for central clearing, and another one is the mandatory bilateral margin for transactions. Those changes have generated a new approach to the valuation and risk management of derivatives, in particular of the vanilla ones. The first part of the workshop describes the impact of Variation Margin on valuation. This is often referred to as the collateral discounting framework and has ramifications in swap valuation, curve calibration and risk management in the presence of collateral, with the foreign collateral case an important example. In the second part we focus on Initial Margin (IM). IM is one of the CCPs' risk management tools; we analyse the specific methodologies used by the main CCPs. IM will also become an important part of bilateral relations as it will be made mandatory from September 2016; we analyse the different current proposals for a standard market methodology.

Previous delivery agenda

 

Part 1: Variation margin

  • Margin: terminology and fundamentals
  • Clearing/variation margin/initial margin/capital: a regulatory time table
  • Multi-curve framework
  • Valuation under (variation margin) collateral: Overnight rate collateral, Foreign currency collateral, Bonds collateral
  • What information about change of collateral valuation is available in the market?
  • Curve calibration under collateral: OIS discounting and foreign currency collateral
  • Risk management of interest rate risk with foreign currency collateral - sensitivities, basis / Impact on IM methodologies

 

Part 2: Initial margin

  • CCP specific initial margin methodologies: LCH, CME, Eurex ; CME/LCH basis
  • Bilateral margin: regulatory requirements (focus on Europe/EMIR and/or US regulation)
  • ISDA(R) SIMM proposal (related to FRTB SBA standard approach): main feature, implementation
  • Comparison between CCP margin and bilateral requirements/SIMM
  • Pro-cyclicality features of CCP methodologies - volatility stress-test / Estimation of future Initial Margin
  • Cost of clearing: New member of the xVA family: MVA
  • Numerical challenge of the MVA computation; some methods to reduce the numerical burden
  • Conclusion: Review of the regulatory time-table and the impact on the workload for financial institutions and regulator

Slides and lecture notes are made available to participants at the end of the day.

2016-01-24

Imaginary butterfly

Last night, I finished reading a book. This morning, I was wondering what to do next. The weather was not good enough to play golf, and I was reduced to starting a new book or letting my imagination wander. I chose the latter, and from imagination to imaginary there was only a short step. And from imaginary to imaginary capital and FRTB, another short step.

So I decided to create a simple swap position with imaginary capital according to the standard approach of the FRTB.
  • Creating the matrix in rule 77: 10 lines of (Matlab) code.
  • Finding the non-positive-definite sub-matrices of dimension 3 of the above: 10 lines of (Matlab) code.
  • Creating an actual position and computing its risk-weighted sensitivity and FRTB capital: 50 lines of (Java) code.
  • Result: priceless!
Priceless has to be understood as: the capital required to hold that position is imaginary, and you cannot buy imagination at any price!

Maybe a little bit more detail about the computation. The first step is creating the matrix described in rule 77. The result is:
Tenor 0.25Y 0.50Y 1Y 2Y 3Y 5Y 10Y 15Y 20Y 30Y
0.25Y 1.0000 0.9704 0.9139 0.8106 0.7189 0.5655 0.4000 0.4000 0.4000 0.4000
0.50Y 0.9704 1.0000 0.9704 0.9139 0.8607 0.7634 0.5655 0.4190 0.4000 0.4000
1Y 0.9139 0.9704 1.0000 0.9704 0.9418 0.8869 0.7634 0.6570 0.5655 0.4190
2Y 0.8106 0.9139 0.9704 1.0000 0.9851 0.9560 0.8869 0.8228 0.7634 0.6570
3Y 0.7189 0.8607 0.9418 0.9851 1.0000 0.9802 0.9324 0.8869 0.8437 0.7634
5Y 0.5655 0.7634 0.8869 0.9560 0.9802 1.0000 0.9704 0.9418 0.9139 0.8607
10Y 0.4000 0.5655 0.7634 0.8869 0.9324 0.9704 1.0000 0.9851 0.9704 0.9418
15Y 0.4000 0.4190 0.6570 0.8228 0.8869 0.9418 0.9851 1.0000 0.9900 0.9704
20Y 0.4000 0.4000 0.5655 0.7634 0.8437 0.9139 0.9704 0.9900 1.0000 0.9851
30Y 0.4000 0.4000 0.4190 0.6570 0.7634 0.8607 0.9418 0.9704 0.9851 1.0000

The next step is checking whether there are sub-matrices of dimension 3 that are not positive definite. There are in total 120 possible combinations of that type of matrix. A number small enough that one can scan all of them and look for negative eigenvalues. This is probably not the most efficient numerical procedure, but as it takes less than a second to run, efficiency is not the most important part.

Out of those 120 possible matrices, 34 (!) are not positive definite. I selected the “worst”, according to an arbitrary criterion, for the sequel of this analysis. This is the sub-matrix with indices (2, 5, 8), i.e. the nodes 0.5Y, 3Y and 15Y. The smallest eigenvalue is -0.0443 and the associated eigenvector [-0.4401, 0.7638, -0.4721]. The Matlab code for that part is provided below and should be applied to the “cor” matrix presented above.

% Copyright (C) 2016 by Marc Henrard

 c3 = nchoosek(1:10, 3);        % all 120 triples of tenor indices
 nbProb = 0;
 for i = 1:size(c3, 1)
   e = eig(cor(c3(i,:), c3(i,:)));  % eigenvalues of the 3x3 sub-matrix
   if min(e) < 0                    % not positive definite
     nbProb = nbProb + 1;
     b3(nbProb, :) = c3(i,:);       % store the offending triple
     be3(nbProb, :) = e';           % and its eigenvalues
   end
 end
 nbProb                             % number of non-positive-definite sub-matrices
 [m, ind] = min(be3(:,1))           % most negative eigenvalue and its triple
 b3(ind, :)
 be3(ind, :)
 B3 = cor(b3(ind, :), b3(ind, :));
 [V, D] = eig(B3);
 p3 = V(:,1)                        % eigenvector of the smallest eigenvalue
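
The same check can be sketched in Python without any toolbox, using Sylvester's criterion (a symmetric matrix is positive definite if and only if all its leading principal minors are positive) and the rule-77 functional form rho = max(exp(-3% * |t_i - t_j| / min(t_i, t_j)), 40%), which reproduces the matrix above:

```python
import math

def frtb_corr(t1, t2):
    """Tenor correlation of the rule-77 form: exponential decay floored at 40%."""
    return max(math.exp(-0.03 * abs(t1 - t2) / min(t1, t2)), 0.40)

def is_positive_definite_3x3(m):
    """Sylvester's criterion for a symmetric 3x3 matrix."""
    d1 = m[0][0]
    d2 = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    d3 = (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    return d1 > 0 and d2 > 0 and d3 > 0

tenors = [0.5, 3.0, 15.0]  # the offending 0.5Y, 3Y, 15Y nodes
sub = [[frtb_corr(a, b) for b in tenors] for a in tenors]
print(is_positive_definite_3x3(sub))  # prints False: the 3x3 determinant is negative
```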


Creating a positive definite matrix that can be used as a correlation matrix and at the same time looks plausible for a finance application is not trivial. Throwing random numbers or made-up functions at the different elements is not likely to produce an acceptable result. From the above description, the functional form proposed by the BCBS is clearly not appropriate.

We now have enough information to create a simple example with imaginary capital. Select a position that will create a risk-weighted sensitivity close enough to the above eigenvector. I selected the following positions in EONIA ATM swaps with maturities 0.5Y, 3Y and 15Y: -400m, 150m and -20m, with negative numbers for receivers and positive numbers for payers. This is a type of butterfly position. Entering the above position in my efficient tools, I immediately get the following sensitivity vector:
[0.00, -202516257.66, 0.00, 0.00, 460337138.18, 0.00, 0.00, -290770969.62, 0.00, 0.00]
The above figures were obtained with data from 20-Nov-2015. The sensitivities are computed by algorithmic differentiation; they do not fit the “bad” definition of sensitivity used in the FRTB, but are close enough for the result I want to illustrate.

Now let the FRTB capital computation tool apply the risk weights and the (pseudo-)correlation to that position and you get:

FRTB capital computed involves the square root of a negative number
  |--> Partial capital is: 0.0 + 2100828.445881 i

What does that mean in practice? I don't have the slightest idea, but it certainly looks very cool! And ...

I have my imaginary butterfly!
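For readers who want to reproduce the effect, here is a toy Python sketch. The matrix and the position are hypothetical, not the actual FRTB figures, but the mechanism is the same: when the risk-weighted sensitivities line up with a negative-eigenvalue direction of the pseudo-correlation matrix, the quadratic form under the square root turns negative.

```python
import cmath
import numpy as np

# Hypothetical pseudo-correlation matrix with a negative eigenvalue
# (toy numbers, not the FRTB ones).
C = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.9],
              [0.2, 0.9, 1.0]])

# Risk-weighted sensitivities of a butterfly-like position,
# aligned with the negative-eigenvalue direction.
s = np.array([1.0, -1.0, 1.0]) * 1e6

q = s @ C @ s            # quadratic form: negative here
capital = cmath.sqrt(q)  # square root of a negative number
print(q, capital)        # a purely imaginary "capital"
```

Three lines of linear algebra are enough to check whether a proposed matrix can ever produce this; the drafters apparently did not run them.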

Beyond the funny result and the tone used in this blog, there is a serious problem. The careless construction of the approach makes the results completely meaningless. The credibility of the financial regulators will decrease further, if that is possible, and the numbers produced by this new regulation will fill the boxes of costly and pointless reports.

If you have an interest in reducing the capital of your swap book, don't hesitate to contact me for consulting work.

2016-01-23

Sometimes you win, sometimes you lose

The BCBS published its new “Minimum capital requirements for market risk”. When the last consultative paper on the Fundamental Review of the Trading Book was published in December 2014, I sent my comments to the BCBS. The blog related to my comments is here.

From my comments, I would say there are some wins and some losses. Unfortunately, the losses outweigh the wins.

Win


One of my comments was regarding the scaling of the numbers. The Committee had clearly proposed incoherent numbers, with final capital 10,000 times larger than the expected figures: a missing basis point in some formula. Fortunately the Committee has corrected that.

The correction was not in the sense I expected. Roughly the capital required for interest rate risk is the “PV01” multiplied by a risk weight. The choice was between describing PV01 as the “present value of one basis point” or dividing the weights by 10,000. The Committee selected the latter!

This means that the official definition of PV01 for capital computation is the “present value of one”, with one being 100% or 10,000 basis points.

From a quantitative library developer's point of view, this is very good news. It has been one of my battles to ensure that the numbers computed by the libraries I develop are unscaled. The sensitivities are the partial derivatives (in the mathematical sense) of the outputs with respect to the inputs. No scaling is done anywhere in the quant library; no one-basis-point multiplication here and one-percent multiplication there. Everything is unscaled inside the quant part; only the loaders of external data and the display/export layers are allowed to do any scaling. Allowing even one scaling within the library would force all the numbers to be augmented with metadata indicating which scaling was used. That would create a real nightmare and a heavy computational burden on the library.
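As a toy illustration of the unscaled convention (a hypothetical one-instrument pricer, not my library's code): the quant part returns the mathematical derivative, and the basis-point figure, if wanted, is only a display-time rescaling.

```python
import math

# Toy zero-coupon bond priced off an absolute rate (0.02 means 2%).
N, t, r = 100_000_000, 5.0, 0.02

def pv(rate):
    return N * math.exp(-rate * t)

# Unscaled sensitivity: the partial derivative dV/dr, the "present value of one".
dv_dr = -N * t * math.exp(-r * t)

# The "present value of one basis point" is the same number times 1e-4,
# applied only at display/export time, never inside the quant library.
pv01_bp = dv_dr * 1e-4

print(dv_dr / pv01_bp)  # the two conventions differ by exactly 10,000
```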

Now I can use the Committee's “official” definition of PV01 to claim that no scaling should be done. That will allow me to use an authority and legality argument with people who don’t understand rational arguments, like accountants and lawyers (sorry to my accountant and lawyer friends).

But this victory is a Pyrrhic victory!

Loss


In the document, the definition of the sensitivity is, with simplified notations,
 s = ( V(r+0.0001) - V(r) ) / 0.0001

In other words, the official definition of sensitivity is a forward finite-difference ratio with a fixed shift of one basis point. As I wrote in my previous blog on the subject, if you have implemented Algorithmic Differentiation in your libraries, too bad for you: you cannot use it. You have to implement a new approach using finite differences, and multiply your computation time by 5 on your vanilla swap book and maybe by 10 for cross-currency swaps. The required formula creates numerical instability in the sensitivities from your tree methods; never mind, this is the regulation.
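A toy illustration of the cost (hypothetical cash flows; only the revaluation count matters): the prescribed forward finite difference needs one full revaluation per curve node on top of the base one, while AD delivers all the sensitivities for roughly the cost of a single pricing.

```python
import math

# Toy multi-node zero curve and a bond-like PV: sum of discounted cash flows.
times = [1.0, 2.0, 5.0, 10.0]
rates = [0.010, 0.012, 0.015, 0.018]
flows = [1e6, 1e6, 1e6, 101e6]

calls = 0  # count the full revaluations

def pv(rs):
    global calls
    calls += 1
    return sum(cf * math.exp(-r * t) for cf, r, t in zip(flows, rs, times))

base = pv(rates)
shift = 1e-4  # the prescribed one-basis-point shift
sens = []
for i in range(len(rates)):
    bumped = rates[:]
    bumped[i] += shift
    sens.append((pv(bumped) - base) / shift)

print(calls)  # n + 1 = 5 full revaluations for 4 sensitivities
```

With AD the same four sensitivities come out of one (instrumented) pricing pass; the ratio only gets worse as the number of curve nodes grows.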

One could think that this is simply a quick draft and that this is a minor mistake from a drafter confusing the goal with one of the methods available to achieve it, but it is not. There were several versions of the document, with more than a year since the latest draft, and the above comments were officially made in the request for comments. It is thus a deliberate decision.

This means that the numbers computed for regulatory purposes will not be used for any other purpose and will add to the cost of business. Why would a risk manager use the regulatory numbers if it takes him 10 times longer to produce them than his correct numbers? New code will have to be written in firm-wide systems just to compute those numbers. Regulation becomes a drag on business without upside. A separate process will be run on a monthly basis and largely ignored. Any error in the process will not be scrutinized on a permanent basis by traders and risk managers and is thus very likely to go unnoticed.

Funnily, the regulation requires (47. (c)) the computation of the delta to be based on “pricing models that an independent risk control unit within a bank uses to report market risks or actual profits and losses to senior management”, but the requirement on the pricing model is not extended to the delta! The document adds (49.) that “a bank’s pricing model used in actual profit and loss reporting provide an appropriate basis for the determination of regulatory capital requirements for all market risks” but again does not extend the “appropriateness” to the delta/sensitivity computation.

This is a bad decision for Algorithmic Differentiation (and for the sales of my forthcoming book). This is a bad decision for unified risk management: banks will produce several sets of numbers for different purposes. This is bad for the credibility of regulators.

I’m pointing at the finite-difference part of the formula mainly because I have my efforts in developing Algorithmic Differentiation in mind. But I could have taken any part of the formula and spent paragraphs discussing it. What is “r” in the formula? A market rate! Which market rate? Not indicated! What if there is no liquid rate in the market that satisfies the requirements? Not indicated! What is the convention/day count for “r” and “0.0001”? Not indicated! What is the 0.25-year market instrument for a 6M Libor? Bad question! If the market quote is a spread (e.g. cross-currency), how should it be used? Not indicated!

Again, this is not a drafting mistake in a first quick draft. This is a document that took several years to write, has been commented on by numerous people, and those questions were asked explicitly in the request for comments.

Overall, I’m disappointed. 90 pages of detailed requirements, but from a quant perspective missing the essential: prescriptive where it should be principle-based and vague where it should be precise.

I may review the details of the capital computation at some stage. It is based on a two-layer delta-normal-like approach with conventional volatilities (weights) and “correlations”. I put correlations in inverted commas as there is no indication that the numbers proposed actually form a positive definite matrix.

2016-01-16

Books - Rant on books

My LinkedIn headline could have been

Compulsive buyer, reader and collector of books.

I built (or more exactly I designed and had built for me) a two-storey library. It is roughly six meters wide and five meters high. It contains thousands of books and weighs several tons. All that to say: I love books, I’m addicted to them.

My addiction took a new turn recently when I published my first full book. Now I’m also addicted to writing. You may have seen my announcement for a new book. The new headline is

Compulsive buyer, reader, collector and writer of books.

Being in love and addicted, you could expect subjectivity from me. Nevertheless find below my objective and dispassionate opinion about books.

Books are expensive, too expensive … if you want or need to buy them. For a hardcover book you pay 60 GBP; 15% of that is the actual cost of the physical book, the rest is immaterial: author royalties and services (editor and retailer). And believe me, the author royalties will not make him (me in this case) rich. Nevertheless, this is a lot of money to fill one centimeter of library shelf.

On the other side, books are not only ornaments for a bookshelf; they also have content. For technical books, the saying is that one page of content takes one day to write. I learned the hard way that this is a realistic estimate. The content is, for some books, a very efficient access to real expertise, to the deep thoughts of enthusiastic people about their trade. You don’t even have to listen to the annoying and arrogant authors; you can read their thoughts directly, at your pace, in the sequence you want and where you want. Obviously you have to make the effort to read and decipher what the author really means, but nobody said that thorough reading was easy.

You get 200 days of expert thoughts for GBP 60. Maybe you don’t really need the 200 days, only 100 of them are useful to you, but is that expensive or cheap? In the finance industry, what else can you get for 60 quid? You can hire a senior consultant, of untested expertise related to your problem, for 2,400 quid a day (I round the numbers to make my computation easy), i.e. you can speak to him for 12 minutes for the price of the book. Your book's value is gone by the time you have exchanged business cards. You can also have lunch with business acquaintances. A nice lunch in London for two: another 180 quid, or three books. By the end of the starters you had better have got a lot of expert opinion on your subject of interest, not only gossip. Maybe you can travel to see a client: a short flight, a hotel and a couple of lunches, that is 600 quid. I hope your client will bring you a lot of information about your industry, enough to fill ten books, and that he will write the back-to-office reports for you.

How come the banking industry, supposed to be full of experts at detecting arbitrage, is not full of books? If I were a board member at any bank, I would ask: where is the library? Do you buy all the good books that come out? If you have not bought and read all of them, how do you know that they are not the best arbitrage opportunity in the market?

Books are cheap, too cheap … if you buy them and use them correctly. They almost look like open source software (by the way, check Strata). You get them and you can use all the ideas in them without paying anything to the people who have spent days, months, and maybe years developing them. You just have to choose wisely which ones you want to use the content of and which ones stay as ornaments on your bookshelf.

Buy books, read them, love them or hate them, whatever they deserve. If you like a book, contact the author; maybe just to say that you like the book, maybe to invite him for a business lunch, maybe to invite him to speak at a seminar, maybe to hire him as a consultant from whom you have just read a 200-page CV, or maybe to play a round of golf (I know this is completely unrelated, but I appreciate a round of golf over the week-end and my golf ball offer is still on).

I’m a compulsive buyer, reader and collector of books (I know I repeat myself). But I’m even happier when I receive a book. Not that I’m stingy and do not want to buy them. What I like in receiving a book is that often I receive books that I would not buy myself … and I love it. That is the important part: discovering something new that I would not have come across by myself. After my call for a round of golf, here is my call for books. If you have written a book and would like me to read it (or simply use it as an ornament in my library), don’t hesitate to contact me. I will be happy to barter a “Multi-curve Framework” copy for a different book; the offer is also valid for PhD theses. Of course you would need to pitch your book to me, but this is part of the preliminaries. Offer valid within the limits of the available stock (just to protect myself in case millions of people are interested in swapping their book for mine and I don’t have enough space in my library to store them all).

Conclusion:
Buy books, they are cheap.
Offer me books, I love it (and them).

Forthcoming book on Algorithmic Differentiation!

The first draft of my forthcoming book on Algorithmic Differentiation has been sent to the editor.

The optimistic expectation is that it will be available in print in May 2016.

Only four months to wait!