Stylz
Topic Author
Posts: 1
Joined: May 18th, 2005, 12:14 pm

CDX nonstandard tranches

March 13th, 2007, 2:04 pm

Questions for the credit derivatives gurus out there ...

1. If pricing a nonstandard tranche like 5-6, is it common to linearly interpolate the 0-3 BC and 0-7 BC in order to get 0-5 and 0-6 correlations?
2. How about 1-2: is it common to lock in the 0-3 BC and use this as the compound correlation, or do we interpolate between 0 and the 0-3 BC to get 0-1 and 0-2 correlations?

Rgds
 
StructCred
Posts: 0
Joined: February 1st, 2007, 1:59 pm

CDX nonstandard tranches

March 13th, 2007, 5:43 pm

For 5-6, linear interpolation of bCorrs is a bad idea. For a mezz tranche the bCorr model is very sensitive to the correlation slope, and if you linearly interpolate the bCorrs, the slope you get is pretty much random. As a result you can get nonsense numbers. A good example would be pricing 9-10 and 10-11 tranches: if you linearly interpolate bCorrs, you will often get a tighter ATM spread for 9-10 than for 10-11. This is of course due to the 7-10 slope being steeper than the 10-15 slope. When you're trading a non-standard tranche, you're trading a different part of the loss distribution, so the goal should be to get the correct EL for the 5-6 tranche. What you want to do is look at the EL for 0-3 and 0-7, and then come up with a method to interpolate across the loss distribution.

For 1-2 you would effectively need to extrapolate. Again the goal is to maintain a sensible loss distribution, although without any quotes it may be hard to judge what it should be (the spread distribution of the index constituents may give some idea).
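A minimal sketch of that idea, assuming you already have the base-tranche expected losses EL(0-K) as fractions of portfolio notional (the numbers below are made-up placeholders, not quotes): interpolate the EL profile in the detachment dimension with a shape-preserving spline and take differences, rather than interpolating the bCorrs themselves.

```python
# Sketch only: pricing a thin 5-6 tranche off an interpolated base-tranche EL
# profile rather than interpolated base correlations. Base ELs are fabricated
# placeholders; a monotone (shape-preserving) interpolator avoids the overshoot
# that produces the 9-10 / 10-11 ordering problem described above.
import numpy as np
from scipy.interpolate import PchipInterpolator

detach  = np.array([0.00, 0.03, 0.07, 0.10, 0.15, 0.30])        # standard detachments
base_el = np.array([0.00, 0.020, 0.032, 0.037, 0.042, 0.048])   # hypothetical EL(0-K), fraction of portfolio notional

el_curve = PchipInterpolator(detach, base_el)

def tranche_el(a, d):
    """Expected loss of tranche [a, d] as a fraction of tranche width."""
    return (el_curve(d) - el_curve(a)) / (d - a)

print(tranche_el(0.05, 0.06))                          # the 5-6 tranche
print(tranche_el(0.09, 0.10), tranche_el(0.10, 0.11))  # sanity check on the 9-10 vs 10-11 ordering
```

PCHIP is just one shape-preserving choice; any scheme that keeps the interpolated EL curve monotone and sensibly shaped does the same job.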
 
Stylz
Topic Author
Posts: 1
Joined: May 18th, 2005, 12:14 pm

CDX nonstandard tranches

March 14th, 2007, 12:07 pm

Hi StructCred,

Yes, I see what you mean on this. Very interesting idea, and I can see why the pricing goes wrong. I will shift my focus. Is there any sort of consensus on the method chosen to interpolate the loss distribution?

Thanks again for your help.
 
sacevoy
Posts: 7
Joined: November 16th, 2006, 5:24 pm

CDX nonstandard tranches

March 15th, 2007, 3:58 pm

Cubic spline ... though base correlation is not an arbitrage-free paradigm.
 
vespaGL150
Posts: 3
Joined: October 25th, 2004, 4:53 am

CDX nonstandard tranches

March 16th, 2007, 5:42 am

To avoid arbitrage on non-standard tranches of the traded indices, is it fair to say that most people are using a calibrated 'model' such as random factor loading, local correlation, stochastic correlation (Gaussian mixture) or other approaches, rather than relying on base correlation? Any consensus on the preferred internally consistent model? My understanding is that none of the aforementioned have consistent calibration parameters across portfolios (i.e. iTraxx, CDX etc.) and maturities (3, 5, 7, 10yr). Are there any models out there that manage to achieve this and also have calibration parameters that are fairly stable over time?

How are people currently pricing bespokes? Still reverting to base correlation and applying some sort of mapping based on their preferred risk metric (moneyness, tranche exhaustion probability, tranche EL / portfolio EL fraction, other)? Any consensus on the 'preferred' approach?

Finally, on base correlation: rather than applying some spline interpolation scheme to the finite points of the skew, which may still lead to negative probability densities, has anyone looked at applying splines to the implied cumulative loss distributions as a function of equity tranche detachment?
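For what it's worth, a bare-bones sketch of what the 'stochastic correlation (Gaussian mixture)' variant looks like at the conditional-default-probability level; the two-state mixture parameters are illustrative only, and the rest of the machinery (conditional loss recursion, integration over the factor) is the usual one-factor copula apparatus.

```python
# Sketch of the "stochastic correlation (Gaussian mixture)" idea: each name's factor
# loading is drawn from a two-point mixture, so the conditional default probability
# is a weighted sum of Gaussian-copula terms. Parameters are illustrative only.
import numpy as np
from scipy.stats import norm

def cond_default_prob(m, p, rhos=(0.10, 0.70), weights=(0.6, 0.4)):
    """P(default by horizon | market factor m) under a two-state correlation mixture."""
    return sum(w * norm.cdf((norm.ppf(p) - np.sqrt(rho) * m) / np.sqrt(1.0 - rho))
               for rho, w in zip(rhos, weights))

print(cond_default_prob(-2.0, 0.02), cond_default_prob(+2.0, 0.02))
```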
 
Stylz
Topic Author
Posts: 1
Joined: May 18th, 2005, 12:14 pm

CDX nonstandard tranches

March 16th, 2007, 2:25 pm

Hi vespa,

Actually, with respect to your last comment ... I had thought this was what was being represented in the previous post. So, to use an example ... I have a 5yr deal with 40 pay dates and 125 names. I have a 125x40 grid of cumulative default probabilities (and hence expected loss distributions). So now I have a vector of 40 numbers which represent the expected notional outstanding for the tranche at each pay date, and I have this for 0-3, 0-7, 0-10, 0-15, 0-30. If I want it for 0-X, I can spline this (a separate spline for each pay date) and get a new strip of expected outstanding tranche notionals ... then translate this into a value.

Is this what you meant?
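A minimal sketch of that interpolation step, assuming the expected outstanding notionals of the standard base tranches at each pay date are already in hand (array shapes and numbers below are fabricated placeholders): a shape-preserving spline in the detachment dimension at each pay date, then differences of base tranches for the non-standard strip.

```python
# Sketch of the per-pay-date spline described above: given expected outstanding
# notionals (as a fraction of tranche size) for base tranches 0-3, 0-7, 0-10,
# 0-15, 0-30 at each pay date, interpolate in detachment to get an arbitrary
# base tranche 0-X. Numbers are fabricated, not calibration output.
import numpy as np
from scipy.interpolate import PchipInterpolator

detach = np.array([0.03, 0.07, 0.10, 0.15, 0.30])
n_dates = 40                                  # pay dates on the grid described above
j = np.arange(1, n_dates + 1)

# exp_notional[i, k]: expected outstanding notional of base tranche 0-detach[i] at pay
# date k, fraction of that tranche's size (fabricated; equity erodes fastest in time).
decay = np.array([0.020, 0.012, 0.008, 0.005, 0.002])
exp_notional = 1.0 - np.outer(decay, j)

def base_notional_profile(x):
    """Expected outstanding notional profile of the 0-x base tranche (fraction of size)."""
    return np.array([PchipInterpolator(detach, exp_notional[:, k])(x) for k in range(n_dates)])

# 5-6 strip: difference of the 0-6 and 0-5 base tranches, rescaled to the 5-6
# tranche's own notional (x * fraction gives notional in portfolio units).
prof_56 = (0.06 * base_notional_profile(0.06) - 0.05 * base_notional_profile(0.05)) / 0.01
```

Feeding the resulting strip through the usual premium- and protection-leg sums then gives the value; only the interpolation step changes.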
 
StructCred
Posts: 0
Joined: February 1st, 2007, 1:59 pm

CDX nonstandard tranches

March 17th, 2007, 7:08 am

Stylz: Yes, this approach looks good to me. One other thing you may want to consider is using all quoted maturities for calibrating this grid, i.e. if you are building a 10 year loss grid and have standard tranches quoted for 5, 7 and 10 years, you can calculate the expected losses per name for 5 years, then calculate the conditional 5-7 | 5 losses, etc. This will give you a model consistent across detach points as well as across maturities.
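A rough sketch of that cross-maturity stitching, with fabricated per-name cumulative default probabilities standing in for the outputs of the 5Y/7Y/10Y calibrations:

```python
# Sketch: bootstrap conditional (forward) default probabilities so the 10Y grid
# reprices the 5Y and 7Y calibrations as well. The cumulative probabilities below
# are fabricated stand-ins for per-maturity calibration output.
import numpy as np

rng = np.random.default_rng(0)
p5 = rng.uniform(0.01, 0.05, size=125)   # cumulative default prob by 5Y, per name
p7 = p5 * 1.6                            # by 7Y (fabricated term structure)
p10 = p5 * 2.3                           # by 10Y

# Conditional probability of defaulting in (5Y, 7Y] given survival to 5Y, and in
# (7Y, 10Y] given survival to 7Y.
p_57 = (p7 - p5) / (1.0 - p5)
p_710 = (p10 - p7) / (1.0 - p7)

# Rebuilding the 10Y cumulative curve from the pieces recovers the input, which is
# the consistency across maturities being described.
p10_check = 1.0 - (1.0 - p5) * (1.0 - p_57) * (1.0 - p_710)
assert np.allclose(p10_check, p10)
```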
 
Money
Posts: 2
Joined: September 6th, 2002, 4:00 pm

CDX nonstandard tranches

March 18th, 2007, 1:31 am

Any slides/papers on this topic from industry practitioners?
 
vespaGL150
Posts: 3
Joined: October 25th, 2004, 4:53 am

CDX nonstandard tranches

March 19th, 2007, 2:01 am

Stylz - quite right, sorry, very sloppy; I didn't read through the previous post(s) properly. What I have been tinkering with is very similar to what you outline. In terms of the base correlation skew, we have the 5 market quotes corresponding to 5 detachment points. An interpolation on the skew for arbitrary detachment points can lead to problems. Rather than make the interpolation in the base skew space, do this instead in the portfolio cumulative loss distribution space.

Taking a 5yr trade with quarterly payments, there would be 20 loss distributions. After 'calibrating' to the market prices, we have cumulative probabilities, for each of the 20 loss distributions, at the loss levels corresponding to the detachment points of the 5 market quotes. A spline is fitted through each loss distribution, i.e. one for each of the 20 payment periods. For any arbitrary detachment point, the cumulative probabilities at each of the payment periods can then be deduced.

An embedded inconsistency here for me is that, for a given loss level or detachment point (i.e. 3%, 7%, 10%, 15% and 30%, corresponding to the 5 market quotes), the base correlation for that detachment point is used for each of the 20 loss distribution values calculated in time, despite the fact that base correlation is acknowledged to be a function of time, i.e. the 5yr, 7yr and 10yr skews. This is where StructCred's conditional idea comes in, which I guess is similar in spirit to an approach suggested by Citigroup, although they apply it in the base correlation skew space with maturity rather than at the loss distribution level.

Still keen to get feedback on the consensus on the use of 'actual' models (local correlation ~ random factor loading, stochastic correlation, other): are other houses actively using them for non-standard tranches of the traded indices? Are there other models out there that calibrate well across markets and maturities and have relatively stable calibration parameters?
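For concreteness, a bare-bones sketch of that interpolation for a single payment date, with made-up cumulative probabilities standing in for the calibrated values at the quoted detachment points; a monotone spline keeps the implied density non-negative, which is exactly the failure mode a spline on the skew itself can hit.

```python
# Sketch of the loss-distribution-space interpolation for one payment date: fit a
# monotone spline through P(L <= K) at the calibrated detachment points and read
# off arbitrary detachments. Probabilities are made-up stand-ins.
import numpy as np
from scipy.interpolate import PchipInterpolator

detach   = np.array([0.00, 0.03, 0.07, 0.10, 0.15, 0.30, 1.00])
cum_prob = np.array([0.55, 0.90, 0.965, 0.980, 0.990, 0.998, 1.00])  # P(L <= K) at this pay date

F = PchipInterpolator(detach, cum_prob)

K = np.linspace(0.0, 0.30, 61)
assert np.all(np.diff(F(K)) >= -1e-12)   # no negative probability mass introduced by the interpolation
print(F(0.05), F(0.06))                  # cumulative probabilities at the 5% and 6% loss levels
```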
 
PKKoop
Posts: 0
Joined: June 24th, 2005, 1:05 pm

CDX nonstandard tranches

March 29th, 2007, 2:18 pm

I hope my questions are not silly or naive; this is not my area.

First, regarding the grid of loss distributions described by Stylz, why are there 40 pay dates for a 5Y deal? I had supposed that these were commonly quarterly pay.

Second, is this also how you would mark off-market deals? I mean, the CDX only rolls twice a year. Would you price a 2-month-old 3-7 tranche with a 7Y original maturity by interpolating the EL of the 3-7 tranches with current maturities of 5Y and 7Y, if the index was the same version? Smoothly, with a spline?
 
StructCred
Posts: 0
Joined: February 1st, 2007, 1:59 pm

CDX nonstandard tranches

March 30th, 2007, 4:23 pm

PKKoop: The 40 pay dates are related to the quarterly CDS payments. Since CDO tranches pay coupon on these dates, you need to compute the probability of each coupon being paid.

As far as off-the-run series go: in the IG world, off-the-run standard tranches are fairly well quoted (at least for the last few series). As such, you can build a loss distribution for each series and price non-standard tranches as necessary. If you don't have any quotes for the particular series, you can map the on-the-run series to the off-the-run one using one of the bespoke mapping methodologies. You should account for differences in maturities, i.e. if you're looking at a CDX5 tranche which was originally 10 years, the correlations should probably be somewhere between the 7 and 10 year CDX8 quotes. That being said, interpolating correlations directly is iffy, so you would probably want to price 8.5Y CDX8 tranches off of the loss distribution discussed above, calculate the implied bCorr skew and map that to the CDX5 pool.
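To make the "calculate the implied bCorr skew" step concrete, a simplified sketch that backs a base correlation out of a base-tranche expected loss using a large homogeneous pool one-factor Gaussian copula - a stand-in for whatever full pricer is actually used; the pool parameters and target EL are placeholders, not CDX8 calibration output.

```python
# Sketch: imply the base correlation that reproduces a given base-tranche expected
# loss under a large homogeneous pool one-factor Gaussian copula. Parameters are
# illustrative placeholders only.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq
from scipy.integrate import quad

p, R = 0.04, 0.40   # pool-average default probability to horizon, recovery (placeholders)

def base_tranche_el(K, rho):
    """E[min(L, K)] / K for detachment K in a homogeneous large pool."""
    def integrand(m):
        cond_p = norm.cdf((norm.ppf(p) - np.sqrt(rho) * m) / np.sqrt(1.0 - rho))
        loss = (1.0 - R) * cond_p
        return min(loss, K) * norm.pdf(m)
    return quad(integrand, -8.0, 8.0)[0] / K

def implied_base_corr(K, target_el):
    return brentq(lambda rho: base_tranche_el(K, rho) - target_el, 1e-4, 0.99)

# e.g. if the 8.5Y loss grid gave a 0-7 base-tranche EL of 30% of tranche size:
print(implied_base_corr(0.07, 0.30))
```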
 
PKKoop
Posts: 0
Joined: June 24th, 2005, 1:05 pm

CDX nonstandard tranches

April 2nd, 2007, 6:41 pm

Thank you very much for your kindness and your patience, StructCred.

I had not realized how standardized index tranches are - that mapping from an old off-the-run series is the only situation requiring maturity interpolation.

I still don't follow where the 40 pay dates come from. 5 x 4 = 20, and while one must calculate default, accrual and regular coupon payments, the respective probabilities required are complements, are they not?

One new question: does the index as a whole generally trade at the level implied by the underlying single-name CDSs, or is there usually a basis?
 
StructCred
Posts: 0
Joined: February 1st, 2007, 1:59 pm

CDX nonstandard tranches

April 2nd, 2007, 7:26 pm

The 40 pay dates correspond to 10Y tranches - most standard tranches are quoted for 5, 7 and 10Y maturities. As far as maturity interpolation goes, it's useful for other things as well, bespokes for example. While I've never seen these trade, in theory one could imagine a tranche on an on-the-run index with a non-standard maturity. As far as the indices go, there is usually a basis to the theoretical value implied by the single-name CDS.