Does anyone know of any papers discussing a simplified approach to CMS spread option pricing for giving "ball-park" prices? In particular, I'm wondering if it is possible to simply use the two points in isolation, without the term structure, and just take vols and correlations between the two points? Thanks.

I am skeptical... I think that even with the complex approach, and if you are very lucky, you only get ball-park prices: the quantities required to price CMS spread options (the full smile out to +infinity for each swaption whose expiry matches that of each caplet, needed to compute the convexity adjustments, plus the term structure of correlations) are too far from being observable to ever hope for more than a ball-park price.

Hi analyst77,

When you price a single CMS option, you need to do a change of measure from an annuity measure to a forward measure. If you want to price a CMS spread option, you need to do two such changes of measure. Once you do that you will have one-dimensional distributions, and you need to couple them using some copula. After you get your 2-dim distribution, you price your payoff. Two main sources of risk here:
- the change of measure, just like in the 1-dim case
- the copula assumption

Best of luck.
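To make the copula step concrete, here is a minimal Monte Carlo sketch. All numerical inputs are hypothetical, and it makes two loud simplifications not claimed by anyone in this thread: the two convexity-adjusted CMS rates are taken as lognormal marginals (in practice the marginals come from the full swaption smile via the change of measure), and the copula is Gaussian.

```python
import numpy as np

def cms_spread_option_mc(f1, f2, vol1, vol2, rho, T, K,
                         n_paths=200_000, seed=42):
    """Ball-park CMS spread option price via a Gaussian copula.

    Simplifying assumptions (illustrative only): the two
    convexity-adjusted CMS rates are lognormal marginally, and they
    are coupled by a Gaussian copula with correlation rho.
    Pays max(S1 - S2 - K, 0) at T; discounting is omitted.
    """
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n_paths)
    # correlate the second Gaussian driver with the first
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
    # lognormal marginals with the given (convexity-adjusted) forwards
    s1 = f1 * np.exp(-0.5 * vol1**2 * T + vol1 * np.sqrt(T) * z1)
    s2 = f2 * np.exp(-0.5 * vol2**2 * T + vol2 * np.sqrt(T) * z2)
    return float(np.mean(np.maximum(s1 - s2 - K, 0.0)))

# hypothetical inputs: 10y CMS at 4%, 2y CMS at 3%, 5y expiry, zero strike
price = cms_spread_option_mc(0.04, 0.03, 0.20, 0.25, 0.8, 5.0, 0.0)
```

Note how the two sources of risk show up directly: the forwards f1, f2 carry the change-of-measure (convexity) assumption, and rho carries the copula assumption; lowering rho widens the spread distribution and raises the price.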

http://ssrn.com/abstract=1520775 may be of assistance

Thank you for your helpful replies, folks. Also Mark, thank you (and I love your books, btw).

Hello Mark,

I just had a chance to go through the paper you sent, and it's exactly what I'm looking for. I implemented another model that was good on short-term spread options but then had very significant divergence in the medium and long term. I'm not familiar with research in the area, and I'm wondering if you could help clear up some sources of confusion for me. To a general practicing quant I am sure these are obvious points, but they are eluding me. Perhaps it'll be helpful to other newbies as well.

A) In (4.4), how are the alphas determined? It's noted they are displaced-diffusion coefficients, but I don't follow how they are determined.
B) In (4.4), how is mu10 determined?
C) In the footnote on page 11 it is noted that preserving rho10,j means b10,1 = 1 and b10,2 = 0. Since this is a 10-year/2-year spread, is that a valid assumption?

I know you include C++ code with your books; I'm wondering if you would happen to have code for this that I could look at? That would make things crystal clear.


A) They are a calibration parameter; you pick them to match the swaption smile, e.g. to get the same skew at the money.
B) mu10 is the no-arbitrage drift of SR10, approximated using predictor-corrector as in section 2. See also the Joshi-Liesch paper.
C) Not sure what you are asking. We make the choice of preserving the correlation between the ten-year rate and all the other rates. In particular, the correlation between the 10- and 2-year rates, which is all that really matters, is preserved.
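For readers unfamiliar with point B, here is a generic sketch of the predictor-corrector idea for a state-dependent drift (this is the general technique, not the paper's exact formula): evolve with the drift frozen at its initial value, then re-evolve using the average of the initial drift and the drift evaluated at the predicted endpoint.

```python
import numpy as np

def predictor_corrector_step(x0, drift, sigma, dt, z):
    """One predictor-corrector step for dX/X = mu(X) dt + sigma dW.

    Generic illustration of the technique: the 'predictor' evolves X
    with the drift frozen at x0; the 'corrector' re-evolves X using
    the average of the start drift and the drift at the predicted
    endpoint. This is second-order accurate in the drift at the cost
    of one extra drift evaluation.
    """
    mu0 = drift(x0)
    # predictor: log-Euler step with the initial drift
    x_pred = x0 * np.exp((mu0 - 0.5 * sigma**2) * dt
                         + sigma * np.sqrt(dt) * z)
    # corrector: average the drifts at the two endpoints
    mu_avg = 0.5 * (mu0 + drift(x_pred))
    return x0 * np.exp((mu_avg - 0.5 * sigma**2) * dt
                       + sigma * np.sqrt(dt) * z)
```

When the drift is constant the corrector changes nothing, and the step reduces to an exact lognormal step; the benefit appears when mu depends on the state, as the no-arbitrage drift of a swap rate does.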

Thank you for your reply, and thank you for bearing with me. Patience is required for someone as new as I am. ;-)

A) I assume what we want is to select alphas such that (4.4) and (4.5) equal the market swap rates? If so, wouldn't the alphas need to vary for each abscissa under the Gauss quadrature expansion?
B) I will check this out, thank you.
C) Yes, a key feature of the paper is you showing that skew doesn't matter much. I had made the erroneous assumption that b10,2 represents the correlation between the 10 and 2, but it obviously does not. My mistake.

Re A, the model is

d(X + \alpha) = (X + \alpha) \sigma dW_t

Under this model, E(X_T) = X_0. \sigma determines the volatility; \alpha affects the skew and the implied volatility. We pick X_0 and then pick \sigma and \alpha to match the smile as well as we can. In practical terms that means the ATM volatility and skew. They are constants within the model.


Quote, originally posted by mj: "More Mathematical Finance is on its way!"

Fantastic! Just spotted this, any more details?

PS: sorry for hijacking the thread.

Updates on my Facebook page. Currently 431 pages. I am concentrating on cleaning existing chapters at the moment. However, I may write two or three more.
