October 2nd, 2015, 3:01 pm
Quote Originally posted by: Cuchulainn

Quote Originally posted by: Traden4Alpha

Quote Originally posted by: barny

Quote Originally posted by: outrun
That's not the forefront really. :D The most basic approach is to price an option with MC and then price it again with MC, but for S+h and S-h, to get the delta/gamma. That's what my 5 year old son would do. Doing that you'll see that there is a lot of sample noise, and that it can be reduced by re-using the same random numbers so that the bumped scenarios stay similar. ..

That's what I've seen at risk departments at banks. Banks also use the same technique to reduce day-to-day variation in the VAR. With a limited number of MC paths the sample noise will be big, and since VAR is an imprecise measure of risk anyway, with little absolute meaning, people are more interested in changes in VAR. But as we've seen, "bump and revalue" is essentially wrong. It depends in a non-trivial way on your delta S: too small causes problems and so does too big. And then there are the problems that Cuch mentions, written about by Glasserman, Jaeckel etc.

Indeed! The deeper problem is the use of a discrete process to estimate a continuum property. The same problem appears with roundoff error, where the discrete set of representable floating-point values fails to capture the continuum of the real numbers.

You can experience the same 'theoretical' numerical greek problems even in the absence of roundoff errors. Important as they are, they belong to the downstream implementation. What I am saying is that there is discrete (in numerical analysis) and discrete (in a computer). They are mutually orthogonal issues in the main.

Yes, they are orthogonal sources of unexpected error with different error properties. Yet as N -> ∞, they merge. Do an MC with floats and >2^32 samples (not an unusual scenario on GPUs) and one might be inclined to use a very small h that hits the roundoff resolution limit of float in the domain and range of the function.
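A minimal sketch of the bump-and-revalue idea outrun describes, with and without common random numbers, for a European call under Black-Scholes dynamics. All parameter values (spot, strike, bump size h, path count) are illustrative assumptions, not anything from the thread:

```python
import numpy as np

# Illustrative parameters (assumptions, not from the thread)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
h = 0.5        # bump size; too small amplifies noise, too big adds bias
n = 100_000    # number of MC paths

def mc_price(S, z):
    """MC price of a European call given standard-normal draws z."""
    ST = S * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

rng = np.random.default_rng(0)

# Common random numbers: the SAME draws price both S0+h and S0-h,
# so sample noise largely cancels in the central difference.
z = rng.standard_normal(n)
delta_crn = (mc_price(S0 + h, z) - mc_price(S0 - h, z)) / (2 * h)

# Independent draws: the difference of two noisy estimates, divided
# by a small 2h, is dominated by sample noise.
delta_ind = (mc_price(S0 + h, rng.standard_normal(n))
             - mc_price(S0 - h, rng.standard_normal(n))) / (2 * h)

print(delta_crn, delta_ind)
```

With common random numbers the estimate sits close to the analytic Black-Scholes delta; with independent draws it wanders by an amount comparable to the greek itself unless n is pushed much higher. The choice of h remains the separate, non-trivial problem the thread points at.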