July 1st, 2015, 1:24 pm
Quote Originally posted by: mutley
I might be misreading you, but doesn't that require you to know the variance of the payoff? I.e. we want to know whether the price of the derivative has converged, and you are telling me that M is a function of Var(X), X being the price of the derivative on a given sample?

Quote Originally posted by: emac
Yes, although an upper bound will do. In practice, an estimate of the variance would also suffice. I think this is the part Cuch takes issue with as well. Not sure why, though.

Quote Originally posted by: Cuchulainn
So, I am not alone :) I would like to see the implementation of this approach. How does it work, really? Not all maths is constructive.

Quote Originally posted by: emac
This is just a way to pick the number of MC paths you need to get a given error with a given probability. It is constructive in the sense that you pick M based on your inputs ([$]\epsilon[$] and the probability you pick) and things you can bound or estimate (the variance).

Quote Originally posted by: Cuchulainn
That's clear. As I have never used this approach, I do not see how to compute M. Has anyone written an article on this? I am open to ideas + a bit of healthy scepticism.

Pick an error threshold [$]\epsilon[$] and a probability [$]p[$], and set [$]M = \frac{Var(X)}{p \, \epsilon^2}[$]. Since the sample mean [$]X_M[$] has variance [$]Var(X)/M[$], Chebyshev's inequality gives [$]P(|X_M-\mu|>\epsilon) \leq \frac{Var(X)}{M \epsilon^2} = p[$]. If you do not know [$]Var(X)[$], you can use an upper bound.

For example, suppose [$]B[$] is a Brownian motion and [$]X=|B_T|[$]. Then [$]Var(X) \leq E(B_T^2)=T[$], so choosing, say, [$]\epsilon=10^{-2}[$] and [$]p=10^{-2}[$], taking [$]M = 10^6 \, T[$] guarantees that with probability 0.99 your MC estimate is within [$]10^{-2}[$] of the true value.

Chebyshev is possibly a bit crude, and sharper inequalities might allow you to choose a lower [$]M[$]. Of course, in practice you may also be able to take [$]M[$] much smaller.
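Since Cuch asked how to compute M in practice, here is a minimal sketch in Python of the recipe above, using the [$]X=|B_T|[$] example. The helper name chebyshev_paths is mine, not standard; the variance bound, [$]\epsilon[$], and [$]p[$] are the ones from the post. For [$]X=|B_T|[$] the exact mean [$]E|B_T| = \sqrt{2T/\pi}[$] is known, so we can check the estimate against it.

```python
import math
import random

def chebyshev_paths(var_bound, eps, p):
    """Smallest M such that Chebyshev guarantees
    P(|X_M - mu| > eps) <= Var(X)/(M * eps^2) <= p,
    i.e. M >= var_bound / (p * eps^2)."""
    return math.ceil(var_bound / (p * eps * eps))

# Example from the post: X = |B_T|, with Var(X) <= E[B_T^2] = T.
T = 1.0
eps, p = 1e-2, 1e-2
M = chebyshev_paths(T, eps, p)   # = 10^6 * T paths

# Plain Monte Carlo estimate of E|B_T|; B_T ~ N(0, T).
rng = random.Random(42)
sigma = math.sqrt(T)
estimate = sum(abs(rng.gauss(0.0, sigma)) for _ in range(M)) / M

exact = math.sqrt(2.0 * T / math.pi)   # known closed form for E|B_T|
print(M, estimate, exact)
```

As the last paragraph of the post says, Chebyshev is crude: the actual error here is typically far below [$]10^{-2}[$], since the standard error with [$]M=10^6[$] is only about [$]\sqrt{(1-2/\pi)/M} \approx 6 \times 10^{-4}[$].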