December 18th, 2013, 8:27 am
Suppose I have a GBM without drift, [$]dS_t = S_t \sigma dW_t[$], and define [$]M = \max_{0\leq t\leq T}S_t[$] as its running maximum. I am looking for the expected spot conditional on [$]S[$] never having reached some barrier [$]B>S_0[$], i.e. [$]E(S_T \mid M < B)[$].

My thinking was to integrate over the joint density [$]f_{M,\hat{W}}(m,w)[$] of a Brownian motion with drift [$]\hat{W}[$] (the drift coming from the convexity adjustment [$]-0.5\sigma^2 t[$]) and its maximum [$]M[$]. Setting [$]b=\frac{\log(B/S_0)}{\sigma}[$], I compute
[$]E(S_T \mid M < B) = \int^b_0 \int^b_w S_0 e^{\sigma w} f_{M,\hat{W}}(m,w)\, dm\, dw + \int^0_{-\infty} \int^b_0 S_0 e^{\sigma w} f_{M,\hat{W}}(m,w)\, dm\, dw.[$]
The whole setup (including the joint density) can e.g. be found in Shreve, Stochastic Calculus for Finance II, p. 295ff. While I can solve the above integral without problems, it gives me odd results that deviate from a simulated expectation.

Many thanks for your help/comments.
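For what it's worth, here is a minimal sketch (not my production code) of how I check the double integral against a Monte Carlo estimate. It uses the joint density of a drifted Brownian motion and its running maximum from Shreve (Thm 7.2.1) with drift [$]\alpha = -\sigma/2[$], and the parameter values (S0, B, sigma, T, path/step counts) are just illustrative assumptions:

[code]
import numpy as np
from scipy.integrate import dblquad

# Illustrative parameters (assumptions, not from the original post)
S0, B, sigma, T = 100.0, 120.0, 0.2, 1.0
alpha = -0.5 * sigma          # drift of W_hat, so that S_t = S0 * exp(sigma * W_hat_t)
b = np.log(B / S0) / sigma    # barrier level in W_hat coordinates

# Joint density of (M_hat_T, W_hat_T) for W_hat_t = alpha*t + W_t,
# valid on m >= max(w, 0) (Shreve, Thm 7.2.1)
def joint_density(m, w):
    return (2.0 * (2.0 * m - w) / (T * np.sqrt(2.0 * np.pi * T))
            * np.exp(alpha * w - 0.5 * alpha**2 * T - (2.0 * m - w)**2 / (2.0 * T)))

payoff_integrand = lambda m, w: S0 * np.exp(sigma * w) * joint_density(m, w)

# E[S_T * 1_{M < B}]: region split exactly as in the post
part1, _ = dblquad(payoff_integrand, 0.0, b, lambda w: w, lambda w: b)        # 0 <= w <= m <= b
part2, _ = dblquad(payoff_integrand, -np.inf, 0.0, lambda w: 0.0, lambda w: b)  # w < 0 <= m <= b
restricted = part1 + part2

# P(M < B): same integrals without the payoff factor
p1, _ = dblquad(lambda m, w: joint_density(m, w), 0.0, b, lambda w: w, lambda w: b)
p2, _ = dblquad(lambda m, w: joint_density(m, w), -np.inf, 0.0, lambda w: 0.0, lambda w: b)
prob_no_hit = p1 + p2

# Monte Carlo: driftless GBM; the discretised maximum is biased low on coarse grids
rng = np.random.default_rng(0)
n_paths, n_steps = 200_000, 500
dt = T / n_steps
dW = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
logS = np.log(S0) + np.cumsum(sigma * dW - 0.5 * sigma**2 * dt, axis=1)
S_T = np.exp(logS[:, -1])
never_hit = np.exp(logS.max(axis=1)) < B

print("integral  E[S_T 1{M<B}]         =", restricted)
print("integral  E[S_T 1{M<B}]/P(M<B)  =", restricted / prob_no_hit)
print("MC        E[S_T 1{M<B}]         =", np.mean(S_T * never_hit))
print("MC        E[S_T | M<B]          =", S_T[never_hit].mean())
[/code]

The script prints both the restricted expectation [$]E(S_T 1_{M<B})[$] and its normalisation by [$]P(M<B)[$], alongside the two corresponding Monte Carlo estimates, so it should be easy to see which quantity the integral actually matches.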