February 8th, 2003, 5:49 pm
Normally, if I wanted to do some Monte Carlo simulation, I'd generate a bunch of variates with a given volatility and correlation. I would generate them by taking a Cholesky decomposition of the desired covariance matrix and multiplying it by some mean-zero, standard-deviation-one random numbers that I had generated by Box-Muller.

Now, I understand that Sobol sequences "fill in" the unit hypercube more efficiently than something like pseudo-random uniforms, but my instincts tell me that I will violate some assumption of Monte Carlo by simply taking a Sobol sequence and multiplying it by the Cholesky decomposition of the covariance matrix. Any insight [as to how to work around what I might mistakenly perceive as a problem] gratefully appreciated.
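For concreteness, here is a minimal sketch of the standard pseudo-random approach described above, using NumPy. The covariance values are illustrative, not from the post, and the sample size is arbitrary:

```python
import numpy as np

# Illustrative 2x2 target covariance matrix (assumed values)
cov = np.array([[0.04, 0.018],
                [0.018, 0.09]])

# Cholesky factor: L is lower-triangular with L @ L.T == cov
L = np.linalg.cholesky(cov)

rng = np.random.default_rng(0)
n = 100_000

# Box-Muller: map pairs of uniforms on (0, 1] to independent standard normals
u1 = 1.0 - rng.random(n)  # shift to (0, 1] so log(u1) is finite
u2 = rng.random(n)
r = np.sqrt(-2.0 * np.log(u1))
z = np.column_stack([r * np.cos(2.0 * np.pi * u2),
                     r * np.sin(2.0 * np.pi * u2)])

# Correlate: each row z_i ~ N(0, I) becomes x_i = L z_i ~ N(0, cov)
x = z @ L.T

print(np.cov(x, rowvar=False))  # sample covariance should be close to cov
```

The concern in the question is well-founded in one respect: a Sobol sequence produces uniforms on the unit hypercube, so before the Cholesky step those uniforms must first be mapped to standard normals (the inverse normal CDF is the usual choice for quasi-random points, since Box-Muller can distort the low-discrepancy structure).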
Last edited by tonyc on February 7th, 2003, 11:00 pm, edited 1 time in total.