For standard normals, make two draws Z1 and Z2 from your favorite (random normal) generator. Then use Z1 and Z3 = rho Z1 + sqrt(1 - rho^2) Z2, where rho is the desired correlation.
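The two-draw recipe above can be sketched in Python with NumPy; the correlation value and the sample size are illustrative choices, not from the original post:

```python
import numpy as np

rng = np.random.default_rng(42)
rho = 0.7  # desired correlation (illustrative value)

# Draw two independent standard normals, many times over so we can
# check the sample correlation.
z1 = rng.standard_normal(100_000)
z2 = rng.standard_normal(100_000)

# Z3 = rho*Z1 + sqrt(1 - rho^2)*Z2 is standard normal with corr(Z1, Z3) = rho.
z3 = rho * z1 + np.sqrt(1.0 - rho**2) * z2

print(np.corrcoef(z1, z3)[0, 1])  # close to rho = 0.7
```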

If you want n random normals with correlation matrix C, then find a matrix A such that C = AA^{t}. Now draw n independent normals W = (W_1, ..., W_n) and put Z = AW. NB: there are lots of such A's. MJ

The most widely used is the Cholesky decomposition, where A is a lower triangular matrix with positive diagonal elements. Someone on the forum mentioned a link which explicitly shows how to perform the Cholesky decomposition: http://ikpe1101.ikp.kfa-juelich.de/brie ... de33.htmld.
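A minimal NumPy sketch of the Cholesky route for MJ's Z = AW recipe, using an illustrative 3x3 correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target correlation matrix C (illustrative 3x3 example).
C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

# Lower-triangular A with positive diagonal such that C = A @ A.T.
A = np.linalg.cholesky(C)

# Draw independent standard normals W and set Z = A @ W.
W = rng.standard_normal((3, 200_000))
Z = A @ W

print(np.corrcoef(Z))  # sample correlation matrix, close to C
```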

Cholesky is indeed one popular method, using a triangular decomposition; however, many use a spectral method based on matrix diagonalization and argue the results are better. PJ is the expert ...

I was taught Cholesky in grad school, but Reza wrote "spectral method using matrix diagonalization and argue the results are better". I have no idea what this is. PJ or Reza, please explain... please?

Again, the expert is PJ, but if you just replace the decomposition of the covariance matrix into triangular matrices with a diagonalization, you could do the same thing, no? Instead of

x = X
y = rX + sqrt(1 - r^2) Y

you would have something like

x = aX + bY
y = cX + dY

James - I'm guessing the spectral approach is as follows: write M = ADA^-1. Take square roots of the diagonal elements of D and call the result S, i.e. D = SS. Then M = (ASA^-1)(ASA^-1). If A is orthonormal by construction (i.e. it is orthogonal by construction and we took the 'ovals' out by using D), then A^-1 = A^T. S = S^T also, so if C = ASA^-1 then M = CC^T as well.

I am guessing that the performance is improved because the A's are orthonormal and hence behave well when multiplied by a vector of normal random variates... all the magnification/compression is inside S, which is safely diagonal.

Thanks Reza & kr, I am going to try this, and will be back for more advice if I screw it up (p = 100%).

One issue is the performance of low-discrepancy numbers. If you take low-discrepancy numbers that work better in low dimensions than in high ones (this is true of all low-discrepancy numbers of which I am aware), then you want the low dimensions to do more work. Spectral pseudo-square-rooting is one way to achieve this. For ordinary pseudo-randoms it shouldn't make any difference to the convergence rate.

Additionally, some people prefer the spectral method for numerical stability reasons. The method is: write

C = PDP^{t}

with P orthogonal and D diagonal. Let D^{1/2} be D with the elements square-rooted. Put A = PD^{1/2}.

MJ
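MJ's spectral recipe (C = PDP^{t}, A = PD^{1/2}) can be sketched with NumPy as below. Sorting the eigenvalues in descending order is my own assumption, motivated by the point about letting the low dimensions do more work; the 3x3 matrix is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative positive-definite correlation matrix.
C = np.array([[1.0, 0.8, 0.6],
              [0.8, 1.0, 0.5],
              [0.6, 0.5, 1.0]])

# Spectral decomposition C = P D P^T (eigh returns eigenvalues ascending).
d, P = np.linalg.eigh(C)

# Sort eigenvalues descending so the first columns of A carry the most
# variance -- useful when pairing with low-discrepancy sequences.
order = np.argsort(d)[::-1]
d, P = d[order], P[:, order]

# Pseudo-square root A = P D^{1/2}, so A @ A.T reproduces C.
A = P @ np.diag(np.sqrt(d))

W = rng.standard_normal((3, 200_000))
Z = A @ W
print(np.allclose(A @ A.T, C))  # True
```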

- WaaghBakri

A naive question... to create n correlated or independent RVs, must we worry about the non-sequential drawing that results when we use the same generator? Or, equivalently, if I choose every n-th sample from a single stream output by a random number generator, will I ruin the generator's properties?

The whole discussion on this topic so far has been concerned with normal distributions. More generally, you might want to simulate correlated numbers from an arbitrary collection of marginal distributions.

This is the basic problem that insurance companies face all the time: estimating their total P+L distribution, when the individual parts of their business (different classes of insurance, investments, and so on) are correlated, and have weird and wonderful marginal distributions. The only 'derivatives' application I know is modelling portfolios of weather derivatives, some of which may have very non-normal distributions (e.g. the number of days it snows in winter).

Even given the correlations, there is no unique answer, unlike for the normal case. The standard method used in practice is due to Iman and Conover. My understanding of it is that it works as follows:

1) calculate rank correlations
2) convert them to linear correlations using a cunning formula
3) simulate (using Cholesky or SVD) from a multivariate normal with these correlations
4) convert the normally distributed simulations to the right marginal distributions using the CDF of the normal distribution and the inverse CDF of the marginal distribution.

To go further than this, you could start looking at copulas.

Sloth
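Steps 3) and 4) can be sketched with NumPy and SciPy. The correlation and the two marginals (lognormal and exponential) are illustrative choices of mine, and step 2)'s "cunning formula" is deliberately omitted here, so the draws follow a plain Gaussian copula rather than matching a prescribed rank correlation exactly:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Step 3: simulate correlated standard normals via Cholesky.
C = np.array([[1.0, 0.6],
              [0.6, 1.0]])
Z = np.linalg.cholesky(C) @ rng.standard_normal((2, 100_000))

# Step 4: normal CDF maps each row to uniforms on (0, 1); the inverse
# CDFs of the desired marginals then give correctly distributed samples.
U = stats.norm.cdf(Z)
x0 = stats.lognorm.ppf(U[0], s=0.5)   # lognormal marginal (illustrative)
x1 = stats.expon.ppf(U[1])            # exponential marginal (illustrative)

# The monotone maps preserve rank correlation; for a bivariate normal,
# Spearman's rho relates to the linear rho via (6/pi) * arcsin(rho / 2).
print(stats.spearmanr(x0, x1))
```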

I think that to generate correlated multivariate normal random variables, one would apply the Cholesky or spectral method to the CORRELATION matrix. Some people here talk about the COVARIANCE matrix. Will using the covariance matrix (X*sqrt(t), where X is N(0,1)) be the same as using the correlation matrix and then multiplying by the individual volatilities (sigma*Z*sqrt(t))?

Why do two mults when you can get away with one? MJ
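The equivalence asked about can be checked numerically: with Sigma = diag(sigma) C diag(sigma) and the Cholesky factor being unique, chol(Sigma) = diag(sigma) chol(C), so the two routes give identical draws from the same W. The vols and correlation below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

vols = np.array([0.2, 0.3])          # individual volatilities (illustrative)
C = np.array([[1.0, 0.4],
              [0.4, 1.0]])           # correlation matrix
Sigma = np.outer(vols, vols) * C     # covariance matrix

W = rng.standard_normal((2, 100_000))

# Route 1: Cholesky of the covariance matrix directly (one mult).
Z_cov = np.linalg.cholesky(Sigma) @ W

# Route 2: Cholesky of the correlation matrix, then scale by the vols.
Z_cor = vols[:, None] * (np.linalg.cholesky(C) @ W)

print(np.allclose(Z_cov, Z_cor))  # True
```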
