
 
yugmorf2
Topic Author
Posts: 0
Joined: November 21st, 2010, 5:18 pm

positive definite = correlation matrix?

November 27th, 2011, 10:55 am

Looking for some advice on the conditions for a valid correlation matrix:

1) A positive definite correlation matrix is a necessary condition for the correlations to be valid, but I'm wondering whether ALL positive definite matrices are also valid correlation matrices.

2) Can anyone recommend a technique for adjusting a correlation matrix to make it positive definite which also takes into account the degree of confidence one has about each of the correlation matrix elements?

Thanks
 
bwarren
Posts: 0
Joined: February 18th, 2011, 10:44 pm

positive definite = correlation matrix?

November 27th, 2011, 3:31 pm

A covariance matrix must be positive-semidefinite and symmetric. In addition, a correlation matrix will only have values with absolute value <= 1 and 1's on the diagonal.
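
For concreteness, here is a minimal sketch that checks those conditions numerically, assuming Python/NumPy (the function name and tolerance are just illustrative):

import numpy as np

def is_valid_correlation(C, tol=1e-10):
    # Checks the conditions above: symmetric, 1's on the diagonal,
    # entries with absolute value <= 1, and positive semidefinite.
    C = np.asarray(C, dtype=float)
    if not np.allclose(C, C.T, atol=tol):
        return False
    if not np.allclose(np.diag(C), 1.0, atol=tol):
        return False
    if np.any(np.abs(C) > 1.0 + tol):
        return False
    # PSD check via eigenvalues; allow tiny negatives from rounding.
    return np.linalg.eigvalsh(C).min() >= -tol

A matrix that fails only the eigenvalue check is the case the original question is about: symmetric with a unit diagonal and entries in [-1, 1], but not PSD.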
 
yugmorf2
Topic Author
Posts: 0
Joined: November 21st, 2010, 5:18 pm

positive definite = correlation matrix?

November 28th, 2011, 12:28 am

Yes, that makes sense. Can we say that all such matrices - PSD, symmetric, entries with absolute value at most 1 and 1's on the diagonal - are valid correlation matrices? And then, suppose there are certain elements of the correlation matrix I'm more sure about than others (say, due to sparse data for the latter): how does one make a sensible adjustment that takes account of the level of confidence one has about each element?
 
MattF
Posts: 6
Joined: March 14th, 2003, 7:15 pm

positive definite = correlation matrix?

November 28th, 2011, 11:35 am

Have a look at this thread: Cholesky decomposition
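
For reference, the usual connection here is that a Cholesky factorisation exists only for positive definite matrices, so attempting one doubles as a quick test. A minimal sketch, assuming Python/NumPy (the helper name is made up):

import numpy as np

def is_positive_definite(C):
    # NumPy's Cholesky raises LinAlgError unless C is
    # (numerically) positive definite.
    try:
        np.linalg.cholesky(C)
        return True
    except np.linalg.LinAlgError:
        return False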
 
crmorcom
Posts: 0
Joined: April 9th, 2008, 4:09 pm

positive definite = correlation matrix?

November 28th, 2011, 2:52 pm

There are a few ways you can go about this. If you know the variance-covariance matrix, of course, you know the correlation matrix, so I'm not going to say much more about correlations directly. All these methods enforce positive-definiteness of the variance-covariance matrix.

1) If your data observations are multivariate Gaussian, N(m, V), and you have no missing observations, then, provided you can express your priors on the inverse variance-covariance matrix, V^{-1}, as a Wishart distribution, you are OK. This is because the Bayesian posterior distribution of V^{-1} is also Wishart (a sketch of this update is given below). Have a look at http://en.wikipedia.org/wiki/Wishart_distribution, and also read about conjugate priors in Bayesian statistics.

2) If you know (or assume) the likelihood of your data, you can estimate the variance-covariance matrix using maximum-likelihood estimation. This way, you can impose whichever restrictions you like, quite explicitly. This method is not particularly good for imposing vague restrictions, but if you, say, want to fix \rho_{12} to a particular value, you can do that easily and, furthermore, can use likelihood ratio tests to tell how much your restriction affects things.

3) As above, you have to know the likelihood. You can assume completely general priors on V or, if you like, on P, the correlation matrix. If you know the posterior distribution (prior x likelihood), you can use Markov chain Monte Carlo simulation (MCMC) to estimate the posterior distribution of your parameters in almost complete generality.
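
Not from the post above, but here is a minimal sketch of the conjugate update in 1), assuming Python/NumPy, multivariate Gaussian data with a known mean vector, and a Wishart(nu0, W0) prior on the precision matrix V^{-1} (all names and parameters are illustrative):

import numpy as np

def wishart_posterior(X, mean, nu0, W0):
    # Conjugate update for the precision matrix V^{-1} of N(mean, V) data,
    # with the mean treated as known.  Prior: Wishart(nu0, W0).
    # Posterior: Wishart(nu_n, W_n) with nu_n = nu0 + n and
    # W_n = (W0^{-1} + S)^{-1}, where S is the scatter matrix about the mean.
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    D = X - mean
    S = D.T @ D                      # sum_i (x_i - m)(x_i - m)^T
    nu_n = nu0 + n
    W_n = np.linalg.inv(np.linalg.inv(W0) + S)
    return nu_n, W_n

The posterior mean of V^{-1} is nu_n * W_n; inverting it gives a positive definite point estimate of V that blends the prior with the data, and rescaling by the standard deviations turns that into a valid correlation matrix.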