
### Re: What could go possibly wrong ...

Posted: **October 4th, 2017, 8:03 pm**

by **Cuchulainn**

Nice figure (although it should extend all the way to four of the vertices of that cube).

**D I think Alan can answer this**

Yes, [-0.5, 0.5]^3 is entirely safe in that it corresponds to acos(0.50) = 60° to acos(-0.5) = 120° angles between the original data vectors. But any correlations outside [-0.5,0.5] can induce constraints on the other rhos.
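A quick numerical sanity check of that claim (a minimal numpy sketch; the names a, b, c for the three off-diagonal rhos are mine). Interior triples are always PSD, a corner such as (0.5, 0.5, -0.5) sits exactly on the boundary, and stepping outside the cube can break validity:

```python
import numpy as np

rng = np.random.default_rng(0)

def min_eigenvalue(a, b, c):
    """Smallest eigenvalue of the 3x3 correlation matrix with off-diagonals a, b, c."""
    R = np.array([[1.0, a, b],
                  [a, 1.0, c],
                  [b, c, 1.0]])
    return np.linalg.eigvalsh(R).min()

# Sample off-diagonal triples inside (-0.5, 0.5): all should be PSD.
inside = [min_eigenvalue(*rng.uniform(-0.5, 0.5, 3)) for _ in range(10_000)]
all_psd_inside = min(inside) >= -1e-10

# A corner of the cube, e.g. (0.5, 0.5, -0.5), is exactly singular (zero eigenvalue),
# and pushing just past it breaks positive semidefiniteness.
corner = min_eigenvalue(0.5, 0.5, -0.5)
outside = min_eigenvalue(0.6, 0.6, -0.6)
```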

**D is there something sacred in cos() or are there others?**

Expanding the algorithm would probably mean either:

1) using the closed-form equation for the determinant of the NxN correlation matrix.

**D How big is N? Do determinants work these days? Last time I did determinants was at school, maybe I missed something.**

2) doing an eigendecomposition of the matrix with a sensitivity analysis WRT the rho values, to find the perturbations of the values that bring all the negative-valued modes to zero.

**D negative e-values -> 0?**

3) Expressing the constraint in terms of acos(rho) angles (and distributions of those angles), with combinatoric logic to determine which chains of rhos are mutually incompatible.

**D cos() is a wiggly function going to eternity; what about $\rho = \tanh(\theta)$?**


### Re: What could go possibly wrong ...

Posted: **October 4th, 2017, 8:18 pm**

by **outrun**

The two industry standards are 1) set negative eigenvalues to zero 2) find the closest valid correlation matrix (Frobenius norm) using a Newton solver.
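Fix 1 can be sketched in a few lines of numpy: clip the negative eigenvalues, then renormalize so the diagonal is exactly 1 again (one common convention for the renormalization, not the only one):

```python
import numpy as np

def clip_to_psd(R):
    """Industry-standard fix 1: set negative eigenvalues to zero,
    then rescale so the diagonal is exactly 1 again."""
    w, V = np.linalg.eigh(R)
    w = np.clip(w, 0.0, None)          # zero out the negative modes
    A = V @ np.diag(w) @ V.T           # now PSD, but the diagonal drifts from 1
    d = np.sqrt(np.diag(A))
    A = A / np.outer(d, d)             # renormalize to unit diagonal
    return (A + A.T) / 2               # symmetrize against round-off

# An invalid "correlation" matrix: pairwise values that cannot coexist.
R_bad = np.array([[1.0,  0.9,  0.9],
                  [0.9,  1.0, -0.9],
                  [0.9, -0.9,  1.0]])
R_fix = clip_to_psd(R_bad)
min_eig = np.linalg.eigvalsh(R_fix).min()
```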

### Re: What could go possibly wrong ...

Posted: **October 5th, 2017, 8:13 pm**

by **Cuchulainn**

*Newton solver.*

Not a root finder but a least-squares optimiser is what is meant, I assume.

### Re: What could go possibly wrong ...

Posted: **October 5th, 2017, 10:25 pm**

by **Traden4Alpha**

**acos(rho)**

acos() is not sacred, but it is extremely useful because acos(rho_ij) is the angle between the data vectors i and j that created the covariance matrix and correlation matrix. It enables checking (and potentially correcting) the elements of a matrix of rho values for consistency with the geometry of the data. For example, given rho_ij and rho_jk, |acos(rho_ij) - acos(rho_jk)| < acos(rho_ik) < acos(rho_ij) + acos(rho_jk), where the comparison operators are variants that handle the wrapping of angles (acos itself returns values in [0, π]). It's the triangle inequality in spherical coordinates. One can develop similar constraints on longer circuits of rho, but that involves more combinatoric logic about the comparisons and interval-wrapping effects.

What's nice is that these constraints on angle are somewhat "local": it may be quite easy to conclude that a particular rho is the likely cause of the trouble (the percentage of violated inequalities is higher for a particular i,j), which can direct better estimation efforts toward that particular rho.
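The pairwise angle constraint can be checked numerically. A sketch (random unit vectors stand in for the original data vectors, which is my assumption about the setup), plus a triple of rhos that violates it:

```python
import numpy as np

rng = np.random.default_rng(1)

def angles(u, v, w):
    """Pairwise angles acos(rho) between three unit vectors."""
    t = lambda x, y: np.arccos(np.clip(x @ y, -1.0, 1.0))
    return t(u, v), t(v, w), t(u, w)

# For angles between real data vectors the spherical triangle inequality
# |t_ij - t_jk| <= t_ik <= t_ij + t_jk must hold.
ok = True
for _ in range(5_000):
    u, v, w = (x / np.linalg.norm(x) for x in rng.standard_normal((3, 5)))
    t_uv, t_vw, t_uw = angles(u, v, w)
    ok &= abs(t_uv - t_vw) - 1e-12 <= t_uw <= t_uv + t_vw + 1e-12

# The incompatible triple rho_12 = rho_13 = 0.9, rho_23 = -0.9 violates it:
t12 = t13 = np.arccos(0.9)
t23 = np.arccos(-0.9)
violated = t23 > t12 + t13    # no three vectors can realize these angles
```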

**Determinants**

It's just a lot of crank turning, although it does scale poorly to large N. I'd not be surprised if the determinant of a covariance or correlation matrix is a bit easier due to symmetry, but I've never looked into it.

**negative e-values -> 0**

Yep! It's just what outrun said.

**cos() is a wiggly function going to eternity**

True, but in this case it's restricted: acos() returns angles in [0, π] between vectors in spherical coordinates.

### Re: What could go possibly wrong ...

Posted: **October 8th, 2017, 1:33 pm**

by **Cuchulainn**

**Determinants**

*It's just a lot of crank turning although it does scale poorly to large N. I'd not be surprised if the determinant of a covariance or correlation matrix is a bit easier due to symmetry but I've never looked into it.*

If the eigenvalues of A are known then det(A) = product of eigenvalues?

### Re: What could go possibly wrong ...

Posted: **October 8th, 2017, 5:23 pm**

by **outrun**

Yes, the determinant is the "volume" of the box spanned by the eigenvectors scaled by their eigenvalues.
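A one-line numerical check of that identity on a randomly generated correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# det(A) equals the product of the eigenvalues (counted with multiplicity).
X = rng.standard_normal((50, 4))
A = np.corrcoef(X, rowvar=False)       # a 4x4 sample correlation matrix

det_direct = np.linalg.det(A)
det_from_eigs = np.prod(np.linalg.eigvalsh(A))
match = np.isclose(det_direct, det_from_eigs)
```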

### Re: What could go possibly wrong ...

Posted: **October 9th, 2017, 9:56 am**

by **Cuchulainn**

Determinants were widely discussed in the 19th century. Are they still used in practice? (I have never seen them used.)

http://www.mathunion.org/ICM/ICM1974.2/ ... 70.ocr.pdf
In principle, it is possible to compute determinants recursively using cofactors, but I suspect the efficiency would not be great.
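A sketch of that cofactor recursion against numpy's LU-based determinant; the O(n!) cost of the recursion is exactly why the efficiency is poor for large N:

```python
import numpy as np

def det_cofactor(A):
    """Recursive cofactor (Laplace) expansion along the first row.
    O(n!) work, so only sensible for tiny n."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j]
               * det_cofactor(np.delete(np.delete(A, 0, 0), j, 1))
               for j in range(n))

A = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.0]])

# numpy uses an O(n^3) LU factorization; both agree on this small case.
agree = np.isclose(det_cofactor(A), np.linalg.det(A))
```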

### Re: What could go possibly wrong ...

Posted: **October 9th, 2017, 1:46 pm**

by **Traden4Alpha**

The choice between det(A) and an eigendecomposition may come down to which crank-turning algorithms you want to use.

What's interesting to me is the potential use of determinants of submatrices of A to localize the erroneous rho values.

### Re: What could go possibly wrong ...

Posted: **October 9th, 2017, 2:55 pm**

by **outrun**

That's not going to work: rotating the eigenvectors doesn't affect the matrix's invalidity or its determinant, but it *does* affect any submatrix you want it to, so it's very easy to generate counterexamples to whatever you come up with!
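One small illustration of the difficulty (my example, not outrun's): an invalid 3x3 whose proper principal submatrices are all individually valid, so submatrix determinants alone cannot finger a single bad rho here:

```python
import numpy as np

# An invalid 3x3 "correlation" matrix...
R = np.array([[1.0,  0.9,  0.9],
              [0.9,  1.0, -0.9],
              [0.9, -0.9,  1.0]])
full_det = np.linalg.det(R)            # negative => not a valid correlation matrix

# ...yet every 2x2 principal submatrix is perfectly valid (det = 1 - rho^2 >= 0),
# so determinants of the proper submatrices cannot localize the trouble.
sub_dets = [np.linalg.det(np.delete(np.delete(R, k, 0), k, 1)) for k in range(3)]
all_subs_valid = min(sub_dets) >= 0.0
```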

### Re: What could go possibly wrong ...

Posted: **October 11th, 2017, 10:58 am**

by **drmwc1**

Here's a paper on getting the nearest correlation matrix in n dimensions:

http://citeseerx.ist.psu.edu/viewdoc/do ... 1&type=pdf
I think the alternating projection method is in the NAG library. (Project onto the closest symmetric matrix with 1s on the main diagonal; project onto the PSD cone; iterate.)
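A bare-bones sketch of the two alternating projections in that parenthetical. Note the hedge: Higham's published algorithm adds a Dykstra correction step to guarantee convergence to the *nearest* correlation matrix; this plain version only illustrates the idea and converges to *a* valid matrix:

```python
import numpy as np

def corr_alternating_projection(A, n_iter=200):
    """Plain alternating projections between the PSD cone and the set of
    symmetric matrices with unit diagonal (no Dykstra correction)."""
    Y = (A + A.T) / 2
    for _ in range(n_iter):
        # Projection onto the PSD cone: clip negative eigenvalues.
        w, V = np.linalg.eigh(Y)
        Y = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
        # Projection onto symmetric matrices with 1s on the main diagonal.
        Y = (Y + Y.T) / 2
        np.fill_diagonal(Y, 1.0)
    return Y

A_bad = np.array([[1.0,  0.9,  0.9],
                  [0.9,  1.0, -0.9],
                  [0.9, -0.9,  1.0]])
A_near = corr_alternating_projection(A_bad)
min_eig = np.linalg.eigvalsh(A_near).min()
unit_diag = np.allclose(np.diag(A_near), 1.0)
```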