nuclph
Topic Author
Posts: 0
Joined: July 7th, 2006, 12:26 pm

Matrix A and a noise matrix W

May 7th, 2009, 8:43 pm

Recently I encountered the following question: given a diagonalizable square matrix A and a noise matrix W, find conditions on A and W such that for each eigenvalue L of A there is an eigenvalue M of A + W with |L - M| < K, for a given K. I'm just wondering, does anyone have thoughts on how to approach this problem? Thanks.
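
For what it's worth, one standard result that looks relevant is the Bauer-Fike theorem: if A = V D V^{-1} is diagonalizable, then every eigenvalue M of A + W lies within cond(V) * ||W||_2 of some eigenvalue of A, so cond(V) * ||W||_2 < K would be a sufficient condition in that direction. Below is a minimal numpy sketch checking the bound numerically; the random matrices, noise scale, and seed are illustrative assumptions, not part of the question itself.

```python
import numpy as np

# Numerical check of the Bauer-Fike bound: for diagonalizable
# A = V @ diag(lams) @ inv(V), every eigenvalue mu of A + W satisfies
#     min_lam |lam - mu| <= cond(V) * ||W||_2.
# The specific matrices below are arbitrary examples, not from the question.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))          # generically diagonalizable
W = 1e-3 * rng.standard_normal((n, n))   # small "noise" matrix

lams, V = np.linalg.eig(A)               # eigenvalues/eigenvectors of A
mus = np.linalg.eigvals(A + W)           # eigenvalues of the perturbed matrix

bound = np.linalg.cond(V) * np.linalg.norm(W, 2)        # Bauer-Fike bound
worst = max(np.abs(lams - mu).min() for mu in mus)      # observed worst distance

print(f"Bauer-Fike bound : {bound:.3e}")
print(f"observed max dist: {worst:.3e}")
assert worst <= bound + 1e-12
```

One caveat: Bauer-Fike matches each eigenvalue of A + W to a nearby eigenvalue of A, whereas the question asks for the reverse matching (each L of A near some M of A + W), so any answer would also need to address that direction, e.g. via continuity of eigenvalues for small enough W.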