May 7th, 2009, 8:43 pm
Recently, I have encountered the following question: given a diagonalizable square matrix A and a noise matrix W, find conditions on A and W such that for each eigenvalue L of A there is an eigenvalue M of A + W with |L - M| < K, for a given K > 0. I was just wondering, does anyone have thoughts on how to approach this problem? Thanks.
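For concreteness, here is a minimal numerical sketch of the quantity in question (assuming Python with NumPy; the matrices and the value of K below are made up purely for illustration): for each eigenvalue L of A, it measures the distance to the nearest eigenvalue of A + W and checks whether every such distance is below K.

    import numpy as np

    # Hypothetical example data; A is generically diagonalizable, W is a small "noise" matrix.
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))
    W = 1e-3 * rng.standard_normal((n, n))

    eig_A = np.linalg.eigvals(A)
    eig_AW = np.linalg.eigvals(A + W)

    # For each eigenvalue L of A, distance to the nearest eigenvalue M of A + W.
    dists = np.array([np.min(np.abs(eig_AW - L)) for L in eig_A])

    K = 1e-2  # assumed tolerance, for illustration only
    print("worst-case distance:", dists.max())
    print("|L - M| < K holds for every L:", bool(np.all(dists < K)))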