eigendecomposition
Feb. 12th, 2007 12:07 am

Here's something I should have learned in Linear Algebra class, if it is true:
For every linear transformation T that does not collapse the space into fewer dimensions, there exists an *orthogonal* basis B such that T takes each basis vector to a multiple of itself, i.e. for all b_i in B, T(b_i) = lambda_i * b_i. In this basis, T is represented by a diagonal matrix, namely the one with the lambda_i on the diagonal. (I can't prove this.)
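For what it's worth, the symmetric case does work this way (I believe that's the spectral theorem), and it's easy to check numerically. A quick NumPy sketch, with a matrix I made up:

```python
# A quick check with NumPy; the matrix is just an example I made up.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # symmetric, full rank

lam, Q = np.linalg.eigh(A)          # eigenvalues and an orthonormal eigenbasis

print(np.allclose(Q.T @ Q, np.eye(2)))             # True: the basis is orthogonal
print(np.allclose(A @ Q[:, 0], lam[0] * Q[:, 0]))  # True: T(b_i) = lambda_i * b_i
print(np.allclose(A, Q @ np.diag(lam) @ Q.T))      # True: T is diagonal in basis B
```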
If this is right, there should be something called the "orthogonal eigenbasis algorithm", but I can find no such thing... (doesn't PCA do this?)
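My guess is that PCA is doing exactly this, but to the covariance matrix of the data, which is symmetric by construction, so the orthogonal eigenbasis is guaranteed there. A sketch, with made-up data:

```python
# PCA as the orthogonal eigenbasis of a covariance matrix; the random
# data here is purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))       # 200 samples, 3 features
X -= X.mean(axis=0)                 # center the data

C = np.cov(X, rowvar=False)         # covariance matrix: symmetric by construction
lam, Q = np.linalg.eigh(C)          # so eigh returns an orthonormal eigenbasis

components = Q[:, ::-1]             # principal axes, largest variance first
scores = X @ components             # the data expressed in that eigenbasis
print(np.allclose(components.T @ components, np.eye(3)))  # True: orthogonal
```

(Reversing the columns so the largest eigenvalue comes first is just the usual PCA convention.)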
(Also, a combination a*u + b*v of two independent eigenvectors, with a and b both nonzero, is itself an eigenvector iff the two eigenvalues are equal: T(a*u + b*v) = a*lambda_1*u + b*lambda_2*v, which is a multiple of a*u + b*v only when lambda_1 = lambda_2.)
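A tiny numerical check of that, using a diagonal matrix so the eigenvectors are obvious:

```python
# The diagonal matrix makes the eigenvectors and eigenvalues obvious.
import numpy as np

T = np.diag([2.0, 2.0, 5.0])        # eigenvalues 2, 2, 5 on the standard basis
e1, e2, e3 = np.eye(3)

w = e1 + e2                         # combine eigenvectors sharing lambda = 2
print(np.allclose(T @ w, 2 * w))    # True: still an eigenvector

u = e1 + e3                         # combine across lambda = 2 and lambda = 5
Tu = T @ u
print(np.allclose(Tu, Tu[0] * u))   # False: T(u) is not a multiple of u
```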
I trust that my geeky readership will correct me if this is wrong.
I should read about characteristic polynomials.
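If I remember right, the eigenvalues are exactly the roots of det(lambda*I - T), and NumPy's np.poly will even compute the coefficients from a square matrix:

```python
# Eigenvalues as roots of the characteristic polynomial det(lambda*I - A).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

coeffs = np.poly(A)                 # characteristic polynomial coefficients
print(np.roots(coeffs))             # its roots...
print(np.linalg.eigvals(A))         # ...match the eigenvalues, up to ordering
```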