[personal profile] gusl
Here's something I should have learned in Linear Algebra class, if it is true:

For every linear transformation T that doesn't project into fewer dimensions, there exists an *orthogonal* basis B such that T takes each basis vector to a multiple of itself, i.e. for all b_i in B, T(b_i) = lambda_i b_i. In this basis, T is represented by a diagonal matrix, namely the one with the lambda_i on its diagonal. (I can't prove this.)
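For what it's worth, this is exactly the spectral theorem for the special case of symmetric matrices: any real symmetric A has an orthonormal eigenbasis. It fails for general invertible maps, though (a shear like [[1,1],[0,1]] is invertible but has only one eigendirection). A quick numerical sketch with numpy, using an arbitrary symmetric matrix as the example:

```python
import numpy as np

# An arbitrary symmetric matrix (any A with A == A.T would do here).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is numpy's routine for symmetric/Hermitian matrices:
# it returns eigenvalues and an *orthonormal* eigenbasis.
lam, Q = np.linalg.eigh(A)

# The eigenbasis really is orthogonal: Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(3))

# In this basis, A is the diagonal matrix of the lambda_i:
# Q^T A Q = diag(lambda_1, ..., lambda_n).
assert np.allclose(Q.T @ A @ Q, np.diag(lam))
```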

If this is right, there should be something called the "orthogonal eigenbasis algorithm", but I can find no such thing... (doesn't PCA do this?)
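Re PCA: it does do this, but for a special case — the covariance matrix of the data, which is symmetric by construction, so an orthogonal eigenbasis (the principal axes) always exists for it. A small numpy sketch with toy random data, just to illustrate:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))      # toy data: 200 samples, 3 features

C = np.cov(X, rowvar=False)        # covariance matrix; symmetric by construction
assert np.allclose(C, C.T)

# PCA's principal axes are an orthonormal eigenbasis of C.
lam, Q = np.linalg.eigh(C)
assert np.allclose(Q.T @ Q, np.eye(3))
```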

(Also, a linear combination of two eigenvectors, with both coefficients nonzero, is itself an eigenvector iff the two eigenvalues are equal)

I trust that my geeky readership will correct me if this is wrong.

I should read about characteristic polynomials.
