Eigenvalues


Eigenvalues are everywhere: in machine learning, structural mechanics, dynamics, and throughout linear algebra. So what exactly are these strange numbers?

Let’s take an N by N matrix and call it A. We’ll also define a length-N vector v. Usually, multiplying a vector by a matrix (e.g. a rotation matrix) will change the direction of the original vector. However, if the product of matrix A and vector v still points in the same direction as v (but scaled by a factor λ), then we say that v is an eigenvector of A, and that λ is the eigenvalue associated with that eigenvector. Mathematically:

\[Av = \lambda v\]
\[\left(A - \lambda I\right)v = 0\]
\[\det\left(A - \lambda I\right) = 0\]

For a nonzero eigenvector v to exist, the matrix A − λI must be singular, which is exactly the determinant condition above. Solving this characteristic equation, we will find up to N distinct eigenvalues, each with an associated eigenvector (so up to N different eigenvectors).
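As a quick worked example (my addition, not from the original notes), take a simple 2 by 2 symmetric matrix and follow the recipe: solve the characteristic equation for λ, then substitute each λ back in to find its eigenvector.

\[A = \begin{pmatrix}2 & 1\\ 1 & 2\end{pmatrix},\qquad \det\left(A - \lambda I\right) = \left(2 - \lambda\right)^{2} - 1 = 0 \;\Rightarrow\; \lambda_{1} = 1,\ \lambda_{2} = 3\]
\[\left(A - I\right)v = 0 \;\Rightarrow\; v_{1} = \begin{pmatrix}1\\ -1\end{pmatrix},\qquad \left(A - 3I\right)v = 0 \;\Rightarrow\; v_{2} = \begin{pmatrix}1\\ 1\end{pmatrix}\]

A quick check confirms that \(Av_{1} = v_{1}\) and \(Av_{2} = 3v_{2}\), so each eigenvector keeps its direction and is only scaled by its eigenvalue.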

This direction-preserving property underlies their significance: a matrix’s eigenvalues and eigenvectors carry information about its effective dimensional size (for a symmetric matrix, the rank equals the number of nonzero eigenvalues) and about the directions along which the matrix acts by pure scaling.
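Here is a minimal NumPy sketch (not part of the original notes) that checks the Av = λv relationship numerically for the example matrix above and, since the matrix is symmetric, compares its rank with the count of nonzero eigenvalues:

```python
import numpy as np

# Symmetric example matrix from the worked example above.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# vals[i] is the eigenvalue paired with the eigenvector in column vecs[:, i].
vals, vecs = np.linalg.eig(A)

for lam, v in zip(vals, vecs.T):
    # Each eigenpair should satisfy A v = lambda v (up to floating-point error).
    assert np.allclose(A @ v, lam * v)

print("eigenvalues:", vals)

# For a symmetric matrix, the rank equals the number of nonzero eigenvalues.
print("rank:", np.linalg.matrix_rank(A))
print("nonzero eigenvalues:", np.count_nonzero(~np.isclose(vals, 0.0)))
```

Running this prints eigenvalues of 3 and 1 (in some order), and both the rank and the nonzero-eigenvalue count come out to 2.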
