34. Eigenspaces

In all the examples we've seen so far, the eigenvectors have all had a free variable in them. For example, in the last video, we found the eigenvectors of the matrix $A = \begin{pmatrix} 3/2 & 5/2 & 3 \\ -1/2 & -3/2 & -3 \\ 1 & 1 & 2 \end{pmatrix}$ to be: $\begin{pmatrix} x \\ -x \\ x \end{pmatrix}$ with eigenvalue $2$, $\begin{pmatrix} x \\ -5x \\ 4x \end{pmatrix}$ with eigenvalue $1$, and $\begin{pmatrix} x \\ -x \\ 0 \end{pmatrix}$ with eigenvalue $-1$.

For the matrix $\begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$ we found that the eigenvalues $\frac{3 \pm \sqrt{5}}{2}$ had eigenvectors $\begin{pmatrix} x \\ \frac{-1 \pm \sqrt{5}}{2} x \end{pmatrix}$. All of these have the free parameter $x$.
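If you want to check these numerically, here is a minimal NumPy sketch (assuming the matrix entries as written above, and taking $x = 1$ in each eigenvector):

```python
import numpy as np

# The 3x3 matrix A from the last video.
A = np.array([[ 1.5,  2.5,  3.0],
              [-0.5, -1.5, -3.0],
              [ 1.0,  1.0,  2.0]])

# Taking x = 1 in the eigenvector (x, -x, x): A v should equal 2 v.
v = np.array([1.0, -1.0, 1.0])
print(A @ v)                        # [ 2. -2.  2.], i.e. 2 * v

# The 2x2 matrix, with eigenvalue (3 + sqrt 5)/2 and x = 1.
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
lam = (3 + np.sqrt(5)) / 2
w = np.array([1.0, (-1 + np.sqrt(5)) / 2])
print(np.allclose(B @ w, lam * w))  # True (up to rounding)
```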

This is a general fact:

Lemma:

If $v$ is an eigenvector of $A$ with eigenvalue $\lambda$ ("a $\lambda$-eigenvector") then so is $kv$ for any nonzero $k \in \mathbf{C}$.

Proof: $A(kv) = kAv = k\lambda v = \lambda(kv)$.
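Here is a quick numerical illustration of the lemma (the scalar $k = 7.3$ is an arbitrary choice; any nonzero complex number works):

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
lam = (3 + np.sqrt(5)) / 2
v = np.array([1.0, (-1 + np.sqrt(5)) / 2])       # a lam-eigenvector (x = 1)

k = 7.3                                          # any nonzero scalar
print(np.allclose(B @ (k * v), lam * (k * v)))   # True: k*v is again a lam-eigenvector
```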

So for example, the vectors $\begin{pmatrix} x \\ -x \\ x \end{pmatrix}$ are all just rescalings of $\begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}$. Indeed, people often say things like "the eigenvector is $\begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}$", when they mean "the eigenvectors are all the rescalings of $\begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}$". If you write this kind of thing in your answers, that's fine.

Example:

Suppose we have the matrix $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$. The characteristic polynomial is $\det(I - \lambda I) = \det\begin{pmatrix} 1 - \lambda & 0 \\ 0 & 1 - \lambda \end{pmatrix} = (1 - \lambda)^2$, so $\lambda = 1$ is the only eigenvalue. Any vector $v$ satisfies $Iv = v$, so any vector $\begin{pmatrix} x \\ y \end{pmatrix}$ is a $1$-eigenvector. This has two free parameters, so we get an eigenplane, not just an eigenline: there is a whole plane of eigenvectors for the same eigenvalue.
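Computationally, the $1$-eigenspace is the null space of $I - 1 \cdot I$ (the zero matrix), which has dimension $2$ by rank-nullity. A minimal sketch:

```python
import numpy as np

I = np.eye(2)

# The 1-eigenspace is the null space of I - 1*I, i.e. of the zero matrix.
M = I - 1.0 * np.eye(2)
print(2 - np.linalg.matrix_rank(M))   # 2: a whole plane of 1-eigenvectors

# np.linalg.eig agrees: the eigenvalue 1 appears twice,
# with two independent eigenvectors.
evals, evecs = np.linalg.eig(I)
print(evals)                          # [1. 1.]
```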

Theorem:

The set of eigenvectors with eigenvalue $\lambda$, together with the zero vector, forms a (complex) subspace of $\mathbf{C}^n$ (i.e. it is closed under complex rescalings and under addition).

Proof: Let $V_\lambda$ be the set of $\lambda$-eigenvectors of $A$, together with the zero vector (note $A0 = \lambda 0$, so including $0$ does no harm). If $v \in V_\lambda$ then $kv \in V_\lambda$ (as we saw above). If $v_1, v_2 \in V_\lambda$ then $A(v_1 + v_2) = Av_1 + Av_2 = \lambda v_1 + \lambda v_2 = \lambda(v_1 + v_2)$, so $v_1 + v_2$ is also a $\lambda$-eigenvector.

We call this subspace the $\lambda$-eigenspace. In all the examples we saw earlier (except $I$), the eigenspaces were 1-dimensional eigenlines (one free variable). So a matrix gives you a collection of preferred directions or subspaces (its eigenspaces), which tell you something about the matrix (e.g. if it's a rotation matrix, its axis will be one of these subspaces).
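In practice you can find the dimension of $V_\lambda$ as the nullity of $A - \lambda I$, and check closure under addition directly. A sketch, using the $3 \times 3$ matrix $A$ from the start of this section:

```python
import numpy as np

A = np.array([[ 1.5,  2.5,  3.0],
              [-0.5, -1.5, -3.0],
              [ 1.0,  1.0,  2.0]])

# V_2 is the null space of A - 2I; by rank-nullity, dim V_2 = 3 - rank.
M = A - 2 * np.eye(3)
print(3 - np.linalg.matrix_rank(M))               # 1: the 2-eigenspace is an eigenline

# Closure under addition: two 2-eigenvectors sum to another 2-eigenvector.
v1 = np.array([1.0, -1.0, 1.0])
v2 = np.array([4.0, -4.0, 4.0])
print(np.allclose(A @ (v1 + v2), 2 * (v1 + v2)))  # True
```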

Example:

For the example $A = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$ we found the eigenvalues $\frac{3 \pm \sqrt{5}}{2}$ and eigenvectors $\begin{pmatrix} x \\ \frac{-1 \pm \sqrt{5}}{2} x \end{pmatrix}$. We now draw these two eigenlines (in red).

The eigenlines of the Arnold cat map
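If you'd like to reproduce the picture, here is a minimal matplotlib sketch (the axis range is an arbitrary choice): the eigenlines are $y = \frac{-1 \pm \sqrt{5}}{2} x$.

```python
import numpy as np
import matplotlib.pyplot as plt

# Slopes of the two eigenlines y = ((-1 +/- sqrt 5)/2) x.
s_plus = (-1 + np.sqrt(5)) / 2
s_minus = (-1 - np.sqrt(5)) / 2

x = np.linspace(-3, 3, 2)
plt.plot(x, s_plus * x, 'r')      # eigenline for eigenvalue (3 + sqrt 5)/2
plt.plot(x, s_minus * x, 'r')     # eigenline for eigenvalue (3 - sqrt 5)/2
plt.gca().set_aspect('equal')     # so right angles look like right angles
plt.show()
```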

Note that these eigenlines look orthogonal; indeed, you can check that they are! You do this by taking the dot product of the two eigenvectors (it's zero). This holds more generally: for a symmetric matrix (i.e. a matrix $A$ such that $A = A^T$), eigenvectors with distinct eigenvalues are always orthogonal.
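Both claims are easy to check numerically: the specific dot product directly, and the general fact via np.linalg.eigh, which assumes its input is symmetric and returns an orthonormal basis of eigenvectors:

```python
import numpy as np

# Dot product of the two eigenvectors (taking x = 1 in each): it vanishes.
v_plus = np.array([1.0, (-1 + np.sqrt(5)) / 2])
v_minus = np.array([1.0, (-1 - np.sqrt(5)) / 2])
print(np.isclose(v_plus @ v_minus, 0.0))        # True: the eigenlines are orthogonal

# For a symmetric matrix, eigh returns orthonormal eigenvectors (as columns).
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
evals, evecs = np.linalg.eigh(B)
print(np.allclose(evecs.T @ evecs, np.eye(2)))  # True: pairwise orthogonal unit vectors
```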