34. Eigenspaces

In all the examples we've seen so far, the eigenvectors have all had a free variable in them. For example, in the last video, we found the eigenvectors for the matrix $A=\begin{pmatrix}\frac{3}{2}&\frac{5}{2}&3\\-\frac{1}{2}&-\frac{3}{2}&-3\\1&1&2\end{pmatrix}$ to be:

$$\lambda=2\colon\ \begin{pmatrix}x\\-x\\x\end{pmatrix},\qquad\lambda=1\colon\ \begin{pmatrix}x\\-5x\\4x\end{pmatrix},\qquad\lambda=-1\colon\ \begin{pmatrix}x\\-x\\0\end{pmatrix}.$$

For the matrix $\begin{pmatrix}2&1\\1&1\end{pmatrix}$ we found that the eigenvalues $\frac{3\pm\sqrt{5}}{2}$ have eigenvectors $\begin{pmatrix}x\\\frac{-1\pm\sqrt{5}}{2}x\end{pmatrix}$. All of these have the free parameter $x$.
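
If you like, you can sanity-check these numerically. Here is a minimal numpy sketch (my own code, not part of the lecture); note that `np.linalg.eig` normalises each eigenvector to unit length, i.e. it picks one particular rescaling for you, which is why no free parameter $x$ appears in its output:

```python
import numpy as np

# The 3x3 matrix from the last video.
A = np.array([[ 1.5,  2.5,  3.0],
              [-0.5, -1.5, -3.0],
              [ 1.0,  1.0,  2.0]])
print(np.linalg.eigvals(A))           # approximately 2, 1, -1 (in some order)

# The 2x2 matrix.
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
evals, evecs = np.linalg.eig(B)
print(evals)                          # approximately (3 ± sqrt(5))/2
print((3 + np.sqrt(5)) / 2, (3 - np.sqrt(5)) / 2)

# Column i of evecs is an eigenvector for evals[i]; numpy returns the
# unit-length rescaling, so no free parameter appears.
for lam, v in zip(evals, evecs.T):
    print(np.allclose(B @ v, lam * v))    # True, True
```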

This is a general fact:

Lemma:

If $v$ is an eigenvector of $A$ with eigenvalue $\lambda$ ("a $\lambda$-eigenvector") then so is $kv$ for any $k\in\mathbf{C}$.

Proof:

$A(kv)=kAv=k\lambda v=\lambda(kv)$.

So for example, the vectors $\begin{pmatrix}x\\-x\\x\end{pmatrix}$ are all just rescalings of $\begin{pmatrix}1\\-1\\1\end{pmatrix}$. Indeed, people often say things like "the eigenvector is $\begin{pmatrix}1\\-1\\1\end{pmatrix}$", when they mean "the eigenvectors are all the rescalings of $\begin{pmatrix}1\\-1\\1\end{pmatrix}$". If you write this kind of thing in your answers, that's fine.
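
The lemma is easy to test numerically too. Here is a short sketch (again my own numpy code, with rescaling factors of my choosing, including a complex one) checking that rescalings of $(1,-1,1)$ are all $2$-eigenvectors of $A$:

```python
import numpy as np

A = np.array([[ 1.5,  2.5,  3.0],
              [-0.5, -1.5, -3.0],
              [ 1.0,  1.0,  2.0]])
v = np.array([1, -1, 1])                          # a 2-eigenvector of A
print(np.allclose(A @ v, 2 * v))                  # True: Av = 2v

for k in [3, -0.5, 2 + 1j]:                       # arbitrary rescalings
    print(np.allclose(A @ (k * v), 2 * (k * v)))  # True each time
```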

Example:

Suppose we have the matrix $I=\begin{pmatrix}1&0\\0&1\end{pmatrix}$. The characteristic polynomial is $\det(I-\lambda I)=\det\begin{pmatrix}1-\lambda&0\\0&1-\lambda\end{pmatrix}=(1-\lambda)^2$, so $\lambda=1$ is the only eigenvalue. Any vector $v$ satisfies $Iv=v$, so any vector $\begin{pmatrix}x\\y\end{pmatrix}$ is a $1$-eigenvector. This has two free parameters, so it is an eigenplane, not just an eigenline: there is a whole plane of eigenvectors for the same eigenvalue.
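
In numpy this looks as follows (a small sketch; `np.linalg.eig` reports the eigenvalue $1$ twice, with two linearly independent eigenvectors spanning the eigenplane):

```python
import numpy as np

I = np.eye(2)
evals, evecs = np.linalg.eig(I)
print(evals)    # [1. 1.]: the eigenvalue 1 appears twice
print(evecs)    # two linearly independent 1-eigenvectors spanning the plane
```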

Theorem:

The set of eigenvectors with eigenvalue $\lambda$, together with the zero vector, forms a (complex) subspace of $\mathbf{C}^n$ (i.e. it is closed under complex rescalings and under addition).

Proof:

Let $V_\lambda$ be the set of $\lambda$-eigenvectors of $A$ (together with $0$). If $v\in V_\lambda$ then $kv\in V_\lambda$ (as we saw above). If $v_1,v_2\in V_\lambda$ then $A(v_1+v_2)=Av_1+Av_2=\lambda v_1+\lambda v_2=\lambda(v_1+v_2)$, so $v_1+v_2$ is also a $\lambda$-eigenvector.
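
You can watch the closure-under-addition step happen numerically. The matrix below is my own illustrative choice (not from the lecture): its $2$-eigenspace is a plane spanned by $(1,1,0)$ and $(0,0,1)$, and the sum of these two eigenvectors is again a $2$-eigenvector:

```python
import numpy as np

# An illustrative matrix whose 2-eigenspace is 2-dimensional.
A = np.array([[ 3, -1,  0],
              [-1,  3,  0],
              [ 0,  0,  2]])
v1 = np.array([1, 1, 0])
v2 = np.array([0, 0, 1])
print(np.allclose(A @ v1, 2 * v1))                # True: v1 is a 2-eigenvector
print(np.allclose(A @ v2, 2 * v2))                # True: so is v2
print(np.allclose(A @ (v1 + v2), 2 * (v1 + v2)))  # True: so is their sum
```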

We call this subspace the $\lambda$-eigenspace. In all the examples we saw earlier (except $I$), the eigenspaces were 1-dimensional eigenlines (one free variable). So a matrix gives you a collection of preferred directions or subspaces (its eigenspaces), which tell you something about the matrix (e.g. if it's a 3-dimensional rotation matrix, its axis of rotation is one of these subspaces).
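
Computationally, the $\lambda$-eigenspace is the null space (kernel) of $A-\lambda I$, and its dimension is the number of free variables. Here is a sketch, assuming scipy is available (`scipy.linalg.null_space` returns an orthonormal basis of the kernel):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[ 1.5,  2.5,  3.0],
              [-0.5, -1.5, -3.0],
              [ 1.0,  1.0,  2.0]])
E2 = null_space(A - 2 * np.eye(3))
print(E2.shape[1])   # 1: the 2-eigenspace of A is an eigenline
print(E2[:, 0])      # proportional to (1, -1, 1), up to sign and length

E1 = null_space(np.eye(2) - 1 * np.eye(2))
print(E1.shape[1])   # 2: the 1-eigenspace of the identity is an eigenplane
```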

Example:

For the example $A=\begin{pmatrix}2&1\\1&1\end{pmatrix}$ we found the eigenvalues $\frac{3\pm\sqrt{5}}{2}$ and eigenvectors $\begin{pmatrix}x\\\frac{-1\pm\sqrt{5}}{2}x\end{pmatrix}$. We now draw these two eigenlines (in red).

(Figure: the eigenlines of the Arnold cat map.)

Note that these eigenlines look orthogonal; indeed, you can check that they are: the dot product of the two eigenvectors is zero. This is true more generally for symmetric matrices (i.e. matrices $A$ such that $A=A^T$): eigenvectors of a symmetric matrix with distinct eigenvalues are always orthogonal.
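
Here is a final numpy sketch checking both the dot product above and the general statement (`np.linalg.eigh`, numpy's routine for symmetric/Hermitian matrices, always returns an orthonormal set of eigenvectors):

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
v_plus  = np.array([1, (-1 + np.sqrt(5)) / 2])   # eigenvector for (3+sqrt 5)/2
v_minus = np.array([1, (-1 - np.sqrt(5)) / 2])   # eigenvector for (3-sqrt 5)/2
print(np.isclose(v_plus @ v_minus, 0))           # True: they are orthogonal

# For a symmetric matrix, eigh returns orthonormal eigenvectors.
evals, evecs = np.linalg.eigh(B)
print(np.allclose(evecs.T @ evecs, np.eye(2)))   # True
```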