32. Finding eigenvalues

Characteristic polynomial

In the last video, we introduced the equation A v = lambda v. For each lambda, this gives us an equation for v. The question we will now answer is: for which lambda in C does this equation have a nonzero solution v?


A v = lambda v has a nonzero solution if and only if lambda is a root of the characteristic polynomial of A, det of A minus t I; in other words, if and only if det of A minus lambda I equals zero.

Here, t is just a dummy variable we've introduced (not one of the components of v or anything like that). The characteristic polynomial is a polynomial in t of degree n. We'll do some examples, then prove the theorem.
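Before the examples, here is a quick numerical sketch of the characteristic polynomial, assuming NumPy is available. NumPy's np.poly returns the (monic) characteristic polynomial built from the eigenvalues, which for the 2-by-2 matrix in the first example below has the same coefficients as det of A minus t I:

```python
import numpy as np

# The matrix from the first worked example.
A = np.array([[2.0, -1.0],
              [1.0,  0.0]])

# Coefficients of the monic characteristic polynomial, highest
# degree first: t^2 - 2t + 1 gives [1, -2, 1].
coeffs = np.poly(A)
print(coeffs)
```

(np.poly works from the numerically computed eigenvalues, so the coefficients may carry rounding error; here they come out as 1, minus 2, 1, matching the hand computation below.)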



Suppose A = 2, minus 1; 1, 0. The characteristic polynomial is: det of A minus t I equals det of (2, minus 1; 1, 0, minus t, 0; 0, t), which equals det of 2 minus t, minus 1; 1, minus t, which equals minus t times (2 minus t) plus 1, i.e. t squared minus 2 t plus 1. By the quadratic formula, the roots are a half of (2 plus or minus the square root of 4 minus 4), i.e. 1: this is a double root (the discriminant of the quadratic is zero). This means that the only eigenvalue of this matrix is 1: had we picked any other value for lambda, we would not have been able to solve A v = lambda v.
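We can sanity-check the double root numerically. A minimal sketch with NumPy: np.roots finds the roots of a polynomial from its coefficients (highest degree first), and np.linalg.eigvals computes the eigenvalues of A directly; a repeated root may only come out correct to limited precision.

```python
import numpy as np

# Roots of t^2 - 2t + 1, coefficients listed highest degree first.
roots = np.roots([1.0, -2.0, 1.0])
print(roots)  # both roots are (numerically) 1: a double root

# The eigenvalues of A agree with the roots of its
# characteristic polynomial.
A = np.array([[2.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(A))
```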

You will come to love the formula for solving quadratics; it lets you find the eigenvalues of any 2-by-2 matrix. By contrast, a 3-by-3 matrix has a cubic characteristic polynomial. Whilst there is a formula for solving cubics, it's not nice. For 4-by-4 matrices, it gets worse still. For 5-by-5 and bigger matrices, the characteristic polynomial is a quintic or higher-degree polynomial, and there is (provably) no general formula for the roots of a general quintic in terms of taking kth roots and so on.


Let A = 0, minus 1; 1, 0. Then det of A minus t I equals det of minus t, minus 1; 1, minus t, which equals t squared plus 1. The roots (eigenvalues) are plus or minus i. This is why, even though our matrix is real, we may need to deal with complex numbers when we start working with eigenvalues and eigenvectors.
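Numerical routines return these complex eigenvalues directly, even though the input matrix is real. A quick sketch, assuming NumPy's standard np.linalg.eigvals routine:

```python
import numpy as np

# The real matrix from this example.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals = np.linalg.eigvals(A)
print(vals)  # i and minus i, in some order
```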

Let's figure out the eigenvectors. For lambda = i, we need to solve A v = i v: 0, minus 1; 1, 0 times x, y equals i x, i y. Multiplying this out gives: minus y, x equals i x, i y, which implies y = minus i x and x = i y. The second equation follows from the first if you multiply by i. The eigenvectors for lambda = i are those of the form x, minus i x.

For lambda = minus i, we need to solve A v = minus i v, which gives y = i x, and the eigenvectors are those of the form x, i x.
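Both eigenpairs can be verified directly: take x = 1 in each family, multiply A into the resulting vector, and check that the answer is lambda times the vector. A sketch:

```python
import numpy as np

A = np.array([[0, -1],
              [1,  0]], dtype=complex)

# Eigenvector for lambda = i: (x, minus i x) with x = 1.
v_plus = np.array([1, -1j])
assert np.allclose(A @ v_plus, 1j * v_plus)

# Eigenvector for lambda = minus i: (x, i x) with x = 1.
v_minus = np.array([1, 1j])
assert np.allclose(A @ v_minus, -1j * v_minus)
```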

Proof of theorem

If there exists a nonzero solution v to A v = lambda v, then (A minus lambda I) v equals 0. This implies that A minus lambda I is not invertible: otherwise v = the inverse of (A minus lambda I) times 0, which equals 0, contradicting the fact that v is nonzero. Therefore det of A minus lambda I equals 0, so lambda is a root of det of A minus t I.

In fact, these are all "if and only if" statements. The only nonobvious step is to see that if A minus lambda I is not invertible then there exists a nonzero v such that (A minus lambda I) v = 0 (you might like to think about why that's true).
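The whole chain in the proof can be seen concretely on the first example, A = 2, minus 1; 1, 0 with eigenvalue lambda = 1: det of A minus I is zero, so A minus I is not invertible, and (A minus I) v = 0 has the nonzero solution v = (1, 1). A sketch with NumPy:

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [1.0,  0.0]])
lam = 1.0
M = A - lam * np.eye(2)   # A - lambda I = [[1, -1], [1, -1]]

print(np.linalg.det(M))   # 0 (up to rounding): M is not invertible

v = np.array([1.0, 1.0])  # a nonzero vector with M v = 0
assert np.allclose(M @ v, 0)
assert np.allclose(A @ v, lam * v)  # so v is an eigenvector for lambda = 1
```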