# Representations of U(1), part 1

We now state the classification theorem for representations of U(1) and illustrate it with an example. We will prove the theorem next time.

If $R \colon U(1) \to GL_n(\mathbb{C})$ is a smooth representation, then there exists a basis of $\mathbb{C}^n$ with respect to which
$$R(e^{i\theta}) = \begin{pmatrix} e^{im_1\theta} & & \\ & \ddots & \\ & & e^{im_n\theta} \end{pmatrix},$$
where $m_1, \dots, m_n$ are integers called the *weights* of the representation. A fancier way of saying this is that $\mathbb{C}^n = \bigoplus_{i=1}^n V_i$, where each $V_i$ is a 1-dimensional subrepresentation, and $R = R_1 \oplus \dots \oplus R_n$ with $R_i = R|_{V_i}$.

This means that the basis with respect to which $R$ has this form is a basis of *eigenvectors* $v_1, \dots, v_n$. Moreover, $v_k$ is *simultaneously* an eigenvector of all the matrices $R(e^{i\theta})$, with eigenvalue $e^{im_k\theta}$.
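As a quick sanity check of the diagonal form, we can build such a representation numerically and verify that it really is a homomorphism and that integer weights make it well defined on U(1). The particular weights $(2, -1, 0)$ below are an arbitrary illustration, not taken from the text:

```python
import numpy as np

def R(theta, weights=(2, -1, 0)):
    """R(e^{i theta}) = diag(e^{i m_1 theta}, ..., e^{i m_n theta}).
    The weights (2, -1, 0) are an arbitrary choice for illustration."""
    return np.diag(np.exp(1j * np.array(weights) * theta))

a, b = 0.7, 1.3
# Homomorphism property: R(e^{i a}) R(e^{i b}) = R(e^{i(a+b)}).
homomorphism_ok = np.allclose(R(a) @ R(b), R(a + b))
# Integer weights make R well defined on U(1): R(e^{i(a + 2*pi)}) = R(e^{i a}).
periodic_ok = np.allclose(R(a + 2 * np.pi), R(a))
```

The periodicity check is where integrality of the weights matters: a non-integer exponent would give $R(e^{i(\theta + 2\pi)}) \neq R(e^{i\theta})$ even though $e^{i(\theta+2\pi)} = e^{i\theta}$.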

Take $R(e^{i\theta}) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$. The characteristic polynomial of this matrix is
$$\det\begin{pmatrix} \cos\theta - \lambda & -\sin\theta \\ \sin\theta & \cos\theta - \lambda \end{pmatrix} = \lambda^2 - 2\lambda\cos\theta + 1,$$
so the eigenvalues are
$$\lambda = \frac{2\cos\theta \pm \sqrt{4\cos^2\theta - 4}}{2} = \cos\theta \pm i\sin\theta = e^{\pm i\theta}.$$
Therefore the weights of this representation are $\pm 1$.
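This eigenvalue computation is easy to confirm numerically; here is a sketch for the arbitrary value $\theta = 0.9$:

```python
import numpy as np

theta = 0.9  # arbitrary test angle, not from the text
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Eigenvalues of the rotation matrix should be e^{+i theta} and e^{-i theta}.
eigvals = sorted(np.linalg.eigvals(R), key=lambda z: z.imag)
expected = sorted([np.exp(1j * theta), np.exp(-1j * theta)], key=lambda z: z.imag)
# Sort by imaginary part, since eigenvalue order is not guaranteed.
eigenvalues_ok = np.allclose(eigvals, expected)
```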

The eigenvectors are $(i, 1)$ and $(-i, 1)$. These are therefore our vectors $v_1 \in V_1$ and $v_2 \in V_2$. With respect to this basis of eigenvectors, $R(e^{i\theta}) = \begin{pmatrix} e^{i\theta} & 0 \\ 0 & e^{-i\theta} \end{pmatrix}$.
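We can also verify the change of basis directly: taking $P$ to have columns $v_1 = (i, 1)$ and $v_2 = (-i, 1)$, conjugating the rotation matrix by $P$ should produce the diagonal form (again with the arbitrary test angle $\theta = 0.9$):

```python
import numpy as np

theta = 0.9  # arbitrary test angle, not from the text
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns of P are the eigenvectors v_1 = (i, 1) and v_2 = (-i, 1).
P = np.array([[1j, -1j],
              [1,   1]])

# In the eigenvector basis, R becomes diag(e^{i theta}, e^{-i theta}).
D = np.linalg.inv(P) @ R @ P
diagonal_ok = np.allclose(D, np.diag([np.exp(1j * theta), np.exp(-1j * theta)]))
```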

## Pre-class exercise

Check that $(\pm i, 1)$ is an eigenvector of $\begin{pmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta\end{pmatrix}$ with eigenvalue $e^{\pm i\theta}$.