Complete reducibility

Decomposition

One of the key techniques for studying representations is to break them down into smaller subrepresentations.

Definition:

A decomposition of a representation $R \colon G \to GL(n, \mathbb{C})$ is a splitting $\mathbb{C}^n = V_1 \oplus V_2 \oplus \cdots \oplus V_k$ where each $V_i \subseteq \mathbb{C}^n$ is a subrepresentation of $\mathbb{C}^n$, that is, $R(g) v \in V_i$ whenever $v \in V_i$. In this case, each matrix $R(g)$ is block-diagonal (if we write it with respect to a basis of vectors from $V_1, \ldots, V_k$, ordered so that basis vectors from $V_i$ come before basis vectors from $V_j$ if $i < j$): it has the blocks $R(g)|_{V_i}$ down the diagonal and zeros elsewhere. We will write this as $R = R|_{V_1} \oplus R|_{V_2} \oplus \cdots \oplus R|_{V_k}$.
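
For example (a made-up shape, purely to illustrate the definition): if $n = 3$ and $\mathbb{C}^3 = V_1 \oplus V_2$ with $\dim V_1 = 2$ and $\dim V_2 = 1$, then in a basis adapted to the decomposition every matrix $R(g)$ has the form
\[ R(g) = \left( \begin{array}{cc|c} \ast & \ast & 0 \\ \ast & \ast & 0 \\ \hline 0 & 0 & \ast \end{array} \right), \]
where the upper-left $2 \times 2$ block is $R(g)|_{V_1}$ and the lower-right $1 \times 1$ entry is $R(g)|_{V_2}$.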

In such a decomposition, we would like the pieces $V_i$ to be as ``small'' as possible, because then our matrices will be concentrated very close to the diagonal and most entries will be zero. These ``smallest pieces'' are called irreducible representations:

Definition:

A subrepresentation $V \subseteq \mathbb{C}^n$ is called irreducible if it has no proper subrepresentations, that is, any subrepresentation is either $V$ or the zero subspace $\{ 0 \}$ consisting of just the origin.

Orthogonal complements: idea

So we would like to decompose our representations as direct sums of irreducible subrepresentations; when this is possible, we say our representation is completely reducible. That's not always possible (we'll see an example in one of the exercises). However, we will focus on groups for which it is always possible, namely the compact groups (matrix groups whose matrix entries are bounded). As a first step, we prove:

Lemma:

If $\mathbb{C}^n$ admits an invariant Hermitian inner product for the representation $R \colon G \to GL(n, \mathbb{C})$, then $R$ can be decomposed into irreducible summands.

I'll give you the idea of the proof before defining what an invariant Hermitian inner product is; suffice it to say that it's something like a dot product.

If $\mathbb{C}^n$ is not irreducible then it contains a proper subrepresentation $U$. The orthogonal complement $U^\perp$ of $U$ with respect to the Hermitian inner product will also be a subrepresentation, and $\mathbb{C}^n = U \oplus U^\perp$. So if $\mathbb{C}^n$ is not itself irreducible then it can be decomposed as a direct sum of subrepresentations. Applying the same reasoning to the summands, if either is not irreducible, we can decompose further; and so on and so on.
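
For a very simple illustration of the splitting $\mathbb{C}^n = U \oplus U^\perp$ (using the standard Hermitian inner product and ignoring the group action for a moment): if $U = \mathrm{span}(e_1) \subset \mathbb{C}^2$ then
\[ U^\perp = \{ (0, z) : z \in \mathbb{C} \} = \mathrm{span}(e_2), \qquad \mathbb{C}^2 = \mathrm{span}(e_1) \oplus \mathrm{span}(e_2). \]
The point of the argument is that when the inner product is invariant and $U$ is a subrepresentation, $U^\perp$ is automatically a subrepresentation as well.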

Eventually this process terminates because the dimension of the summands decreases each time you decompose. Either you hit an irreducible summand, or you keep going all the way down and find a 1-dimensional summand, but 1-dimensional representations are automatically irreducible: they have no proper subspaces, let alone proper subrepresentations.

Remark:

I'm not saying you always break up into 1-dimensional pieces, but that these provide a "safety-blanket": if you get all the way down to 1-d then you're guaranteed to be irreducible.

To complete the proof, it remains to:

  • define the term "invariant Hermitian inner product",

  • define the orthogonal complement of a subspace with respect to an invariant Hermitian inner product,

  • prove that the orthogonal complement of a subrepresentation is a subrepresentation.

Hermitian inner products

Definition:

A Hermitian inner product is a map $\langle \cdot, \cdot \rangle \colon \mathbb{C}^n \times \mathbb{C}^n \to \mathbb{C}$ (i.e. it eats two complex vectors $v$ and $w$ and returns a complex number $\langle v, w \rangle$) such that:

  1. $\langle v, v \rangle$ is real and positive unless $v = 0$,

  2. $\langle v, u \rangle = \overline{\langle u, v \rangle}$ for all $u, v \in \mathbb{C}^n$,

  3. $\langle u, a v_1 + b v_2 \rangle = a \langle u, v_1 \rangle + b \langle u, v_2 \rangle$ for all $u, v_1, v_2 \in \mathbb{C}^n$ and $a, b \in \mathbb{C}$,

  4. $\langle a u_1 + b u_2, v \rangle = \bar{a} \langle u_1, v \rangle + \bar{b} \langle u_2, v \rangle$.

The final condition actually follows from (2) and (3), so we don't really need to take it as an axiom.
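
Indeed, here is the one-line check, using only (2) and (3):
\[ \langle a u_1 + b u_2, v \rangle = \overline{\langle v, a u_1 + b u_2 \rangle} = \overline{a \langle v, u_1 \rangle + b \langle v, u_2 \rangle} = \bar{a} \, \overline{\langle v, u_1 \rangle} + \bar{b} \, \overline{\langle v, u_2 \rangle} = \bar{a} \langle u_1, v \rangle + \bar{b} \langle u_2, v \rangle. \]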

Remark:

This notion is supposed to be a replacement for the "dot product" that works well with complex vectors. The problem with just taking the dot product $v \cdot w$ of complex vectors is that $v \cdot v = \sum_k v_k^2$ is a complex number, and we would like the length of $v$ to be $\sqrt{v \cdot v}$, which would then also be a complex number. If instead we take $\langle v, v \rangle = \sum_k \bar{v}_k v_k$ then we get a real number which is positive unless $v = 0$. The axioms above are intended to capture the important properties of the "standard Hermitian inner product" $\langle v, w \rangle = \sum_k \bar{v}_k w_k$.
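
For a concrete instance of the problem, take for example $v = (1, i) \in \mathbb{C}^2$. Then
\[ v \cdot v = 1^2 + i^2 = 0, \qquad \langle v, v \rangle = \bar{1} \cdot 1 + \bar{i} \cdot i = 1 + 1 = 2, \]
so the naive dot product would assign the nonzero vector $v$ the "length" zero, while the standard Hermitian inner product gives the sensible answer $\sqrt{2}$.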

When orthogonal complements are subrepresentations

Definition:

A Hermitian inner product $\langle \cdot, \cdot \rangle$ is invariant for a representation $R \colon G \to GL(n, \mathbb{C})$ if $\langle R(g) v, R(g) w \rangle = \langle v, w \rangle$ for all $g \in G$ and $v, w \in \mathbb{C}^n$.
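
For example, if every matrix $R(g)$ happens to be unitary, i.e. $R(g)^\dagger R(g) = I$ where $\dagger$ denotes conjugate transpose, then the standard Hermitian inner product $\langle v, w \rangle = \bar{v}^T w$ is invariant for $R$:
\[ \langle R(g) v, R(g) w \rangle = \overline{(R(g) v)}^T R(g) w = \bar{v}^T R(g)^\dagger R(g) w = \bar{v}^T w = \langle v, w \rangle. \]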

Lemma:

Given a representation $R \colon G \to GL(n, \mathbb{C})$, a subrepresentation $U \subseteq \mathbb{C}^n$, and an invariant Hermitian inner product on $\mathbb{C}^n$, the orthogonal complement $U^\perp$, defined to be the set of $w \in \mathbb{C}^n$ such that $\langle u, w \rangle = 0$ for all $u \in U$, is a subrepresentation.

If $w \in U^\perp$, we want to show that $R(g) w \in U^\perp$ for all $g \in G$. To see this, we need to compute $\langle u, R(g) w \rangle$ for an arbitrary $u \in U$ and see that it is zero.

Using invariance (applied with the group element $g^{-1}$), we get $\langle u, R(g) w \rangle = \langle R(g^{-1}) u, R(g^{-1}) R(g) w \rangle = \langle R(g^{-1}) u, w \rangle$. Since $U$ is a subrepresentation, $R(g^{-1}) u$ is still in $U$. Since $w \in U^\perp$, we therefore get $\langle R(g^{-1}) u, w \rangle = 0$, as desired.

Pre-class exercise

Exercise:

Let $(\mathbb{R}, +)$ denote the group of real numbers with addition. Why is the representation $R(x) = \begin{pmatrix} 1 & x \\ 0 & 1 \end{pmatrix}$ not irreducible? Can you find a decomposition of it?
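
As a warm-up sanity check (independent of what the exercise asks), note that $R$ really is a representation of $(\mathbb{R}, +)$, since
\[ R(x) R(y) = \begin{pmatrix} 1 & x \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & y \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & x + y \\ 0 & 1 \end{pmatrix} = R(x + y). \]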