A decomposition of a representation $R \colon G \to GL(V)$ is a splitting $V = W_1 \oplus W_2$ where each $W_i$ is a subrepresentation of $V$, that is $R(g)W_i \subseteq W_i$ whenever $g \in G$. In this case, each matrix $R(g)$ is block-diagonal (if we write it with respect to a basis of vectors from $W_1$ and $W_2$, ordered so that basis vectors from $W_1$ come before basis vectors from $W_2$): $R(g) = \begin{pmatrix} R_1(g) & 0 \\ 0 & R_2(g) \end{pmatrix}$. We will write this as $R = R_1 \oplus R_2$.
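As a concrete numerical sketch (the matrices here are hypothetical, not from the text): take the trivial 1-dimensional representation and the 2-dimensional rotation representation of the circle group, and form their direct sum. The result is block-diagonal, as described above.

```python
import numpy as np

def R1(theta):
    # Trivial 1-dimensional representation
    return np.array([[1.0]])

def R2(theta):
    # Rotation representation of the circle group on R^2
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def direct_sum(A, B):
    """Block-diagonal matrix with diagonal blocks A and B."""
    out = np.zeros((A.shape[0] + B.shape[0], A.shape[1] + B.shape[1]))
    out[:A.shape[0], :A.shape[1]] = A
    out[A.shape[0]:, A.shape[1]:] = B
    return out

theta = 0.7
R = direct_sum(R1(theta), R2(theta))
print(R)  # off-diagonal blocks are zero; diagonal blocks are R1, R2
```

Evaluating at any other group element gives the same block structure, which is what makes the matrices "concentrated close to the diagonal".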
Complete reducibility
Decomposition
One of the key techniques for studying representations is to break them down into smaller subrepresentations.
In such a decomposition, we would like the pieces to be as ``small'' as possible, because then our matrices will be concentrated very close to the diagonal and most entries will be zero. These ``smallest pieces'' are called irreducible representations:
A subrepresentation $W \subseteq V$ is called proper if $W \neq \{0\}$ and $W \neq V$. A representation is called irreducible if it has no proper subrepresentations.
Orthogonal complements: idea
So we would like to decompose our representations as direct sums of irreducible subrepresentations; in this case we say our representation is completely reducible. That's not always possible (we'll see an example in one of the exercises). However, we will focus on groups for which it is possible, namely the compact groups (matrix groups where the matrix entries are bounded). As a first step, we prove:
If $V$ admits an invariant Hermitian inner product for the representation $R \colon G \to GL(V)$ then $R$ can be decomposed into irreducible summands.
I'll give you the idea of the proof before defining what an invariant Hermitian inner product is; suffice it to say that it's something like a dot product.
If $R$ is not irreducible then it contains a proper subrepresentation $W \subseteq V$. The orthogonal complement $W^{\perp}$ of $W$ with respect to the Hermitian inner product will also be a subrepresentation and $V = W \oplus W^{\perp}$. So if $R$ is not itself irreducible then it can be decomposed as a direct sum of subrepresentations. Applying the same reasoning to the summands, if either is not irreducible, we can decompose further; and so on and so on.
Eventually this process terminates because the dimension of the summands decreases each time you decompose. Either you hit an irreducible summand, or you keep going all the way down and find a 1-dimensional summand, but 1-dimensional representations are automatically irreducible: they have no proper subspaces, let alone proper subrepresentations.
I'm not saying you always break up into 1-dimensional pieces, but these provide a ``safety blanket'': if you get all the way down to a 1-dimensional summand then it's guaranteed to be irreducible.
To complete the proof, it remains to:
- define the term "invariant Hermitian inner product",
- define the orthogonal complement of a subspace with respect to an invariant Hermitian inner product,
- prove that the orthogonal complement of a subrepresentation is a subrepresentation.
Hermitian inner products
A Hermitian inner product is a map $\langle -, - \rangle \colon \mathbb{C}^n \times \mathbb{C}^n \to \mathbb{C}$ (i.e. it eats two complex vectors $v$ and $w$ and returns a complex number $\langle v, w \rangle$) such that:
- $\langle v, v \rangle$ is real and positive unless $v = 0$.
- $\langle v, w \rangle = \overline{\langle w, v \rangle}$ for all $v, w$,
- $\langle u, \lambda v + \mu w \rangle = \lambda \langle u, v \rangle + \mu \langle u, w \rangle$ for all $u, v, w$ and $\lambda, \mu \in \mathbb{C}$,
- $\langle \lambda v + \mu w, u \rangle = \bar{\lambda} \langle v, u \rangle + \bar{\mu} \langle w, u \rangle$.
The final condition actually follows from (2) and (3), so we don't really need to take it as an axiom.
This notion is supposed to be a replacement of "dot product" that works well with complex vectors. The problem with just taking the dot product of complex vectors is that $v \cdot v = \sum_i v_i^2$ is a complex number, and we would like the length of $v$ to be $\sqrt{v \cdot v}$, which would then also be a complex number. If instead we take $\bar{v} \cdot v = \sum_i |v_i|^2$ then we get a real number which is positive unless $v = 0$. The axioms above are intended to capture the important properties of the "standard Hermitian inner product" $\langle v, w \rangle = \bar{v} \cdot w = \sum_i \bar{v}_i w_i$.
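A quick numerical sanity check of these axioms for the standard Hermitian inner product (numpy's `np.vdot` conjugates its first argument, so it computes exactly $\bar{v} \cdot w$):

```python
import numpy as np

def herm(v, w):
    """Standard Hermitian inner product <v, w> = sum conj(v_i) w_i."""
    return np.vdot(v, w)  # np.vdot conjugates its first argument

rng = np.random.default_rng(0)
v = rng.normal(size=3) + 1j * rng.normal(size=3)
w = rng.normal(size=3) + 1j * rng.normal(size=3)

# (1) <v, v> is real and positive for v != 0
assert np.isclose(herm(v, v).imag, 0) and herm(v, v).real > 0
# (2) <v, w> is the conjugate of <w, v>
assert np.isclose(herm(v, w), np.conj(herm(w, v)))
# (3) linearity in the second slot
lam, mu = 2 - 1j, 0.5 + 3j
assert np.isclose(herm(v, lam * w + mu * v),
                  lam * herm(v, w) + mu * herm(v, v))
```

The random vectors here are just illustrative; any choice of $v, w, \lambda, \mu$ would satisfy the same identities.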
When orthogonal complements are subrepresentations
A Hermitian inner product is invariant for a representation $R$ if $\langle R(g)v, R(g)w \rangle = \langle v, w \rangle$ for all $g \in G$ and $v, w \in \mathbb{C}^n$.
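For instance (a hypothetical numerical check, not an example from the text): any representation by unitary matrices leaves the standard Hermitian inner product invariant. The rotation representation of the circle group is unitary, so we can verify invariance directly:

```python
import numpy as np

def R(theta):
    # Rotation representation of the circle group on C^2 (unitary matrices)
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]], dtype=complex)

rng = np.random.default_rng(1)
v = rng.normal(size=2) + 1j * rng.normal(size=2)
w = rng.normal(size=2) + 1j * rng.normal(size=2)

theta = 1.3
lhs = np.vdot(R(theta) @ v, R(theta) @ w)  # <R(g)v, R(g)w>
rhs = np.vdot(v, w)                        # <v, w>
assert np.isclose(lhs, rhs)  # invariance of the standard inner product
```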
Given a representation $R \colon G \to GL(V)$, a subrepresentation $W \subseteq V$, and an invariant Hermitian inner product on $V$, the orthogonal complement $W^{\perp} = \{ v \in V : \langle v, w \rangle = 0 \text{ for all } w \in W \}$ is also a subrepresentation, and $V = W \oplus W^{\perp}$.
If $v \in W^{\perp}$, we want to show that $R(g)v \in W^{\perp}$ for all $g \in G$. To see this, we need to compute $\langle R(g)v, w \rangle$ for an arbitrary $w \in W$ and see that it's zero.
Using invariance, we get $\langle R(g)v, w \rangle = \langle R(g)^{-1}R(g)v, R(g)^{-1}w \rangle = \langle v, R(g^{-1})w \rangle$. Since $W$ is a subrepresentation, $R(g^{-1})w \in W$. Since $v \in W^{\perp}$, we therefore get $\langle R(g)v, w \rangle = \langle v, R(g^{-1})w \rangle = 0$ as desired.
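A toy numerical illustration of this argument (a hypothetical example of my choosing): take the representation of $\mathbb{Z}/2$ on $\mathbb{C}^2$ where the nontrivial element swaps the two coordinates. The line spanned by $(1,1)$ is a subrepresentation $W$, and its orthogonal complement, spanned by $(1,-1)$, is indeed preserved:

```python
import numpy as np

# Representation of Z/2 on C^2: the nontrivial element swaps coordinates.
S = np.array([[0, 1],
              [1, 0]], dtype=complex)

w = np.array([1, 1], dtype=complex)    # spans the subrepresentation W
v = np.array([1, -1], dtype=complex)   # spans the orthogonal complement

assert np.allclose(S @ w, w)           # W is preserved (acted on trivially)
assert np.isclose(np.vdot(v, w), 0)    # v is orthogonal to W
assert np.isclose(np.vdot(S @ v, w), 0)  # S v stays orthogonal to W
print(S @ v)  # (-1, 1): still a multiple of v
```

Since $S$ is unitary, the standard Hermitian inner product is invariant here, which is exactly the hypothesis the lemma needs.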
Pre-class exercise
Let $(\mathbb{R}, +)$ denote the group of real numbers with addition. Why is the representation $R(x) = \begin{pmatrix} 1 & x \\ 0 & 1 \end{pmatrix}$ not irreducible? Can you find a decomposition of it?