New representations from old
We've now encountered four representations of $SU(2)$ and we've claimed there are irreps of every complex dimension. To see this, we will need some recipes for constructing new representations out of old representations.
Direct sum
Here's one recipe we already know.

Given two representations $R\colon G\to GL(V)$ and $S\colon G\to GL(W)$, we can construct a representation $R\oplus S\colon G\to GL(V\oplus W)$ by setting
$$(R\oplus S)(g)=\begin{pmatrix}R(g)&0\\0&S(g)\end{pmatrix}$$
(a block matrix).

This is no use for us because we're looking for irreducible representations, and $R\oplus S$ is not irreducible unless $V$ or $W$ is the zero-representation (because $V$ and $W$ are subrepresentations).
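If you like, you can see the block-matrix recipe in code. Here is a minimal sketch (the helper `direct_sum` and the two example matrices are mine, standing in for $R(g)$ and $S(g)$; this is just an illustration, not part of the course):

```python
import numpy as np

def direct_sum(A, B):
    """Block-diagonal matrix representing (R ⊕ S)(g) when A = R(g) and B = S(g)."""
    m, n = A.shape[0], B.shape[0]
    out = np.zeros((m + n, m + n), dtype=complex)
    out[:m, :m] = A   # R(g) acts on the V-block
    out[m:, m:] = B   # S(g) acts on the W-block
    return out

# Arbitrary matrices playing the role of R(g) and S(g):
A = np.array([[0, 1], [-1, 0]], dtype=complex)
B = np.array([[2]], dtype=complex)
print(direct_sum(A, B))
```

You can check that `direct_sum(A1 @ A2, B1 @ B2)` agrees with `direct_sum(A1, B1) @ direct_sum(A2, B2)`, which is exactly what makes $R\oplus S$ a homomorphism.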
Tensor product
Given two representations $R\colon G\to GL(V)$ and $S\colon G\to GL(W)$, we can construct a representation $R\otimes S\colon G\to GL(V\otimes W)$ as follows.
- The vector space $V\otimes W$ is constructed by taking a basis $e_1,\dots,e_m$ of $V$ and a basis $f_1,\dots,f_n$ of $W$ and using the symbols $e_i\otimes f_j$ as a basis of $V\otimes W$. This is $mn$-dimensional. The vectors in $V\otimes W$ are things like $e_1\otimes f_2$ or $3\,e_1\otimes f_1-2\,e_2\otimes f_3$.
- $(R\otimes S)(g)$ is the linear map $V\otimes W\to V\otimes W$ which acts as follows on tensors of the form $v\otimes w$:
$$(R\otimes S)(g)(v\otimes w)=(R(g)v)\otimes(S(g)w).$$
It's enough to specify what it does on such tensors because all our basis vectors have this form.
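In the basis $e_1\otimes f_1,\ e_1\otimes f_2,\ \dots,\ e_m\otimes f_n$ (with the second index varying fastest), the matrix of $(R\otimes S)(g)$ is what numerical libraries call the Kronecker product of the matrices of $R(g)$ and $S(g)$. Here is a quick numerical sanity check of that fact, with two arbitrary matrices standing in for $R(g)$ and $S(g)$; it's an aside, not something we need later:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))   # stand-in for the matrix of R(g) on V
B = rng.standard_normal((3, 3))   # stand-in for the matrix of S(g) on W
v = rng.standard_normal(2)        # a vector in V
w = rng.standard_normal(3)        # a vector in W

# np.kron(v, w) is the coordinate vector of v ⊗ w in the basis e_i ⊗ f_j,
# and np.kron(A, B) is the matrix of (R ⊗ S)(g) in that basis.
lhs = np.kron(A, B) @ np.kron(v, w)   # (R ⊗ S)(g) applied to v ⊗ w
rhs = np.kron(A @ v, B @ w)           # (R(g)v) ⊗ (S(g)w)
print(np.allclose(lhs, rhs))          # True
```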
Take $R=S$ to be the standard representation of $SU(2)$ (so $V=W=\mathbb{C}^2$). Pick the standard basis $e_1,e_2$ of $V$ and $f_1,f_2$ of $W$ (these are the same bases, I'm just keeping different letters for clarity). A basis for $V\otimes W$ is given by $e_1\otimes f_1$, $e_1\otimes f_2$, $e_2\otimes f_1$, $e_2\otimes f_2$. Let $g=\begin{pmatrix}a&b\\c&d\end{pmatrix}\in SU(2)$. For the standard representation, $R(g)=S(g)=g$. This means $R(g)e_1=ae_1+ce_2$ and $R(g)e_2=be_1+de_2$ (similarly for $S(g)f_1$ and $S(g)f_2$, with $e$'s replaced by $f$'s).
Let's calculate
$$(R\otimes S)(g)(e_1\otimes f_1)=(R(g)e_1)\otimes(S(g)f_1)=(ae_1+ce_2)\otimes(af_1+cf_2).$$
Multiplying out brackets, we get:
$$a^2\,e_1\otimes f_1+ac\,e_1\otimes f_2+ca\,e_2\otimes f_1+c^2\,e_2\otimes f_2.$$
This means that the first column of the 4-by-4 matrix $(R\otimes S)(g)$ is
$$\begin{pmatrix}a^2\\ac\\ca\\c^2\end{pmatrix}.$$
You can figure out the second column by computing $(R\otimes S)(g)(e_1\otimes f_2)$, etc. (This will be an exercise.)
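If you'd like to check the first column symbolically before attempting the exercise below, here is a small sketch (the helper `column` and its indexing convention are mine; it just applies the tensor product rule in coordinates):

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
g = sp.Matrix([[a, b], [c, d]])   # a general 2x2 matrix standing in for an element of SU(2)

def column(i, j):
    """Column of (R ⊗ S)(g) for the basis vector e_{i+1} ⊗ f_{j+1}, in the ordering
    e1⊗f1, e1⊗f2, e2⊗f1, e2⊗f2: the coefficient of e_{k+1} ⊗ f_{l+1} is g[k,i] * g[l,j]."""
    return [sp.expand(g[k, i] * g[l, j]) for k in range(2) for l in range(2)]

print(column(0, 0))   # [a**2, a*c, a*c, c**2] -- the first column computed above
```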
It will turn out that the tensor product does not usually give an irreducible representation (indeed we will have a lot of fun later decomposing tensor products into subrepresentations), but it is a much more interesting recipe for representations than direct summation.
Symmetric powers
Given a representation $R\colon G\to GL(V)$, take $R^{\otimes n}=R\otimes\cdots\otimes R\colon G\to GL(V^{\otimes n})$. This is not irreducible: we will produce a subrepresentation consisting of symmetric tensors.
Take $R$ to be the standard representation and $n=2$. We found a basis $e_1\otimes e_1$, $e_1\otimes e_2$, $e_2\otimes e_1$, $e_2\otimes e_2$ of $V\otimes V$. The tensors $e_1\otimes e_1$ and $e_2\otimes e_2$ are symmetric in the sense that when I switch the two factors I get the same tensor back. The other two basis tensors are not, but the combination $e_1\otimes e_2+e_2\otimes e_1$ is symmetric because if I switch all factors in all monomials then I get $e_2\otimes e_1+e_1\otimes e_2$, which is the same combination back again. By contrast, $e_1\otimes e_2-e_2\otimes e_1$ is antisymmetric: we'll talk more about that at a later date.
We will now see that the symmetric tensors span a subrepresentation, called $\mathrm{Sym}^n(V)$. Here's the idea. Given any tensor, I can produce something symmetric in a canonical way, as illustrated by the following example.
Suppose $V=\mathbb{C}^3$ with basis $e_1$, $e_2$, $e_3$. Consider $e_1\otimes e_2\otimes e_3\in V^{\otimes 3}$. To symmetrise it, we take
$$\frac{1}{6}\left(e_1\otimes e_2\otimes e_3+e_1\otimes e_3\otimes e_2+e_2\otimes e_1\otimes e_3+e_2\otimes e_3\otimes e_1+e_3\otimes e_1\otimes e_2+e_3\otimes e_2\otimes e_1\right).$$
This is just summing all six permutations of the three factors and dividing by the number of permutations (in this case 6) so that if we start with a symmetric tensor then we get the same tensor back.
Define the averaging map
$$\mathrm{av}\colon V^{\otimes n}\to V^{\otimes n},\qquad \mathrm{av}(v_1\otimes\cdots\otimes v_n)=\frac{1}{n!}\sum_{\sigma\in S_n}v_{\sigma(1)}\otimes\cdots\otimes v_{\sigma(n)}.$$
$\mathrm{Sym}^n(V)$ is the image of the averaging map, i.e. the set of all averaged tensors, which are symmetric by construction.
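Here is a small computational sketch of the averaging map, thinking of a tensor in $V^{\otimes n}$ as an $n$-dimensional array of coefficients and averaging over permutations of its factors (the function `average` and the example tensor are mine, purely for illustration):

```python
import itertools
import numpy as np
from math import factorial

def average(t):
    """Averaging map on V^{⊗n}: t is an n-dimensional coefficient array (n tensor factors)."""
    n = t.ndim
    total = np.zeros_like(t, dtype=float)
    for sigma in itertools.permutations(range(n)):
        total += np.transpose(t, sigma)   # permute the tensor factors
    return total / factorial(n)

# e1 ⊗ e2 ⊗ e3 in C^3 ⊗ C^3 ⊗ C^3, as an array with a single nonzero entry:
t = np.zeros((3, 3, 3))
t[0, 1, 2] = 1.0
s = average(t)                      # the symmetric tensor from the example above
print(np.allclose(s, average(s)))   # averaging a symmetric tensor returns it unchanged: True
```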
If $R\colon G\to GL(V)$ is a representation of $G$ then $\mathrm{Sym}^n(V)$ is a subrepresentation of $V^{\otimes n}$.
We will first show that the averaging map is a morphism of representations $V^{\otimes n}\to V^{\otimes n}$. Then we'll show that the image of a morphism is a subrepresentation.
Recall that a morphism of representations from $R\colon G\to GL(V)$ to $S\colon G\to GL(W)$ is a linear map $L\colon V\to W$ such that $L\circ R(g)=S(g)\circ L$ for all $g\in G$. We are therefore trying to show that
$$\mathrm{av}\circ R^{\otimes n}(g)=R^{\otimes n}(g)\circ\mathrm{av}\quad\text{for all }g\in G.$$
From the definition of the tensor product and the averaging map, we have
$$\mathrm{av}\bigl(R^{\otimes n}(g)(v_1\otimes\cdots\otimes v_n)\bigr)=\mathrm{av}\bigl(R(g)v_1\otimes\cdots\otimes R(g)v_n\bigr)=\frac{1}{n!}\sum_{\sigma\in S_n}R(g)v_{\sigma(1)}\otimes\cdots\otimes R(g)v_{\sigma(n)}.$$
On the other hand,
$$R^{\otimes n}(g)\bigl(\mathrm{av}(v_1\otimes\cdots\otimes v_n)\bigr)=\frac{1}{n!}\sum_{\sigma\in S_n}R^{\otimes n}(g)\bigl(v_{\sigma(1)}\otimes\cdots\otimes v_{\sigma(n)}\bigr)=\frac{1}{n!}\sum_{\sigma\in S_n}R(g)v_{\sigma(1)}\otimes\cdots\otimes R(g)v_{\sigma(n)}.$$
These are identical, so we see that the averaging map is a morphism of representations.
We will now prove a separate lemma telling us that the image of a morphism of representations is a subrepresentation, which will complete the proof.
If $L\colon V\to W$ is a morphism of representations from $R\colon G\to GL(V)$ to $S\colon G\to GL(W)$ then its image $L(V)\subset W$ is a subrepresentation of $W$.
Take $L(v)$ in the image of $L$. Apply $S(g)$ to it. We have $S(g)L(v)=L(R(g)v)$ because $L$ is a morphism, so $S(g)L(v)$ is in the image of $L$. This tells us that the image of $L$ is a subrepresentation.
In the case of the symmetric power, this is telling us that $\mathrm{Sym}^n(V)$ is a subrepresentation of $V^{\otimes n}$.
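As a numerical sanity check of the key identity in the proof (that averaging commutes with $R^{\otimes n}(g)$), here is a sketch with $n=3$ and an arbitrary $2\times 2$ matrix standing in for $R(g)$; the helpers `average` and `act` are mine:

```python
import itertools
import numpy as np
from math import factorial

def average(t):
    """Averaging map on V^{⊗n} (t is an n-dimensional coefficient array)."""
    n = t.ndim
    return sum(np.transpose(t, s) for s in itertools.permutations(range(n))) / factorial(n)

def act(g, t):
    """Apply (R ⊗ R ⊗ R)(g) to a 3-tensor t, where g stands in for the matrix R(g)."""
    return np.einsum('ia,jb,kc,abc->ijk', g, g, g, t)

rng = np.random.default_rng(1)
g = rng.standard_normal((2, 2))     # arbitrary stand-in for R(g)
t = rng.standard_normal((2, 2, 2))  # arbitrary tensor in V ⊗ V ⊗ V

# av ∘ R^{⊗3}(g) = R^{⊗3}(g) ∘ av :
print(np.allclose(average(act(g, t)), act(g, average(t))))   # True
```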
Take $V=\mathbb{C}^2$, the standard representation of $SU(2)$. Consider $\mathrm{Sym}^3(V)\subset V^{\otimes 3}$. By averaging the basis elements of $V^{\otimes 3}$ we end up with a basis for $\mathrm{Sym}^3(V)$: $e_1\otimes e_1\otimes e_1$, $\tfrac{1}{3}(e_1\otimes e_1\otimes e_2+e_1\otimes e_2\otimes e_1+e_2\otimes e_1\otimes e_1)$, $\tfrac{1}{3}(e_1\otimes e_2\otimes e_2+e_2\otimes e_1\otimes e_2+e_2\otimes e_2\otimes e_1)$, $e_2\otimes e_2\otimes e_2$.
This is a 4-dimensional representation, and it will turn out to be irreducible. You can label the elements here with polynomials: $x^3$, $x^2y$, $xy^2$, $y^3$. Given a homogeneous monomial in $x$ and $y$ of degree $n$, there's a unique way to write down a symmetric tensor whose monomials reduce to it when you remove the tensor symbols. You can therefore think of $\mathrm{Sym}^n(\mathbb{C}^2)$ as the space of homogeneous polynomials of degree $n$ in the basis elements of $\mathbb{C}^2$.
It's an exercise to see that $\mathrm{Sym}^n(\mathbb{C}^2)$ is $(n+1)$-dimensional. These will turn out to be our irreducible representations of $SU(2)$.
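If you want to see these representations concretely, here is a sketch (again, not part of the course, and the helper name is mine) which computes the matrix of $g=\begin{pmatrix}a&b\\c&d\end{pmatrix}$ acting on $\mathrm{Sym}^n(\mathbb{C}^2)$, viewed as homogeneous polynomials of degree $n$: the basis vector $x=e_1$ is sent to $ax+cy$ and $y=e_2$ to $bx+dy$, and we read off the resulting $(n+1)\times(n+1)$ matrix of coefficients.

```python
import sympy as sp

x, y, a, b, c, d = sp.symbols('x y a b c d')

def sym_power_matrix(n):
    """Matrix of g = [[a, b], [c, d]] acting on Sym^n(C^2), in the basis
    x^n, x^(n-1)*y, ..., y^n (polynomials in the basis vectors x = e1, y = e2)."""
    basis = [x**(n - k) * y**k for k in range(n + 1)]
    M = sp.zeros(n + 1, n + 1)
    for j, p in enumerate(basis):
        # g sends x to a*x + c*y and y to b*x + d*y; expand and collect coefficients.
        image = sp.expand(p.subs({x: a*x + c*y, y: b*x + d*y}, simultaneous=True))
        for i in range(n + 1):
            M[i, j] = image.coeff(x, n - i).coeff(y, i)
    return M

print(sym_power_matrix(1))   # recovers the standard 2x2 matrix [[a, b], [c, d]]
print(sym_power_matrix(3))   # a 4x4 matrix: the action on Sym^3(C^2)
```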
Pre-class exercise
Let $R$ and $S$ be two copies of the standard representation. Figure out the full 4-by-4 matrix $(R\otimes S)(g)$ for $g=\begin{pmatrix}a&b\\c&d\end{pmatrix}$.