New representations from old

We've now encountered four representations of $SU(2)$ and we've claimed there are irreps of every complex dimension. To see this, we will need some recipes for constructing new representations out of old representations.

Direct sum

Here's one recipe we already know. Given two representations $R:G\to GL(V)$ and $S:G\to GL(W)$, we can construct a representation $R\oplus S:G\to GL(V\oplus W)$ by setting $$(R\oplus S)(g)=\begin{pmatrix}R(g) & 0 \\ 0 & S(g)\end{pmatrix}$$ (a block matrix).

This is no use for us because we're looking for irreducible representations, and $R\oplus S$ is not irreducible unless $R$ or $S$ is the zero representation (because $R$ and $S$ are subrepresentations).
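As a quick numerical sanity check (not part of the notes), the block-matrix recipe really does define a homomorphism: block-diagonal matrices multiply block by block. The helper name `direct_sum` and the sample group elements below are ours.

```python
import numpy as np

def direct_sum(A, B):
    """Block-diagonal matrix diag(A, B), i.e. the matrix of (R + S)(g)
    when A = R(g) and B = S(g)."""
    m, n = A.shape[0], B.shape[0]
    out = np.zeros((m + n, m + n), dtype=complex)
    out[:m, :m] = A
    out[m:, m:] = B
    return out

# Two sample elements of SU(2) (entries chosen so |a|^2 + |b|^2 = 1).
g = np.array([[0.6, 0.8j], [0.8j, 0.6]])
h = np.array([[0, 1], [-1, 0]], dtype=complex)

# Homomorphism check with R = S = the standard representation:
# (R + S)(gh) = (R + S)(g) (R + S)(h).
assert np.allclose(direct_sum(g @ h, g @ h), direct_sum(g, g) @ direct_sum(h, h))
```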
Tensor product
Given two representations $R:G\to GL(V)$ and $S:G\to GL(W)$ , we can construct a representation $R\otimes S:G\to GL(V\otimes W)$ as follows.

The vector space $V\otimes W$ is constructed by taking a basis $e_1,\dots,e_m$ of $V$ and a basis $f_1,\dots,f_n$ of $W$ and using the symbols $e_i\otimes f_j$ as a basis of $V\otimes W$. This is $mn$-dimensional. The vectors in $V\otimes W$ are things like $e_1\otimes f_1$ or $e_1\otimes f_2-\frac{1}{2}e_3\otimes f_5$.

$(R\otimes S)(g)$ is the linear map which acts as follows on tensors $v\otimes w$ : $$(R\otimes S)(g)(v\otimes w)=(R(g)v)\otimes (S(g)w).$$ It's enough to specify what it does on such tensors because all our basis vectors have this form.
Take $R=S$ to be the standard representation of $SU(2)$ (so $V=W=\mathbf{C}^2$). Pick the standard basis $e_1$, $e_2$ of $V$ and $f_1$, $f_2$ of $W$ (these are the same bases, I'm just keeping different letters for clarity). A basis for $\mathbf{C}^2\otimes\mathbf{C}^2$ is given by $e_1\otimes f_1$, $e_1\otimes f_2$, $e_2\otimes f_1$, $e_2\otimes f_2$. Let $g=\begin{pmatrix}a & b \\ -\overline{b} & \overline{a}\end{pmatrix}\in SU(2)$. For the standard representation, $R(g)=S(g)=\begin{pmatrix}a & b \\ -\overline{b} & \overline{a}\end{pmatrix}$. This means $R(g)e_1=ae_1-\overline{b}e_2$ and $R(g)e_2=be_1+\overline{a}e_2$ (similarly for $S$ with $e$'s replaced by $f$'s).
Let's calculate $$(R\otimes S)(g)(e_1\otimes f_1)=(R(g)e_1)\otimes (S(g)f_1)=(ae_1-\overline{b}e_2)\otimes (af_1-\overline{b}f_2).$$
Multiplying out the brackets, we get: $$a^2\,e_1\otimes f_1-a\overline{b}\,e_1\otimes f_2-\overline{b}a\,e_2\otimes f_1+\overline{b}^2\,e_2\otimes f_2.$$
This means that the first column of the 4-by-4 matrix $(R\otimes S)(g)$ is $$\begin{pmatrix}a^2 & ? & ? & ? \\ -a\overline{b} & ? & ? & ? \\ -a\overline{b} & ? & ? & ? \\ \overline{b}^2 & ? & ? & ?\end{pmatrix}.$$
You can figure out the second column by computing $(R\otimes S)(g)({e}_{1}\otimes {f}_{2})$ , etc. (This will be an exercise).
It will turn out that the tensor product does not usually give an irreducible representation (indeed we will have a lot of fun later decomposing tensor products into subrepresentations), but it is a much more interesting recipe for representations than direct summation.
Symmetric powers
Given a representation $R:G\to GL(V)$ , take ${R}^{\otimes n}:G\to GL({V}^{\otimes n})$ . This is not irreducible: we will produce a subrepresentation consisting of symmetric tensors.
Take $V=\mathbf{C}^2$ to be the standard representation and $n=2$. We found a basis $e_1\otimes e_1$, $e_1\otimes e_2$, $e_2\otimes e_1$, $e_2\otimes e_2$ of $(\mathbf{C}^2)^{\otimes 2}$. The tensors $e_1\otimes e_1$ and $e_2\otimes e_2$ are symmetric in the sense that when I switch the two factors I get the same tensor back. The other two basis tensors are not, but the combination $e_1\otimes e_2+e_2\otimes e_1$ is symmetric because if I switch the factors in both monomials then I get $e_2\otimes e_1+e_1\otimes e_2$, which is the same combination back again. By contrast, $e_1\otimes e_2-e_2\otimes e_1$ is antisymmetric: we'll talk more about that at a later date.
We will now see that the symmetric tensors span a subrepresentation, called ${\mathrm{Sym}}^{n}(V)$ . Here's the idea. Given any tensor, I can produce something symmetric in a canonical way, as illustrated by the following example.
Suppose $V={\mathbf{C}}^{3}$ with basis ${e}_{1}$ , ${e}_{2}$ , ${e}_{3}$ . Consider ${V}^{\otimes 3}$ . To symmetrise ${e}_{1}\otimes {e}_{2}\otimes {e}_{3}\in {V}^{\otimes 3}$ , we take $$\frac{1}{6}({e}_{1}\otimes {e}_{2}\otimes {e}_{3}+{e}_{2}\otimes {e}_{1}\otimes {e}_{3}+{e}_{1}\otimes {e}_{3}\otimes {e}_{2}+{e}_{3}\otimes {e}_{2}\otimes {e}_{1}+{e}_{2}\otimes {e}_{3}\otimes {e}_{1}+{e}_{3}\otimes {e}_{1}\otimes {e}_{2}).$$
This is just summing all six permutations of the three factors and dividing by the number of permutations (in this case 6) so that if we start with a symmetric tensor then we get the same tensor back.
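The averaging operation is easy to express in code: an order-$n$ tensor is an $n$-dimensional array, and permuting tensor factors is permuting array axes. Here is a small numpy sketch (the function name `average` is ours) reproducing the example above.

```python
import itertools

import numpy as np

def average(T):
    """Symmetrise an order-n tensor: average over all n! permutations
    of its tensor factors (the axes of the array)."""
    perms = list(itertools.permutations(range(T.ndim)))
    return sum(np.transpose(T, p) for p in perms) / len(perms)

# e1 (x) e2 (x) e3 in (C^3)^{(x)3}, stored as a 3x3x3 array.
e = np.eye(3)
T = np.einsum('i,j,k->ijk', e[0], e[1], e[2])

S = average(T)
# S is the sum of the six permuted tensors divided by 6, and averaging
# an already-symmetric tensor returns it unchanged.
assert np.allclose(average(S), S)
```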
Define the averaging map $\mathrm{Av}:V^{\otimes n}\to V^{\otimes n}$ by $$\mathrm{Av}(v_1\otimes\cdots\otimes v_n)=\frac{1}{n!}\sum_{\sigma\in S_n}v_{\sigma(1)}\otimes\cdots\otimes v_{\sigma(n)}.$$
Then ${\mathrm{Sym}}^n(V)\subset V^{\otimes n}$ is the image of the averaging map, i.e. the set of all averaged tensors, which are symmetric by construction.
If $R:G\to GL(V)$ is a representation of $G$ then ${\mathrm{Sym}}^{n}(V)$ is a subrepresentation of ${V}^{\otimes n}$ .
We will first show that the averaging map is a morphism of representations ${V}^{\otimes n}\to {V}^{\otimes n}$ . Then we'll show that the image of a morphism is a subrepresentation.
Recall that a morphism of representations is a map $L:V\to W$ such that $L\circ R(g)=S(g)\circ L$ for all $g\in G$. We are therefore trying to show that $$\mathrm{Av}(R(g)^{\otimes n}(v_1\otimes\cdots\otimes v_n))=R(g)^{\otimes n}\mathrm{Av}(v_1\otimes\cdots\otimes v_n).$$
From the definition of the tensor product and the averaging map, we have $$\mathrm{Av}(R{(g)}^{\otimes n}({v}_{1}\otimes \mathrm{\cdots}\otimes {v}_{n}))=\mathrm{Av}(R(g){v}_{1}\otimes \mathrm{\cdots}\otimes R(g){v}_{n})=\frac{1}{n!}\sum _{\sigma \in {S}_{n}}R(g){v}_{\sigma (1)}\otimes \mathrm{\cdots}\otimes R(g){v}_{\sigma (n)}.$$
On the other hand, $$R{(g)}^{\otimes n}\mathrm{Av}({v}_{1}\otimes \mathrm{\cdots}\otimes {v}_{n})=R{(g)}^{\otimes n}\frac{1}{n!}\sum _{\sigma \in {S}_{n}}{v}_{\sigma (1)}\otimes \mathrm{\cdots}\otimes {v}_{\sigma (n)}=\frac{1}{n!}\sum _{\sigma \in {S}_{n}}R(g){v}_{\sigma (1)}\otimes \mathrm{\cdots}\otimes R(g){v}_{\sigma (n)}.$$
These are identical, so we see that the averaging map is a morphism of representations.
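This identity can also be checked numerically for a specific case, say $n=3$ and $V=\mathbf{C}^2$: act by $g\otimes g\otimes g$ on a random tensor and compare averaging before and after. The helpers `average` and `act` and the sample values of $a$ and $b$ are ours.

```python
import itertools

import numpy as np

def average(T):
    """The averaging map Av on an order-n tensor, as an array operation."""
    perms = list(itertools.permutations(range(T.ndim)))
    return sum(np.transpose(T, p) for p in perms) / len(perms)

def act(g, T):
    """Apply R(g)^{(x)3} = g (x) g (x) g: multiply g into each factor."""
    return np.einsum('ia,jb,kc,abc->ijk', g, g, g, T)

a, b = 0.6 + 0j, 0.8j  # entries of a sample element of SU(2)
g = np.array([[a, b], [-np.conj(b), np.conj(a)]])

rng = np.random.default_rng(0)
T = rng.standard_normal((2, 2, 2))  # a random tensor in (C^2)^{(x)3}

# Av commutes with the action of g: it is a morphism of representations.
assert np.allclose(average(act(g, T)), act(g, average(T)))
```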
We will now prove a separate lemma telling us that the image of a morphism of representations is a subrepresentation, which will complete the proof.
If $L:V\to W$ is a morphism of representations from $R:G\to GL(V)$ to $S:G\to GL(W)$ then its image $\{L(v)\in W:v\in V\}$ is a subrepresentation of $W$ .
Take $L(v)$ in the image of $L$ . Apply $S(g)$ to it. We have $S(g)L(v)=L(R(g)v)$ because $L$ is a morphism, so $S(g)L(v)$ is in the image of $L$ . This tells us that the image of $L$ is a subrepresentation.
In the case of the symmetric power this is telling us that ${\mathrm{Sym}}^{n}(V)$ is a subrepresentation of ${V}^{\otimes n}$ .
Take $V={\mathbf{C}}^{2}$ , the standard representation of $SU(2)$ . Consider ${\mathrm{Sym}}^{3}({\mathbf{C}}^{2})$ . By averaging the basis elements of ${V}^{\otimes 3}$ we end up with a basis for ${\mathrm{Sym}}^{3}({\mathbf{C}}^{2})$ : ${e}_{1}\otimes {e}_{1}\otimes {e}_{1}$ , $\frac{1}{3}({e}_{1}\otimes {e}_{1}\otimes {e}_{2}+{e}_{1}\otimes {e}_{2}\otimes {e}_{1}+{e}_{2}\otimes {e}_{1}\otimes {e}_{1})$ , $\frac{1}{3}({e}_{1}\otimes {e}_{2}\otimes {e}_{2}+{e}_{2}\otimes {e}_{1}\otimes {e}_{2}+{e}_{2}\otimes {e}_{2}\otimes {e}_{1})$ , ${e}_{2}\otimes {e}_{2}\otimes {e}_{2}$ .
This is a 4-dimensional representation, and it will turn out to be irreducible. You can label the elements here with polynomials: $e_1^3$, $e_1^2e_2$, $e_1e_2^2$, $e_2^3$. Given a homogeneous monomial $M$ in $e_1$ and $e_2$ of degree $n$, there's a unique way to write down a symmetric tensor whose monomials reduce to $M$ when you remove the tensor symbols. You can therefore think of ${\mathrm{Sym}}^n(V)$ as the space of homogeneous polynomials of degree $n$ in the basis elements of $V$.
It's an exercise to see that ${\mathrm{Sym}}^n(\mathbf{C}^2)$ is $(n+1)$-dimensional. These will turn out to be our irreducible representations of $SU(2)$.
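As a hint for that exercise, you can verify the dimension count numerically: write the averaging map on $(\mathbf{C}^2)^{\otimes n}$ as a $2^n\times 2^n$ matrix (a projector) and compute its rank. The encoding below, with basis tensors indexed by bit strings and a helper named `symmetriser_matrix`, is ours.

```python
import itertools
import math

import numpy as np

def symmetriser_matrix(n):
    """Matrix of the averaging map on (C^2)^{(x)n}, in the basis of
    elementary tensors e_{i1} (x) ... (x) e_{in} indexed by bit strings."""
    dim = 2 ** n
    P = np.zeros((dim, dim))
    for sigma in itertools.permutations(range(n)):
        for idx in itertools.product((0, 1), repeat=n):
            src = int(''.join(map(str, idx)), 2)
            permuted = tuple(idx[s] for s in sigma)
            dst = int(''.join(map(str, permuted)), 2)
            P[dst, src] += 1
    return P / math.factorial(n)

# Sym^n(C^2) is the image of the averaging map, so its dimension is the
# rank of this projector: n + 1.
for n in range(1, 6):
    assert np.linalg.matrix_rank(symmetriser_matrix(n)) == n + 1
```

Since the matrix is a projector, its rank also equals its trace, which gives another quick way to see the answer.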
Preclass exercise
Let $R:SU(2)\to GL(2,\mathbf{C})$ and $S:SU(2)\to GL(2,\mathbf{C})$ be two copies of the standard representation. Figure out the full 4-by-4 matrix $(R\otimes S)\begin{pmatrix}a & b \\ -\overline{b} & \overline{a}\end{pmatrix}$.