In the previous video, we decomposed $\operatorname{Sym}^2(\operatorname{Sym}^2(\mathbb{C}^2))$ as $\operatorname{Sym}^4(\mathbb{C}^2) \oplus \mathbb{C}$. If $\operatorname{Sym}^2(\mathbb{C}^2)$ has a basis $x^2, xy, y^2$ then $\operatorname{Sym}^2(\operatorname{Sym}^2(\mathbb{C}^2))$ has a basis consisting of quadratic monomials like $(x^2)^2$, $(x^2)(xy)$ etc, and the trivial summand is spanned by $(xy)^2 - (x^2)(y^2)$.
We also mentioned that $\operatorname{Sym}^2(\mathbb{C}^2)$ is "basically the same as" the space of quadratic forms in $x$ and $y$, so that $\operatorname{Sym}^2(\operatorname{Sym}^2(\mathbb{C}^2))$ is "basically the same as" the space of quadratic expressions in the coefficients of a quadratic form. We know that there is a distinguished quadratic expression in the coefficients of a quadratic form $ax^2 + bxy + cy^2$ which vanishes if and only if the form has a repeated root, namely the discriminant $b^2 - 4ac$, which looks an awful lot like $(xy)^2 - (x^2)(y^2)$: naively setting $a = x^2$, $b = xy$, $c = y^2$ would turn that element into $b^2 - ac$. The reason why the factor of 4 is missing is that we have not been sufficiently careful about the isomorphism we used to identify quadratic forms with $\operatorname{Sym}^2(\mathbb{C}^2)$. In this video, we will be more careful, and will make a comment about how to generalise this to higher degree polynomials or polynomials in more variables: what are the invariant quantities in those cases?
Quadratic forms as a representation of SL(2,C)
A quadratic form $q(x,y) = ax^2 + bxy + cy^2$ can be written as follows: $$q(x,y) = \begin{pmatrix} x & y \end{pmatrix}\begin{pmatrix} a & b/2 \\ b/2 & c \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix},$$ so we can identify quadratic forms in $x$ and $y$ with 2-by-2 symmetric matrices like the one in the middle of this expression. We will allow the group $SL(2,\mathbb{C})$ to act by coordinate changes on $(x,y)$-space in the usual way: the vector $\begin{pmatrix} x \\ y \end{pmatrix}$ goes to $M\begin{pmatrix} x \\ y \end{pmatrix}$, and so the quadratic form turns into $$\begin{pmatrix} x & y \end{pmatrix}M^T\begin{pmatrix} a & b/2 \\ b/2 & c \end{pmatrix}M\begin{pmatrix} x \\ y \end{pmatrix}.$$ More explicitly, if $M = \begin{pmatrix} \alpha & \beta \\ \gamma & \delta \end{pmatrix}$ then the new matrix for the quadratic form is $$\begin{pmatrix} \alpha^2 a + \alpha\gamma b + \gamma^2 c & \alpha\beta a + \tfrac{1}{2}(\alpha\delta + \beta\gamma)b + \gamma\delta c \\ \alpha\beta a + \tfrac{1}{2}(\alpha\delta + \beta\gamma)b + \gamma\delta c & \beta^2 a + \beta\delta b + \delta^2 c \end{pmatrix}.$$ So if we just keep track of the coefficients of the quadratic form, this transformation is $$\begin{pmatrix} a \\ b \\ c \end{pmatrix} \mapsto \begin{pmatrix} \alpha^2 & \alpha\gamma & \gamma^2 \\ 2\alpha\beta & \alpha\delta + \beta\gamma & 2\gamma\delta \\ \beta^2 & \beta\delta & \delta^2 \end{pmatrix}\begin{pmatrix} a \\ b \\ c \end{pmatrix}.$$ So we can think of the space of quadratic forms as the 3-dimensional space $\mathbb{C}^3$ of coefficient vectors $(a,b,c)$, and the formula above tells us how to think of that space as a representation of $SL(2,\mathbb{C})$, except that it is not really a representation: it is an "antirepresentation". You can see this by looking at what happens if we act on the quadratic form first using $M$ and then $N$. This gives the substitution $\begin{pmatrix} x \\ y \end{pmatrix} \mapsto MN\begin{pmatrix} x \\ y \end{pmatrix}$, and the quadratic form becomes $$\begin{pmatrix} x & y \end{pmatrix}N^TM^T\begin{pmatrix} a & b/2 \\ b/2 & c \end{pmatrix}MN\begin{pmatrix} x \\ y \end{pmatrix}.$$ You can see that the matrix $M$ hits the matrix of the quadratic form first, followed by $N$! This is back to front: if we write $R(M)$ for the 3-by-3 matrix above, then $R$ is an antirepresentation in the sense that $R(MN) = R(N)R(M)$. We can fix this by taking the transpose: $\tilde{R}(M) = R(M)^T$ satisfies $\tilde{R}(MN) = \tilde{R}(M)\tilde{R}(N)$. It is this transposed representation to which we will apply the theory we've developed.
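This order-reversal is easy to check by machine. The following sketch (using numpy; the helper name `R` is mine, and its entries encode the assumed transformation law $A \mapsto M^TAM$ of the quadratic form's matrix) verifies on random matrices that $R$ reverses products while its transpose respects them:

```python
# Sanity check that the coefficient matrix R(M) is an antirepresentation,
# and that transposing it restores the usual order of multiplication.
import numpy as np

def R(M):
    # 3x3 matrix acting on the coefficient vector (a, b, c) when the
    # quadratic form's matrix A transforms as A -> M^T A M.
    (al, be), (ga, de) = M
    return np.array([
        [al**2,   al*ga,         ga**2],
        [2*al*be, al*de + be*ga, 2*ga*de],
        [be**2,   be*de,         de**2],
    ])

rng = np.random.default_rng(1)
M, N = rng.standard_normal((2, 2, 2))

assert np.allclose(R(M @ N), R(N) @ R(M))        # antirepresentation
assert np.allclose(R(M @ N).T, R(M).T @ R(N).T)  # transpose is a representation
```

Note the check works for arbitrary invertible $M$, $N$, not just determinant-1 ones: the antihomomorphism property is a formal consequence of $A \mapsto M^TAM$.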
If you want to think about what this trick of transposing the matrices means, it is essentially that instead of thinking of $a$, $b$ and $c$ as entries of a vector, we're thinking of them as linear functions on the space of quadratic forms which return the $x^2$-, $xy$- and $y^2$-coefficients respectively.
Weights and action of the Lie algebra
We now figure out the weights of this representation and how $X = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ acts. Let's take $M = \begin{pmatrix} \lambda & 0 \\ 0 & \lambda^{-1} \end{pmatrix}$. Substituting this into our 3-by-3 matrix we get $\begin{pmatrix} \lambda^2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & \lambda^{-2} \end{pmatrix}$. So $a$ has weight $2$, $b$ has weight $0$ and $c$ has weight $-2$.
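Here is a small sympy check of that diagonal computation. The helper `Rt` is my name for the transposed 3-by-3 matrix $\tilde{R}(M)$, written out under the assumed convention $A \mapsto M^TAM$:

```python
# Symbolic check that the diagonal torus acts on (a, b, c)
# with weights 2, 0, -2.
import sympy as sp

lam = sp.symbols('lambda')

def Rt(M):
    # Transposed representation R(M)^T on the coefficients (a, b, c).
    al, be, ga, de = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    return sp.Matrix([
        [al**2, 2*al*be,       be**2],
        [al*ga, al*de + be*ga, be*de],
        [ga**2, 2*ga*de,       de**2],
    ])

M = sp.Matrix([[lam, 0], [0, 1/lam]])
# Diagonal entries lambda^2, 1, 1/lambda^2 give the weights 2, 0, -2.
assert Rt(M) == sp.diag(lam**2, 1, lam**(-2))
```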
Now let's take $M = \exp(tX) = \begin{pmatrix} 1 & t \\ 0 & 1 \end{pmatrix}$. We get $\tilde{R}(M) = \begin{pmatrix} 1 & 2t & t^2 \\ 0 & 1 & t \\ 0 & 0 & 1 \end{pmatrix}$. To find the action of the Lie algebra element $X$, we differentiate with respect to $t$ and set $t = 0$: $$\rho(X) = \left.\frac{d}{dt}\right|_{t=0}\tilde{R}(\exp(tX)) = \begin{pmatrix} 0 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}.$$ Note that this is not the same as just substituting $\alpha = 0$, $\beta = 1$, $\gamma = 0$, $\delta = 0$: the formula for $\tilde{R}(M)$ relies on $M$ being in the Lie group $SL(2,\mathbb{C})$, not in the Lie algebra.

Therefore the action of $X$ is $X \cdot a = 0$, $X \cdot b = 2a$, $X \cdot c = b$.
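The differentiation step can also be delegated to sympy. This sketch inlines the transposed matrix (same assumed entries as before) for $M = \exp(tX)$ and differentiates at $t = 0$:

```python
# Differentiate Rt(exp(tX)) at t = 0 to get the action of X on (a, b, c).
import sympy as sp

t = sp.symbols('t')
# exp(tX) for X = [[0, 1], [0, 0]] is [[1, t], [0, 1]]:
al, be, ga, de = 1, t, 0, 1
Rt = sp.Matrix([
    [al**2, 2*al*be,       be**2],
    [al*ga, al*de + be*ga, be*de],
    [ga**2, 2*ga*de,       de**2],
])
rho_X = Rt.diff(t).subs(t, 0)
# Reading off columns: X.a = 0, X.b = 2a, X.c = b.
assert rho_X == sp.Matrix([[0, 2, 0], [0, 0, 1], [0, 0, 0]])
```

Note that substituting the entries of $X$ itself into the matrix formula would instead give the single entry $\beta^2 = 1$ in the top-right corner, which is the wrong answer: the formula only computes the group action.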
We know that our 3-dimensional representation ($\tilde{R}$ of $SL(2,\mathbb{C})$ on quadratic forms) is isomorphic to $\operatorname{Sym}^2(\mathbb{C}^2)$ because it has weights $2, 0, -2$. However, if we look at the action of $X$ on the basis $a, b, c$ versus the action of $X$ on $x^2, xy, y^2$, we get something a bit different. Namely, we get $X \cdot x^2 = 0$, $X \cdot (xy) = x^2$, and $X \cdot y^2 = 2xy$. So although these representations are isomorphic, the identification $a = x^2$, $b = xy$, $c = y^2$ is not an isomorphism of representations! It turns out we need to take $a = x^2$, $b = 2xy$, $c = y^2$.
For example, with this identification we get $X \cdot b = X \cdot (2xy) = 2x^2 = 2a$ and $X \cdot c = X \cdot y^2 = 2xy = b$, matching the action of $X$ on the coefficients computed above. Now if we look at the invariant element $(xy)^2 - (x^2)(y^2)$, this becomes $\left(\tfrac{b}{2}\right)^2 - ac = \tfrac{1}{4}\left(b^2 - 4ac\right)$: the discriminant, up to an overall factor of $\tfrac{1}{4}$.
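As a final consistency check on the quadratic story, sympy can verify directly that the discriminant transforms by $(\det M)^2$ under the coefficient transformation written out earlier, and hence is genuinely $SL(2,\mathbb{C})$-invariant (the coefficient formulas below encode the assumed convention $A \mapsto M^TAM$):

```python
# Check that b^2 - 4ac picks up exactly (det M)^2 under M.
import sympy as sp

al, be, ga, de, a, b, c = sp.symbols('alpha beta gamma delta a b c')

# New coefficients, read off from A -> M^T A M:
a1 = al**2*a + al*ga*b + ga**2*c
b1 = 2*al*be*a + (al*de + be*ga)*b + 2*ga*de*c
c1 = be**2*a + be*de*b + de**2*c

disc = lambda a, b, c: b**2 - 4*a*c
det_M = al*de - be*ga
# On SL(2,C) we have det M = 1, so the discriminant is invariant.
assert sp.expand(disc(a1, b1, c1) - det_M**2 * disc(a, b, c)) == 0
```

This is just the coordinate version of the observation that $b^2 - 4ac = -4\det\begin{pmatrix} a & b/2 \\ b/2 & c \end{pmatrix}$ and $\det(M^TAM) = (\det M)^2\det A$.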
This is the beginning of a subject called invariant theory. You could do the same trick with cubic forms in 2 variables, looking for a trivial 1-dimensional subrepresentation of $\operatorname{Sym}^d(\operatorname{Sym}^3(\mathbb{C}^2))$ for some $d$ which vanishes if and only if the cubic form has a repeated root. It turns out that this lives in $\operatorname{Sym}^4(\operatorname{Sym}^3(\mathbb{C}^2))$. Using our methods, you can check that this representation contains a trivial 1-dimensional subrepresentation, and find an element $\Delta$ in the zero weight-space with $X \cdot \Delta = 0$. However, if you do it the way we did in the previous video, you need to be careful to use the right identification of $\operatorname{Sym}^3(\mathbb{C}^2)$ with cubic forms (which turns out to be $x^3 = a$, $x^2y = \tfrac{b}{3}$, $xy^2 = \tfrac{c}{3}$, $y^3 = d$ for the cubic form $ax^3 + bx^2y + cxy^2 + dy^3$).
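You can see the invariance concretely without finding $\Delta$ by weight-space methods: the sketch below takes the standard discriminant of the binary cubic $ax^3 + bx^2y + cxy^2 + dy^3$ (a degree-4 expression in the coefficients, consistent with it living in the degree-4 piece) and checks with sympy that it transforms by $(\det M)^6$ under a coordinate change, hence is $SL(2,\mathbb{C})$-invariant:

```python
# Check the cubic discriminant transforms by (det M)^6 under substitution.
import sympy as sp

x, y, al, be, ga, de = sp.symbols('x y alpha beta gamma delta')
a, b, c, d = sp.symbols('a b c d')

cubic = a*x**3 + b*x**2*y + c*x*y**2 + d*y**3
disc = lambda a, b, c, d: (18*a*b*c*d - 4*b**3*d + b**2*c**2
                           - 4*a*c**3 - 27*a**2*d**2)

# Substitute a general linear change of coordinates and read off
# the new coefficients of x^(3-k) y^k.
new = sp.expand(cubic.subs({x: al*x + be*y, y: ga*x + de*y},
                           simultaneous=True))
new_coeffs = [new.coeff(x, 3 - k).coeff(y, k) for k in range(4)]
det_M = al*de - be*ga
assert sp.expand(disc(*new_coeffs) - det_M**6 * disc(a, b, c, d)) == 0
```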
You can also do this for forms in more variables, using groups like $SL(n,\mathbb{C})$.