Lie groups and Lie algebras: Fortnight 1 Questions
Grading
Remember, you'll need to do at least 4, 4, or 5 questions to get a C, B, or A respectively. For a B or an A, this will need to include 2 (respectively 3) α questions.
β questions
Answer as many as you want.
1. Exp and eigenvectors
(Watch the videos about matrix exponentiation and its properties first)
- Suppose that M is a matrix and v is an eigenvector of M with eigenvalue \lambda. Show that v is an eigenvector of \exp M with eigenvalue e^\lambda.
- For a vector v = (x, y, z), define K_v to be the 3-by-3 matrix \begin{pmatrix} 0 & -z & y \\ z & 0 & -x \\ -y & x & 0 \end{pmatrix}. Find the 0-eigenspace of K_v and deduce that v is fixed by \exp(\theta K_v) for any real number \theta.
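Optional: if you want a quick numerical sanity check before starting (this is only an illustration, not a proof, and assumes Python with numpy and scipy, neither of which is needed for the question), here is a sketch checking both parts on concrete matrices.

```python
# Numerical sanity check for question 1 (assumes numpy and scipy; illustration only).
import numpy as np
from scipy.linalg import expm

# Part (a): if Mv = lambda v, then exp(M)v = e^lambda v.
M = np.array([[2.0, 1.0], [0.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(M)
lam, v = eigvals[0], eigvecs[:, 0]
print(np.allclose(expm(M) @ v, np.exp(lam) * v))   # True

# Part (b): K_v v = 0, so exp(theta K_v) fixes v.
def K(v):
    x, y, z = v
    return np.array([[0, -z, y], [z, 0, -x], [-y, x, 0]])

v = np.array([1.0, 2.0, 3.0])
theta = 0.7
print(np.allclose(expm(theta * K(v)) @ v, v))      # True
```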
2. Exp and rotations
(Depends on the previous question.)
- Given vectors v, w \in \mathbb{R}^3, show that K_v w = v \times w.
- Given a unit vector u, pick vectors v and w such that (u, v, w) is a right-handed orthonormal basis of \mathbb{R}^3 (so u \times v = w and u \times w = -v). Calculate (\exp(\theta K_u))v and (\exp(\theta K_u))w and deduce that \exp(\theta K_u) is a rotation. What are the axis and angle of rotation?
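Optional numerical illustration (again assuming numpy/scipy): the sketch below checks that \exp(\theta K_u) is orthogonal with determinant 1, fixes u, and has trace 1 + 2\cos\theta, which is consistent with a rotation by \theta about the axis u.

```python
# Check numerically that exp(theta K_u) behaves like a rotation about u (illustration only).
import numpy as np
from scipy.linalg import expm

def K(v):
    x, y, z = v
    return np.array([[0, -z, y], [z, 0, -x], [-y, x, 0]])

u = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)            # a unit vector
theta = 0.4
R = expm(theta * K(u))

print(np.allclose(R.T @ R, np.eye(3)))                # orthogonal
print(np.isclose(np.linalg.det(R), 1.0))              # determinant 1
print(np.allclose(R @ u, u))                          # the axis u is fixed
print(np.isclose(np.trace(R), 1 + 2 * np.cos(theta))) # trace of a rotation by theta
```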
3. Jacobi identity
(Technically no prerequisites, but the video about abstract Lie algebras is related.)
- Prove the Jacobi identity [X, [Y, Z]] + [Y, [Z, X]] + [Z, [X, Y]] = 0 for matrices X, Y, Z, where [M, N] = MN - NM.
- Given a matrix X, define \mathrm{ad}_X\colon \mathfrak{gl}(n,\mathbb{R})\to \mathfrak{gl}(n,\mathbb{R}) by \mathrm{ad}_X(Y) = [X, Y]. Prove that [\mathrm{ad}_X, \mathrm{ad}_Y] = \mathrm{ad}_{[X,Y]}. (Hint: evaluate both sides on some Z \in \mathfrak{gl}(n,\mathbb{R}) and use the Jacobi identity.)
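Optional: a random-matrix check of both identities (a sketch assuming numpy; it of course does not replace the algebraic proof).

```python
# Random-matrix check of the Jacobi identity and of [ad_X, ad_Y] = ad_{[X,Y]} (illustration only).
import numpy as np

def bracket(A, B):
    return A @ B - B @ A

rng = np.random.default_rng(0)
X, Y, Z = (rng.standard_normal((3, 3)) for _ in range(3))

jacobi = bracket(X, bracket(Y, Z)) + bracket(Y, bracket(Z, X)) + bracket(Z, bracket(X, Y))
print(np.allclose(jacobi, 0))  # True

lhs = bracket(X, bracket(Y, Z)) - bracket(Y, bracket(X, Z))   # ([ad_X, ad_Y])(Z)
rhs = bracket(bracket(X, Y), Z)                               # ad_{[X,Y]}(Z)
print(np.allclose(lhs, rhs))  # True
```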
4. Unitary group
(Watch the video about orthogonal matrices first)
The unitary group U(n) is the group of complex n-by-n matrices M such that M^\dagger M = I. Here, M^\dagger is the conjugate-transpose of M; for example, \begin{pmatrix} a & b \\ c & d \end{pmatrix}^\dagger = \begin{pmatrix} \bar{a} & \bar{c} \\ \bar{b} & \bar{d} \end{pmatrix}. Show that the Lie algebra \mathfrak{u}(n) of U(n) is the space of n-by-n skew-Hermitian matrices, i.e. matrices M such that M^\dagger = -M.
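Optional: one direction of this statement is easy to see numerically. The sketch below (assuming numpy/scipy) checks that the exponential of a skew-Hermitian matrix is unitary.

```python
# Check that exponentials of skew-Hermitian matrices are unitary (illustration only).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
M = A - A.conj().T                             # skew-Hermitian: M^dagger = -M
U = expm(M)
print(np.allclose(U.conj().T @ U, np.eye(3)))  # True: U is unitary
```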
5. Example: surjective exp
(You need to remember Jordan normal form and watch the first video about matrix exponentiation).
We will show that \exp\colon \mathfrak{gl}(2,\mathbb{C})\to GL(2,\mathbb{C}) is surjective.
- Calculate \exp\begin{pmatrix} a & b \\ 0 & a \end{pmatrix}, and hence find a logarithm for the matrix \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix} for any nonzero complex number \lambda.
- If M \in GL(2,\mathbb{C}), let N = P^{-1} M P be its Jordan normal form. Prove that N = \exp X for some X \in \mathfrak{gl}(2,\mathbb{C}) and deduce that M = \exp Y for some Y \in \mathfrak{gl}(2,\mathbb{C}).
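Optional: before finding a logarithm by hand, you can check numerically (assuming numpy/scipy) that such a Jordan block does admit one; scipy's matrix logarithm is used here purely as a black box.

```python
# Check that the Jordan block with nonzero eigenvalue has a matrix logarithm (illustration only).
import numpy as np
from scipy.linalg import expm, logm

lam = -2.0 + 1.5j                           # any nonzero complex number
J = np.array([[lam, 1.0], [0.0, lam]])
X = logm(J)                                 # a candidate logarithm produced numerically
print(np.allclose(expm(X), J))              # True
```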
α questions
Answer as many as you want. You will need to do well on at least 2 to get a B- and at least 3 to get an A-.
6. BCH formula
(Watch the video about the BCH formula first. WARNING: This question gets messy before it gets better.)
Show that the third-order term in the Baker-Campbell-Hausdorff formula is \frac{1}{12}[X, [X, Y]] - \frac{1}{12}[Y, [X, Y]].
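Optional plausibility check (assuming numpy/scipy): with the stated third-order term, the truncated series should agree with \log(\exp(tX)\exp(tY)) up to an error of order t^4, so halving t should divide the residual by roughly 16.

```python
# Check that the BCH series through third order leaves an O(t^4) residual (illustration only).
import numpy as np
from scipy.linalg import expm, logm

def bracket(A, B):
    return A @ B - B @ A

rng = np.random.default_rng(2)
X, Y = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

def residual(t):
    Z = logm(expm(t * X) @ expm(t * Y))
    series = (t * (X + Y)
              + t**2 / 2 * bracket(X, Y)
              + t**3 / 12 * (bracket(X, bracket(X, Y)) - bracket(Y, bracket(X, Y))))
    return np.linalg.norm(Z - series)

# Halving t should divide the residual by roughly 2^4 = 16.
print(residual(0.02) / residual(0.01))   # approximately 16
```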
7. det(exp) = exp(tr), part 1
(Technically, this has no prerequisites, but the video about SL(2,\mathbb{C}) is related)
Let H \in \mathfrak{gl}(n,\mathbb{R}) be a matrix.
- Show that \det(I + tH) = 1 + t\,\mathrm{tr}(H) + terms of order t^2 and higher, where \mathrm{tr}(H) = \sum_{i=1}^n H_{ii} is the trace of H. (Hint: Write it out in full, i.e. \det(I + tH) = \det\begin{pmatrix} 1 + tH_{11} & tH_{12} & \cdots & tH_{1n} \\ tH_{21} & 1 + tH_{22} & \cdots & tH_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ tH_{n1} & tH_{n2} & \cdots & 1 + tH_{nn} \end{pmatrix}.)
- Deduce that if M is invertible then \det(M + tH) = \det(M) + t\,\det(M)\,\mathrm{tr}(M^{-1}H) + terms of order t^2.
- If A(t) is a path of matrices, use the previous parts to show that \frac{d}{dt}\det(A(t)) = \det(A(t))\,\mathrm{tr}\left(A(t)^{-1}\frac{dA}{dt}\right). (Hint: Use the Taylor expansion of A(t) to find the Taylor expansion of \det(A(t)) and drop terms of higher order.)
Note: By a "path of matrices" I basically mean a matrix A(t) whose entries depend on t. You may assume that this dependence is sufficiently differentiable to run your argument.
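Optional: a finite-difference check of the formula in the last part (assuming numpy; the path A(t) below is an arbitrary choice).

```python
# Check d/dt det A(t) = det A(t) * tr(A(t)^{-1} A'(t)) by central differences (illustration only).
import numpy as np

def A(t):
    # an arbitrary smooth path of invertible matrices
    return np.array([[np.cos(t), t,         1.0],
                     [t**2,      np.exp(t), np.sin(t)],
                     [0.0,       t**3,      2.0 + t]])

t, h = 0.3, 1e-6
dA  = (A(t + h) - A(t - h)) / (2 * h)                               # numerical A'(t)
lhs = (np.linalg.det(A(t + h)) - np.linalg.det(A(t - h))) / (2 * h) # numerical d/dt det A(t)
rhs = np.linalg.det(A(t)) * np.trace(np.linalg.inv(A(t)) @ dA)
print(np.isclose(lhs, rhs, rtol=1e-5))  # True
```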
8. det(exp) = exp(tr), part 2
(Depends on the previous question, the properties of the exponential map, and for the final part the definition of the Lie algebra of a matrix group)
- Using the formula for \frac{d}{dt}\det(A(t)) from the previous question, show that if \phi(t) = \det(\exp(tH)) then \phi satisfies the differential equation \frac{d\phi}{dt} = \phi\,\mathrm{tr}(H).
- Assuming that differential equations have unique solutions (with specified initial conditions), prove that \det(\exp(tH)) = \exp(t\,\mathrm{tr}(H)).
- Using this formula, show that the Lie algebra \mathfrak{sl}(n,\mathbb{R}) of SL(n,\mathbb{R}) = \{M \in GL(n,\mathbb{R}) : \det M = 1\} is the space of matrices with trace zero.
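Optional: the identity in the second part is easy to test numerically on a random matrix (assuming numpy/scipy).

```python
# Check det(exp(tH)) = exp(t tr H) on a random matrix (illustration only).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
H = rng.standard_normal((4, 4))
t = 0.8
print(np.isclose(np.linalg.det(expm(t * H)), np.exp(t * np.trace(H))))  # True
```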
9. Example: non-surjective exp
(You need to remember Jordan normal form. Recall that \mathfrak{sl}(2,\mathbb{R}) is the space of 2-by-2 trace-free matrices and SL(2,\mathbb{R}) is the group of 2-by-2 matrices with determinant 1. I also highly recommend trying the previous question first).
We will show that \exp\colon \mathfrak{sl}(2,\mathbb{R})\to SL(2,\mathbb{R}) is not surjective.
- Given B \in \mathfrak{sl}(2,\mathbb{R}), show that its Jordan normal form (when considered as a complex matrix) is either \begin{pmatrix} \lambda & 0 \\ 0 & -\lambda \end{pmatrix} with \lambda a real or purely imaginary number, or else \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.
- Deduce that one of the following must be true about \exp B:
  - both its eigenvalues are positive;
  - both its eigenvalues are unit complex numbers;
  - both its eigenvalues are equal to 1.
- By exhibiting a matrix in SL(2,\mathbb{R}) whose eigenvalues satisfy none of these conditions, deduce that \exp\colon \mathfrak{sl}(2,\mathbb{R})\to SL(2,\mathbb{R}) is not surjective.
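Optional: if you want to experiment while hunting for the last part, here is a small helper (a sketch assuming numpy; the matrix M below is just a placeholder candidate, not the intended answer) that tests the three eigenvalue conditions for a given element of SL(2,\mathbb{R}).

```python
# Test the three eigenvalue conditions for a candidate M in SL(2,R) (illustration only).
import numpy as np

def eigenvalue_conditions(M):
    assert np.isclose(np.linalg.det(M), 1.0)      # M should lie in SL(2,R)
    e = np.linalg.eigvals(M)
    both_positive = np.allclose(e.imag, 0) and np.all(e.real > 0)
    both_unit     = np.allclose(np.abs(e), 1.0)
    both_one      = np.allclose(e, 1.0)
    return both_positive, both_unit, both_one

# Try your own candidates here; any M making all three False cannot be exp(B) for B in sl(2,R).
M = np.array([[2.0, 0.0], [0.0, 0.5]])
print(eigenvalue_conditions(M))   # (True, False, False) for this particular M
```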
10. Simultaneous strict upper-triangularisability
(Just uses good ol'-fashioned linear algebra. It will become relevant if you choose to do a project about Lie's theorem or Engel's theorem on solvable (respectively nilpotent) Lie algebras.)
Recall that a linear map N is called nilpotent if N^k=0 for some k. In this question, you may use the fact that, if N is a nilpotent linear map then there is some basis with respect to which its matrix is strictly upper-triangular.
Let V be a vector space and suppose that M\colon V\to V and N\colon V\to V are linear maps which are both nilpotent and which commute with one another: M N = N M.
- Show that if v\in M^jV then Nv\in M^jV. We will write N_j\colon M^jV\to M^jV for the restriction N|_{M^jV}.
- Suppose we have picked a basis of M^jV such that the matrix of N_j is strictly upper-triangular. Show that we can extend this to get a basis of M^{j-1}V for which the matrix of N_{j-1} is strictly upper-triangular. (Hint: Pick a complement C for M^{j}V\subset M^{j-1}V and consider the map C\to C induced by N).
- Use this to prove by induction that we can find a basis of V making both M and N simultaneously strictly upper-triangular.
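Optional numerical illustration of the first bullet (assuming numpy; the matrices M and N below are an arbitrary commuting nilpotent pair, with N chosen as a polynomial in M with zero constant term so that the two automatically commute).

```python
# Check that N maps M^j V into M^j V for commuting nilpotent M, N (illustration only).
import numpy as np

M = np.triu(np.ones((4, 4)), k=1)        # strictly upper-triangular, hence nilpotent
N = 2 * M @ M + M @ M @ M                # nilpotent and commutes with M

def contained_in_column_space(vectors, A):
    # span(vectors) lies in the column space of A iff appending them does not raise the rank
    return np.linalg.matrix_rank(np.hstack([A, vectors])) == np.linalg.matrix_rank(A)

for j in range(1, 4):
    Mj = np.linalg.matrix_power(M, j)
    print(contained_in_column_space(N @ Mj, Mj))   # True for every j
```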
11. Simultaneous upper triangularisability
(Uses Q.10).
Let V be a complex vector space and X\colon V\to V be a linear map. Recall that the generalised \lambda-eigenspace of a linear map X\colon V\to V is the subspace V_\lambda := \{v\in V\,:\,(X-\lambda I)^kv = 0\mbox{ for some }k\}. In this question, you may assume that V=\bigoplus_\lambda V_\lambda: that is, the generalised eigenspaces span V and V_\lambda \cap V_\mu = 0 unless \lambda=\mu.
Now let X\colon V\to V and Y\colon V\to V be linear maps and suppose they commute with one another.
- Show that Y preserves the generalised eigenspaces of X. Let Y_\lambda\colon V_\lambda\to V_\lambda be the restriction Y|_{V_\lambda}.
- Let V_{\lambda,\mu} be the \mu-generalised eigenspace of Y_\lambda. Show that it is preserved by X.
- Deduce that both X and Y are block-diagonal with respect to the splitting V=\bigoplus_{\lambda,\mu}V_{\lambda,\mu}.
- Show that we can pick a basis for V_{\lambda,\mu} such that the corresponding blocks of X and Y are upper-triangular. (Hint: Consider X-\lambda I and Y-\mu I and use the previous question about strict upper-triangularisability.)
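Optional numerical illustration of the first bullet (assuming numpy/scipy; the matrices X and Y below are an arbitrary commuting pair, built so that the eigenvalues of X are known exactly and Y = X^2 commutes with X).

```python
# Check that a map Y commuting with X preserves the generalised eigenspaces of X (illustration only).
import numpy as np
from scipy.linalg import null_space

J = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 4.0]])           # Jordan form: eigenvalues 1, 1 and 4
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])           # an invertible change of basis
X = P @ J @ np.linalg.inv(P)
Y = X @ X                                 # commutes with X

for lam in (1.0, 4.0):
    V_lam = null_space(np.linalg.matrix_power(X - lam * np.eye(3), 3))   # generalised eigenspace
    combined = np.hstack([V_lam, Y @ V_lam])
    print(np.linalg.matrix_rank(combined) == V_lam.shape[1])             # True: Y preserves V_lam
```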
12. Technicality about local coordinates
(Depends on an optional video about invertibility of \exp\colon \mathfrak{g}\to G)
By mimicking the proof that \exp is locally invertible, prove the following claim (which was used in this optional video to prove that \exp\colon \mathfrak{g}\to G is locally invertible for a matrix group G):
If \mathfrak{g}\subset \mathfrak{gl}(n,\mathbb{R}) is a Lie subalgebra and W\subset \mathfrak{gl}(n,\mathbb{R}) is a vector space complement for \mathfrak{g}, show that the map F\colon \mathfrak{g}\oplus W\to GL(n,\mathbb{R}) defined by F(v,w) = \exp(v)\exp(w) is locally invertible.
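Optional illustration of the claim (assuming numpy/scipy; the choice \mathfrak{g} = \mathfrak{so}(2) inside \mathfrak{gl}(2,\mathbb{R}), with W the symmetric matrices, is just one concrete example): the sketch below estimates the derivative of F at the origin by finite differences and checks that it is invertible.

```python
# Check numerically that dF at (0,0) is invertible for F(v,w) = exp(v)exp(w) (illustration only).
import numpy as np
from scipy.linalg import expm

# basis of g = so(2) inside gl(2,R), followed by a basis of the complement W (symmetric matrices)
basis = [np.array([[0.0, -1.0], [1.0, 0.0]]),    # spans g = so(2)
         np.array([[1.0, 0.0], [0.0, 0.0]]),     # these three span W
         np.array([[0.0, 1.0], [1.0, 0.0]]),
         np.array([[0.0, 0.0], [0.0, 1.0]])]

def F(c):
    # c[0] is the coordinate on g, c[1:] are coordinates on W; output exp(v)exp(w), flattened
    v = c[0] * basis[0]
    w = sum(ci * Bi for ci, Bi in zip(c[1:], basis[1:]))
    return (expm(v) @ expm(w)).flatten()

# central-difference Jacobian of F at the origin
h = 1e-6
jacobian = np.column_stack([(F(h * e) - F(-h * e)) / (2 * h) for e in np.eye(4)])
print(np.linalg.matrix_rank(jacobian) == 4)   # True: dF_0 is invertible, so F is locally invertible
```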