Lie groups and Lie algebras: Fortnight 1 Questions
Grading
Remember, you'll need to do at least 4, 4, or 5 questions to get a C, B, or A respectively. For a B or an A, this will need to include 2 (respectively 3) $\alpha $ questions.
β questions
Answer as many as you want.
1. Exp and eigenvectors
(Watch the videos about matrix exponentiation and its properties first)

Suppose that $M$ is a matrix and $v$ is an eigenvector of $M$ with eigenvalue $\lambda $ . Show that $v$ is an eigenvector of $\mathrm{exp}(M)$ with eigenvalue ${e}^{\lambda}$ .
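This claim is easy to sanity-check numerically. The following Python sketch (my own illustration, not part of the sheet; the matrix exponential is a truncated power series and the example matrix is an arbitrary choice) uses $M=\begin{pmatrix}2 & 1\\ 0 & 3\end{pmatrix}$, for which $v=(1,1)$ is an eigenvector with eigenvalue $3$, and checks that $\mathrm{exp}(M)v\approx e^{3}v$:

```python
import math

def mat_mul(A, B):
    # Product of two square matrices given as lists of rows.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=30):
    # Truncated power series I + A + A^2/2! + ... for the matrix exponential.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]  # T = A^k / k!
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

M = [[2.0, 1.0],
     [0.0, 3.0]]
v = [1.0, 1.0]          # M v = (3, 3), so v has eigenvalue 3
E = expm(M)
Ev = [sum(E[i][j] * v[j] for j in range(2)) for i in range(2)]
# Ev should equal e^3 * v componentwise.
```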

For a vector $v=(x,y,z)$ , define ${K}_{v}:=\begin{pmatrix}0 & -z & y\\ z & 0 & -x\\ -y & x & 0\end{pmatrix}$ . Find the $0$ -eigenspace of ${K}_{v}$ and deduce that $v$ is fixed by $\mathrm{exp}\left(\theta {K}_{v}\right)$ for any $\theta \in \mathbf{R}$ .
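Both parts can be checked on a concrete vector. This Python sketch (my own check, assuming the antisymmetric cross-product form of $K_v$; the vector $v=(1,2,3)$ is an arbitrary choice) verifies that $K_v v=0$ and that $\mathrm{exp}(\theta K_v)$ fixes $v$:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=30):
    # Truncated power series for the matrix exponential.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

def K(v):
    # Cross-product matrix: K(v) w = v x w.
    x, y, z = v
    return [[0.0, -z, y],
            [z, 0.0, -x],
            [-y, x, 0.0]]

v = [1.0, 2.0, 3.0]
Kv = K(v)
Kv_v = [sum(Kv[i][j] * v[j] for j in range(3)) for i in range(3)]  # v x v = 0
R = expm([[0.7 * x for x in row] for row in Kv])                   # exp(0.7 K_v)
Rv = [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]     # should be v
```

Since $K_v v=0$, every term of the exponential series beyond the identity kills $v$, which is why the fixed-vector claim holds.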
2. Exp and rotations
Depends on the previous question

Given vectors $v,w\in {\mathbf{R}}^{3}$ , show that ${K}_{v}w=v\times w$ .

Given a unit vector $u$ , pick vectors $v$ and $w$ such that $u,v,w$ is a right-handed orthonormal basis of ${\mathbf{R}}^{3}$ (so $u\times v=w$ and $u\times w=-v$ ). Calculate $\mathrm{exp}(\theta {K}_{u})v$ and $\mathrm{exp}(\theta {K}_{u})w$ and deduce that $\mathrm{exp}(\theta {K}_{u})$ is a rotation. What are the axis and angle of rotation?
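You can rehearse the calculation numerically for one convenient basis. The sketch below (my own check, not part of the question) takes $u=e_3$, $v=e_1$, $w=e_2$ and confirms that $\mathrm{exp}(\theta K_u)$ sends $v\mapsto\cos\theta\,v+\sin\theta\,w$ and $w\mapsto-\sin\theta\,v+\cos\theta\,w$:

```python
import math

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=30):
    # Truncated power series for the matrix exponential.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

theta = 0.7
u = (0.0, 0.0, 1.0)
Ku = [[0.0, -u[2], u[1]],
      [u[2], 0.0, -u[0]],
      [-u[1], u[0], 0.0]]
R = expm([[theta * x for x in row] for row in Ku])
v = [1.0, 0.0, 0.0]
w = [0.0, 1.0, 0.0]
Rv = [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]
Rw = [sum(R[i][j] * w[j] for j in range(3)) for i in range(3)]
# Expect Rv = cos(theta) v + sin(theta) w and Rw = -sin(theta) v + cos(theta) w.
```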
3. Jacobi identity
(Technically no prerequisites, but the video about abstract Lie algebras is related.)

Prove the Jacobi identity $$[X,[Y,Z]]+[Y,[Z,X]]+[Z,[X,Y]]=0$$ for matrices $X,Y,Z$ , where $[A,B]:=AB-BA$ .

Given a matrix $X$ , define ${\mathrm{ad}}_{X}:\mathfrak{gl}(n,\mathbf{R})\to \mathfrak{gl}(n,\mathbf{R})$ by ${\mathrm{ad}}_{X}(Y):=[X,Y]$ . Prove that $[{\mathrm{ad}}_{X},{\mathrm{ad}}_{Y}]={\mathrm{ad}}_{[X,Y]}$ . (Hint: evaluate both sides on some $Z\in \mathfrak{gl}(n,\mathbf{R})$ and use the Jacobi identity.)
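Both identities hold entry-by-entry, so they can be confirmed in exact integer arithmetic on sample matrices before you prove them in general. A minimal Python check (my own, with arbitrarily chosen $X,Y,Z$):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def bracket(A, B):
    # Commutator [A, B] = AB - BA.
    return mat_sub(mat_mul(A, B), mat_mul(B, A))

X = [[0, 1], [0, 0]]
Y = [[0, 0], [1, 0]]
Z = [[2, 1], [3, -2]]

# Jacobi identity: the cyclic sum should vanish exactly.
jacobi = mat_add(bracket(X, bracket(Y, Z)),
                 mat_add(bracket(Y, bracket(Z, X)),
                         bracket(Z, bracket(X, Y))))

# [ad_X, ad_Y](Z) = ad_X(ad_Y(Z)) - ad_Y(ad_X(Z)) versus ad_{[X,Y]}(Z).
lhs = mat_sub(bracket(X, bracket(Y, Z)), bracket(Y, bracket(X, Z)))
rhs = bracket(bracket(X, Y), Z)
```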
4. Unitary group
(Watch the video about orthogonal matrices first)
The unitary group $U(n)$ is the group of complex $n$ -by- $n$ matrices $M$ such that ${M}^{\dagger}M=I$ . Here, ${M}^{\dagger}$ is the conjugate-transpose of $M$ , for example $${\begin{pmatrix}a & b\\ c & d\end{pmatrix}}^{\dagger}=\begin{pmatrix}\overline{a} & \overline{c}\\ \overline{b} & \overline{d}\end{pmatrix}.$$ Show that the Lie algebra $\mathfrak{u}(n)$ of $U(n)$ is the space of $n$ -by- $n$ skew-Hermitian matrices $\{M:{M}^{\dagger}=-M\}$ .
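One direction of this statement (exponentials of skew-Hermitian matrices are unitary) can be seen numerically. The Python sketch below (my own check; the sample matrix $A$ is an arbitrary skew-Hermitian choice) computes $U=\mathrm{exp}(A)$ by a truncated series and verifies ${U}^{\dagger}U\approx I$:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=60):
    # Truncated exponential series; works for complex entries too.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

A = [[1j, 2 + 1j],
     [-2 + 1j, -3j]]        # skew-Hermitian: conjugate-transpose equals -A
U = expm(A)
Udag = [[U[j][i].conjugate() for j in range(2)] for i in range(2)]
P = mat_mul(Udag, U)        # should be the identity, i.e. U is unitary
```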
5. Example: surjective exp
(You need to remember Jordan normal form and watch the first video about matrix exponentiation).
We will show that $\mathrm{exp}:\mathfrak{gl}(2,\mathbf{C})\to GL(2,\mathbf{C})$ is surjective.

Calculate $\mathrm{exp}\begin{pmatrix}a & b\\ 0 & a\end{pmatrix}$ , and hence find a logarithm for the matrix $\begin{pmatrix}\lambda & 1\\ 0 & \lambda \end{pmatrix}$ for any nonzero $\lambda \in \mathbf{C}$ .

If $M\in GL(2,\mathbf{C})$ , let $N={P}^{-1}MP$ be its Jordan normal form. Prove that $N=\mathrm{exp}(X)$ for some $X\in \mathfrak{gl}(2,\mathbf{C})$ and deduce that $M=\mathrm{exp}(Y)$ for some $Y\in \mathfrak{gl}(2,\mathbf{C})$ .
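Once you have computed $\mathrm{exp}\begin{pmatrix}a & b\\ 0 & a\end{pmatrix}={e}^{a}\begin{pmatrix}1 & b\\ 0 & 1\end{pmatrix}$, the candidate logarithm can be tested numerically. This Python sketch (my own check, with $\lambda=2.5$ as an arbitrary sample) verifies that exponentiating the guessed logarithm recovers the Jordan block:

```python
import math

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=30):
    # Truncated power series for the matrix exponential.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

lam = 2.5
# Candidate logarithm of [[lam, 1], [0, lam]]: since
# exp([[a, b], [0, a]]) = e^a [[1, b], [0, 1]], take a = log(lam), b = 1/lam.
Xlog = [[math.log(lam), 1.0 / lam],
        [0.0, math.log(lam)]]
E = expm(Xlog)   # should be [[2.5, 1.0], [0.0, 2.5]]
```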
α questions
Answer as many as you want. You will need to do well on at least 2 to get a B and at least 3 to get an A.
6. BCH formula
(Watch the video about the BCH formula first. WARNING: This question gets messy before it gets better.)
Show that the third-order term in the Baker-Campbell-Hausdorff formula is $$\frac{1}{12}[X,[X,Y]]-\frac{1}{12}[Y,[X,Y]].$$
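Before wading into the algebra, it can help to see the term numerically. The sketch below (my own Python check, not part of the question; matrix arithmetic is hand-rolled and exp/log are truncated series, with arbitrarily chosen small $X,Y$) compares $\mathrm{log}(\mathrm{exp}(X)\mathrm{exp}(Y))$ against the BCH expansion with and without the third-order term; including it shrinks the discrepancy from third order to fourth order in the size of $X,Y$:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_scale(c, A):
    return [[c * x for x in row] for row in A]

def bracket(A, B):
    AB, BA = mat_mul(A, B), mat_mul(B, A)
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(AB, BA)]

def expm(A, terms=30):
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

def logm(A, terms=60):
    # Series log(I+E) = E - E^2/2 + E^3/3 - ..., valid since E is small here.
    n = len(A)
    E = [[A[i][j] - (1.0 if i == j else 0.0) for j in range(n)] for i in range(n)]
    S = [[0.0] * n for _ in range(n)]
    T = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(1, terms):
        T = mat_mul(T, E)
        sign = 1.0 if k % 2 == 1 else -1.0
        S = [[s + sign * t / k for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

t = 0.05
X = [[0.0, t], [0.0, 0.0]]
Y = [[0.0, 0.0], [t, 0.0]]
Z = logm(mat_mul(expm(X), expm(Y)))        # the "true" log(e^X e^Y)

B = bracket(X, Y)
bch2 = mat_add(mat_add(X, Y), mat_scale(0.5, B))
bch3 = mat_add(bch2, mat_add(mat_scale(1 / 12, bracket(X, B)),
                             mat_scale(-1 / 12, bracket(Y, B))))

def dist(A, C):
    return max(abs(A[i][j] - C[i][j]) for i in range(2) for j in range(2))

err2 = dist(Z, bch2)   # roughly the size of the third-order term
err3 = dist(Z, bch3)   # fourth-order leftovers, much smaller
```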
7. det(exp) = exp(tr), part 1
(Technically, this has no prerequisites, but the video about $SL(2,\mathbf{C})$ is related)
Let $H\in \mathfrak{gl}(n,\mathbf{R})$ be a matrix.

Show that $det(I+tH)=1+t\mathrm{Tr}(H)+\mathcal{O}({t}^{2})$ where $\mathrm{Tr}(H)={H}_{11}+{H}_{22}+\cdots +{H}_{nn}$ is the trace of $H$ . Hint: Write it out in full, i.e. $$det(I+tH)=det\begin{pmatrix}1+t{H}_{11} & t{H}_{12} & \cdots & t{H}_{1n}\\ t{H}_{21} & 1+t{H}_{22} & & \vdots \\ \vdots & & \ddots & \vdots \\ t{H}_{n1} & t{H}_{n2} & \cdots & 1+t{H}_{nn}\end{pmatrix}$$

Deduce that if $M$ is invertible then $det(M+tH)=det(M)+tdet(M)\mathrm{Tr}({M}^{-1}H)+\mathcal{O}({t}^{2})$ .

If $A(t)$ is a path of matrices, use the previous parts to show that $$\frac{d}{dt}det(A(t))=det(A(t))\,\mathrm{Tr}\left(A{(t)}^{-1}\frac{dA}{dt}(t)\right)$$ (Hint: Use the Taylor expansion of $A(t)$ to find the Taylor expansion of $det(A(t))$ and drop terms of higher order.)
Note: By a "path of matrices" I basically mean a matrix $A(t)$ whose entries depend on $t$ . You may assume that this dependence is sufficiently differentiable to run your argument.
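The first-order expansion in the first part is easy to see numerically: the error in $det(I+tH)\approx 1+t\mathrm{Tr}(H)$ should shrink quadratically in $t$. A small Python check (my own, with a hand-coded 3-by-3 determinant; the matrix $H$ is an arbitrary choice):

```python
H = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 10.0]]
trH = H[0][0] + H[1][1] + H[2][2]      # 16

def det3(A):
    # Cofactor expansion along the first row.
    return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
            - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
            + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

def err(t):
    # | det(I + tH) - (1 + t Tr H) |, which should be O(t^2).
    A = [[(1.0 if i == j else 0.0) + t * H[i][j] for j in range(3)]
         for i in range(3)]
    return abs(det3(A) - (1.0 + t * trH))

err_big, err_small = err(1e-2), err(1e-3)
# Shrinking t by a factor of 10 should shrink the error by roughly 100.
```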
8. det(exp) = exp(tr), part 2
(Depends on the previous question, the properties of the exponential map, and for the final part the definition of the Lie algebra of a matrix group)

Using the formula for $\frac{d}{dt}det(A(t))$ from the previous question, show that if $\varphi (t):=det(\mathrm{exp}(tH))$ then $\varphi $ satisfies the differential equation $$\frac{d\varphi}{dt}=\varphi (t)\mathrm{Tr}(H).$$

Assuming that differential equations have unique solutions (with specified initial conditions), prove that $det(\mathrm{exp}(tH))=\mathrm{exp}(t\mathrm{Tr}(H))$ .

Using this formula, show that the Lie algebra $\mathfrak{sl}(n,\mathbf{R})$ of $SL(n,\mathbf{R})=\{M\in GL(n,\mathbf{R}):det(M)=1\}$ is the space of matrices with trace zero.
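The identity you are proving is easy to test on examples. This Python sketch (my own check; the sample matrices are arbitrary) verifies $det(\mathrm{exp}(tH))=\mathrm{exp}(t\mathrm{Tr}(H))$ for one $H$, and that the exponential of a trace-free matrix has determinant 1, consistent with the description of $\mathfrak{sl}(n,\mathbf{R})$:

```python
import math

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=60):
    # Truncated power series for the matrix exponential.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

H = [[1.0, 2.0],
     [3.0, -4.0]]          # trace -3
t = 0.3
lhs = det2(expm([[t * x for x in row] for row in H]))
rhs = math.exp(t * (H[0][0] + H[1][1]))

H0 = [[1.0, 2.0],
      [3.0, -1.0]]         # trace-free, so exp(H0) should have determinant 1
d0 = det2(expm(H0))
```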
9. Example: nonsurjective exp
(You need to remember Jordan normal form. Recall that $\mathfrak{sl}(2,\mathbf{R})$ is the space of 2-by-2 trace-free matrices and $SL(2,\mathbf{R})$ is the group of 2-by-2 matrices with determinant 1. I also highly recommend trying the previous question first).
We will show that $\mathrm{exp}:\mathfrak{sl}(2,\mathbf{R})\to SL(2,\mathbf{R})$ is not surjective.

Given $B\in \mathfrak{sl}(2,\mathbf{R})$ , show that its Jordan normal form (when considered as a complex matrix) is one of $$\begin{pmatrix}\lambda & 0\\ 0 & -\lambda \end{pmatrix},\ \lambda \in \mathbf{R}\text{ or }i\mathbf{R},\qquad \begin{pmatrix}0 & 1\\ 0 & 0\end{pmatrix}.$$

Deduce that one of the following must be true about $\mathrm{exp}(B)$ :

both its eigenvalues are positive;

both its eigenvalues are unit complex numbers;

both its eigenvalues are equal to 1.


By exhibiting a matrix in $SL(2,\mathbf{R})$ whose eigenvalues satisfy none of these conditions, deduce that $\mathrm{exp}:\mathfrak{sl}(2,\mathbf{R})\to SL(2,\mathbf{R})$ is not surjective.
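The eigenvalue trichotomy above can be illustrated numerically, one sample matrix per Jordan type (my own Python check; the three matrices are arbitrary representatives, and the quadratic-formula eigenvalue routine is hand-rolled):

```python
import cmath

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=40):
    # Truncated power series for the matrix exponential.
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in S]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

def eig2(A):
    # Eigenvalues of a 2x2 matrix from the characteristic polynomial.
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    d = cmath.sqrt(tr * tr - 4.0 * det)
    return (tr + d) / 2.0, (tr - d) / 2.0

# One trace-free B from each Jordan type:
cases = {
    "real":      [[1.0, 0.0], [0.0, -1.0]],   # lambda real
    "imaginary": [[0.0, 1.0], [-1.0, 0.0]],   # lambda imaginary
    "nilpotent": [[0.0, 1.0], [0.0, 0.0]],
}
eigs = {name: eig2(expm(B)) for name, B in cases.items()}
# "real": positive eigenvalues; "imaginary": unit complex eigenvalues;
# "nilpotent": both eigenvalues equal to 1.
```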
10. Simultaneous strict upper-triangularisability
(Just uses good ol' fashioned linear algebra. It will become relevant if you choose to do a project about Lie's theorem or Engel's theorem on solvable (respectively nilpotent) Lie algebras.)
Recall that a linear map $N$ is called nilpotent if ${N}^{k}=0$ for some $k$ . In this question, you may use the fact that, if $N$ is a nilpotent linear map, then there is some basis with respect to which its matrix is strictly upper-triangular.
Let $V$ be a vector space and suppose that $M:V\to V$ and $N:V\to V$ are linear maps which are both nilpotent and which commute with one another: $MN=NM$ .

Show that if $v\in {M}^{j}V$ then $Nv\in {M}^{j}V$ . We will write ${N}_{j}:{M}^{j}V\to {M}^{j}V$ for the restriction $N{|}_{{M}^{j}V}$ .

Suppose we have picked a basis of ${M}^{j}V$ such that the matrix of ${N}_{j}$ is strictly upper-triangular. Show that we can extend this to get a basis of ${M}^{j-1}V$ for which the matrix of ${N}_{j-1}$ is strictly upper-triangular. (Hint: Pick a complement $C$ for ${M}^{j}V\subset {M}^{j-1}V$ and consider the map $C\to C$ induced by $N$ ).

Use this to prove by induction that we can find a basis of $V$ making both $M$ and $N$ simultaneously strictly upper-triangular.
11. Simultaneous upper-triangularisability
(Uses Q.10).
Let $V$ be a complex vector space and $X:V\to V$ be a linear map. Recall that the generalised $\lambda$ -eigenspace of a linear map $X:V\to V$ is the subspace $${V}_{\lambda}:=\{v\in V:{(X-\lambda I)}^{k}v=0\text{ for some }k\}.$$ In this question, you may assume that $V={\oplus}_{\lambda}{V}_{\lambda}$ : that is, the generalised eigenspaces span $V$ and ${V}_{\lambda}\cap {V}_{\mu}=0$ unless $\lambda =\mu $ .
Now let $X:V\to V$ and $Y:V\to V$ be linear maps and suppose they commute with one another.

Show that $Y$ preserves the generalised eigenspaces of $X$ . Let ${Y}_{\lambda}:{V}_{\lambda}\to {V}_{\lambda}$ be the restriction $Y{|}_{{V}_{\lambda}}$ .

Let ${V}_{\lambda ,\mu}$ be the generalised $\mu$ -eigenspace of ${Y}_{\lambda}$ . Show that it is preserved by $X$ .

Deduce that both $X$ and $Y$ are block-diagonal with respect to the splitting $V={\oplus}_{\lambda ,\mu}{V}_{\lambda ,\mu}$ .

Show that we can pick a basis for ${V}_{\lambda ,\mu}$ such that the corresponding blocks of $X$ and $Y$ are upper-triangular. (Hint: Consider $X-\lambda I$ and $Y-\mu I$ and use the previous question about strict upper-triangularisability.)
12. Technicality about local coordinates
(Depends on an optional video about invertibility of $\mathrm{exp}:\mathfrak{g}\to G$ )
By mimicking the proof that $\mathrm{exp}$ is locally invertible, prove the following claim (which was used in this optional video to prove that $\mathrm{exp}:\mathfrak{g}\to G$ is locally invertible for a matrix group $G$ ):
If $\mathfrak{g}\subset \mathfrak{gl}(n,\mathbf{R})$ is a Lie subalgebra and $W\subset \mathfrak{gl}(n,\mathbf{R})$ is a vector space complement for $\mathfrak{g}$ , show that the map $F:\mathfrak{g}\oplus W\to GL(n,\mathbf{R})$ defined by $F(v,w)=\mathrm{exp}(v)\mathrm{exp}(w)$ is locally invertible.