This is why we require that exp t X is in O(n) for all t: we want to be able to take t small enough to put ourselves near the identity to take logs.

# Example: O(n)

## The orthogonal group

### Definition

Let O(n) denote the group of n-by-n orthogonal matrices. Recall that a matrix M is called orthogonal if M transpose M equals the identity. Geometrically, this means that M preserves dot products. This is because v dot w equals v transpose w, so Mv dot Mw equals v transpose M transpose M w, which equals v transpose w for all v and w if and only if M transpose M equals the identity. Preserving dot products means you preserve lengths of and angles between vectors, so orthogonal matrices represent things like rotations and reflections.
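As a quick numerical sanity check of these definitions (a sketch using numpy; the rotation matrix R and the vectors v, w below are just illustrative choices):

```python
import numpy as np

# A rotation by 0.7 radians: an example of an orthogonal matrix.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# M transpose M equals the identity...
assert np.allclose(R.T @ R, np.eye(2))

# ...and dot products are preserved: Rv dot Rw equals v dot w.
v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
assert np.isclose((R @ v) @ (R @ w), v @ w)
```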

### Lie algebra

By definition, the Lie algebra little o n of O(n) is the set of matrices X in little gl n R such that exp t X is in O(n) for all t in R. This means (exp(t X)) transpose times exp(t X) equals the identity. We can take the transpose term-by-term in the exponential power series, and, since (M N) all transpose equals N transpose M transpose, we get (M to the k) transpose equals (M transpose) to the k, so (exp(t X)) transpose equals exp of t (X transpose).
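The fact that transpose passes through the exponential can be spot-checked numerically (a sketch assuming scipy is available; the matrix X and the value of t are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm  # the matrix exponential

X = np.array([[0.0,  2.0, -1.0],
              [1.0,  0.0,  3.0],
              [4.0, -2.0,  0.0]])  # an arbitrary matrix
t = 0.3

# (exp(t X)) transpose equals exp of t (X transpose).
assert np.allclose(expm(t * X).T, expm(t * X.T))
```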

Therefore exp of t (X transpose) equals the inverse of exp t X, and we've seen that the inverse of exp t X equals exp of minus t X, so we deduce that X is in little o n if and only if exp of t (X transpose) equals exp of minus t X for all t in R.

Certainly if X transpose equals minus X then exp of t (X transpose) equals exp of minus t X, so X is in little o n. Therefore any *antisymmetric matrix* is in the Lie algebra of the orthogonal group. It will turn out that this is everything. To see this, note that by taking t to be sufficiently small we can assume that t X transpose and minus t X lie in our favourite neighbourhood of the zero matrix, so that exp of t (X transpose) and exp of minus t X lie in our favourite neighbourhood of the identity. Our favourite neighbourhood of the identity is the one on which the local logarithm is defined, so we can take the logarithm of each side and deduce that t X transpose equals minus t X for all sufficiently small t, hence X transpose equals minus X.

We have now shown that:

little o n is the set of antisymmetric matrices, that is the matrices X such that X transpose equals minus X.
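We can test this characterisation numerically (a sketch assuming numpy and scipy; the random seed and the values of t are arbitrary): exponentiating an antisymmetric matrix should always produce an orthogonal matrix.

```python
import numpy as np
from scipy.linalg import expm

# Build a random antisymmetric matrix: X = A - A^T satisfies X^T = -X.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = A - A.T
assert np.allclose(X.T, -X)

# Its exponential is orthogonal for every t; check a few values.
for t in [0.1, 1.0, -2.5]:
    M = expm(t * X)
    assert np.allclose(M.T @ M, np.eye(4))
```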

### Example

We saw that exp of the 2-by-2 matrix 0, minus theta, theta, 0 equals the matrix cos theta, minus sin theta, sin theta, cos theta. Here, the matrix 0, minus theta, theta, 0 is antisymmetric and its exponential is a rotation matrix (hence orthogonal).
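This example is easy to verify numerically (a sketch assuming scipy; the angle 1.2 is an arbitrary choice):

```python
import numpy as np
from scipy.linalg import expm

theta = 1.2
X = np.array([[0.0,  -theta],
              [theta, 0.0]])                     # antisymmetric
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by theta

# exp of the antisymmetric matrix X is the rotation matrix R.
assert np.allclose(expm(X), R)
```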

### Additional remarks

Another, nicer, way to deduce that exp of t (X transpose) equals exp of minus t X for all t implies X transpose equals minus X, is to differentiate the equation exp of t (X transpose) times exp t X equals the identity with respect to t. The right-hand side is constant, so its derivative is zero. The left-hand side can be evaluated using the product rule, and we get: 0 equals d by dt of exp of t (X transpose) times exp t X, which equals X transpose exp of t (X transpose) times exp t X plus exp of t (X transpose) times X exp t X, for all t. If we now set t = 0 we get 0 equals X transpose plus X, so X transpose equals minus X.
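The derivative computation can be checked with a finite-difference approximation (a sketch assuming scipy; X is a generic, deliberately non-antisymmetric, matrix): the derivative of exp of t (X transpose) times exp t X at t = 0 should be X transpose plus X.

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # a generic (not antisymmetric) matrix
h = 1e-6

def g(t):
    # The product exp of t (X transpose) times exp t X.
    return expm(t * X).T @ expm(t * X)

# Central-difference approximation to g'(0); should be X^T + X.
deriv = (g(h) - g(-h)) / (2 * h)
assert np.allclose(deriv, X.T + X, atol=1e-4)
```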

Note that O(n) is topologically closed. To see this, consider the map F from little gl n R to little gl n R defined by F of M equals M transpose M. We have O(n) equals the preimage of the identity under F by definition. Note that F is continuous because the matrix entries of M transpose M are polynomials in the matrix entries of M (and polynomials are continuous). Therefore if M k is a sequence of orthogonal matrices converging to a matrix M then F of M k equals the identity for all k, so the limit as k goes to infinity of F of M k equals the identity; since F is continuous, F of M equals this limit, which equals the identity, so M is orthogonal.
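The argument can be illustrated numerically (a sketch using numpy; the convergent sequence of rotations below is an arbitrary example): every term of the sequence satisfies F of M k equals the identity, and so does the limit.

```python
import numpy as np

def F(M):
    # The map F of M equals M transpose M; O(n) is the preimage of I.
    return M.T @ M

def rot(a):
    # Rotation by angle a: an orthogonal matrix.
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

# A sequence of orthogonal matrices converging to the rotation by 1.0.
Ms = [rot(1.0 + 1.0 / k) for k in range(1, 200)]
limit = rot(1.0)

assert all(np.allclose(F(M), np.eye(2)) for M in Ms)  # F(M_k) = I for all k
assert np.allclose(F(limit), np.eye(2))               # the limit is orthogonal
```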

Whenever your group is cut out by a continuous equation like this it will be a topologically closed group of matrices.

## Pre-class exercise

Show that if X and Y are in little o n then so is X bracket Y (the commutator X Y minus Y X).
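A numerical spot-check of the exercise, not a proof (a sketch using numpy; the random seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A, B = rng.standard_normal((2, 3, 3))
X, Y = A - A.T, B - B.T          # two antisymmetric matrices
bracket = X @ Y - Y @ X          # X bracket Y means XY - YX

# The bracket of two antisymmetric matrices is antisymmetric again.
assert np.allclose(bracket.T, -bracket)
```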