# 10. Dot product, 2


### Transposition

You may have noticed that the definition of the dot product looks a lot like matrix multiplication. In fact, it is a special case of matrix multiplication: $v_{1}w_{1}+\cdots+v_{n}w_{n}=\begin{pmatrix}v_{1}&\cdots&v_{n}\end{pmatrix}\begin{pmatrix}w_{1}\\ \vdots\\ w_{n}\end{pmatrix}.$ Technically, the matrix product gives a 1-by-1 matrix whose unique entry is the dot product, but let's not be too pedantic.

Here, we took the column vector $v=\begin{pmatrix}v_{1}\\ \vdots\\ v_{n}\end{pmatrix}$ and turned it on its side to get a row vector which we call the transpose of $v$ , written: $v^{T}=\begin{pmatrix}v_{1}&\cdots&v_{n}\end{pmatrix}$ .
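As a quick numerical check, here is a small NumPy sketch (the vectors are hypothetical, not from the text) showing that the 1-by-$n$ times $n$-by-1 matrix product has the dot product as its unique entry:

```python
import numpy as np

# Hypothetical example vectors, written as columns (shape (3, 1)).
v = np.array([[1.0], [2.0], [3.0]])
w = np.array([[4.0], [5.0], [6.0]])

# v.T is the row vector (the transpose); v.T @ w is a 1-by-1 matrix.
product = v.T @ w                    # shape (1, 1)
dot = np.dot(v.ravel(), w.ravel())   # the ordinary dot product, a scalar

print(product[0, 0], dot)  # both equal 1*4 + 2*5 + 3*6 = 32
```

The 1-by-1 matrix and the scalar agree, as the formula above predicts.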

More generally, you can transpose a matrix:

Definition:

Given an $m$ -by-$n$ matrix $M$ with entries $M_{ij}$ , its transpose is the $n$ -by-$m$ matrix $M^{T}$ whose $ij$ th entry is $M_{ji}$ , i.e. $(M^{T})_{ij}=M_{ji}$ . For example:

$\begin{pmatrix}1&2\\ 3&4\end{pmatrix}^{T}=\begin{pmatrix}1&3\\ 2&4\end{pmatrix}.$

$\begin{pmatrix}1&2&3\\ 4&5&6\end{pmatrix}^{T}=\begin{pmatrix}1&4\\ 2&5\\ 3&6\end{pmatrix}$ .

So the rows of $M$ become the columns of $M^{T}$ .
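In NumPy the transpose is the `.T` attribute; this sketch (using the 2-by-3 matrix from the second example above) checks the entrywise rule $(M^{T})_{ij}=M_{ji}$:

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2-by-3, as in the second example above

# .T swaps rows and columns: (M.T)[i, j] == M[j, i]
print(M.T)                  # 3-by-2: rows of M become columns of M.T
assert all(M.T[i, j] == M[j, i] for i in range(3) for j in range(2))
```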

With all this in place, we observe that the dot product $v\cdot w$ equals the matrix product $v^{T}w$ .

Lemma:

$(AB)^{T}=B^{T}A^{T}$ .

Proof: Writing out the $ij$ th entry of $(AB)^{T}$ in index notation, we get $(AB)^{T}_{ij}=(AB)_{ji}=\sum_{k}A_{jk}B_{ki}.$ Similarly expanding $B^{T}A^{T}$ , we get $(B^{T}A^{T})_{ij}=\sum_{k}(B^{T})_{ik}(A^{T})_{kj}=\sum_{k}B_{ki}A_{jk}.$ The two expressions differ only in the order of the factors $A_{jk}$ and $B_{ki}$ .

The order of these factors doesn't matter: $A_{jk}$ and $B_{ki}$ are just numbers (entries of $A$ and $B$ ), so they commute. This is one reason index notation is so convenient: it converts expressions involving noncommuting objects like matrices into expressions involving commuting quantities (numbers).
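The lemma is easy to verify numerically. This sketch (with arbitrary randomly generated integer matrices, not taken from the text) compares $(AB)^{T}$ and $B^{T}A^{T}$ entry by entry:

```python
import numpy as np

# Arbitrary 2-by-3 and 3-by-4 integer matrices for illustration.
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(3, 4))

# (AB)^T and B^T A^T agree entrywise, matching the index computation above.
assert np.array_equal((A @ B).T, B.T @ A.T)
print((A @ B).T.shape)  # (4, 2): transposing swaps the 2-by-4 product's shape
```

Note that the shapes only match in this order: $B^{T}A^{T}$ is $4$-by-$2$, while $A^{T}B^{T}$ is not even defined here.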