
# 10. Dot product, 2

### Transposition

You may have noticed that the definition of the dot product looks a lot like matrix multiplication. In fact, it is a special case of matrix multiplication: $${v}_{1}{w}_{1}+\mathrm{\cdots}+{v}_{n}{w}_{n}=\left(\begin{array}{ccc}\hfill {v}_{1}\hfill & \hfill \mathrm{\cdots}\hfill & \hfill {v}_{n}\hfill \end{array}\right)\left(\begin{array}{c}\hfill {w}_{1}\hfill \\ \hfill \mathrm{\vdots}\hfill \\ \hfill {w}_{n}\hfill \end{array}\right).$$ Technically, the matrix product gives a 1-by-1 matrix whose unique entry is the dot product, but let's not be too pedantic.
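To make this concrete, here is a small pure-Python sketch (no libraries assumed; `dot` and `matmul` are illustrative helpers, not from the text) computing the dot product both directly and as a 1-by-$n$ times $n$-by-1 matrix product:

```python
def dot(v, w):
    # Direct definition: v1*w1 + ... + vn*wn
    return sum(vi * wi for vi, wi in zip(v, w))

def matmul(A, B):
    # Multiply an m-by-n matrix A by an n-by-p matrix B,
    # both given as lists of rows.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

v = [1, 2, 3]
w = [4, 5, 6]

row = [v]                  # v laid on its side: a 1-by-3 row vector
col = [[wi] for wi in w]   # w as a 3-by-1 column vector

# The matrix product is a 1-by-1 matrix whose unique entry is the dot product.
assert matmul(row, col) == [[dot(v, w)]]   # both give 32
```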

Here, we took the column vector $v=\left(\begin{array}{c}\hfill {v}_{1}\hfill \\ \hfill \mathrm{\vdots}\hfill \\ \hfill {v}_{n}\hfill \end{array}\right)$ and turned it on its side to get a row vector, which we call the *transpose* of $v$, written: $${v}^{T}=\left(\begin{array}{ccc}\hfill {v}_{1}\hfill & \hfill \mathrm{\cdots}\hfill & \hfill {v}_{n}\hfill \end{array}\right).$$

More generally, you can transpose any matrix: given an $m$-by-$n$ matrix $M$ with entries ${M}_{ij}$, its transpose ${M}^{T}$ is the $n$-by-$m$ matrix whose $ij$th entry is ${M}_{ji}$, i.e. $${({M}^{T})}_{ij}={M}_{ji}.$$ For example:

${\left(\begin{array}{cc}\hfill 1\hfill & \hfill 2\hfill \\ \hfill 3\hfill & \hfill 4\hfill \end{array}\right)}^{T}=\left(\begin{array}{cc}\hfill 1\hfill & \hfill 3\hfill \\ \hfill 2\hfill & \hfill 4\hfill \end{array}\right),$

${\left(\begin{array}{ccc}\hfill 1\hfill & \hfill 2\hfill & \hfill 3\hfill \\ \hfill 4\hfill & \hfill 5\hfill & \hfill 6\hfill \end{array}\right)}^{T}=\left(\begin{array}{cc}\hfill 1\hfill & \hfill 4\hfill \\ \hfill 2\hfill & \hfill 5\hfill \\ \hfill 3\hfill & \hfill 6\hfill \end{array}\right).$

So the rows of $M$ become the columns of ${M}^{T}$, and the columns of $M$ become its rows.
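In code, transposing is just a matter of swapping the two indices. A minimal pure-Python sketch (the `transpose` helper is illustrative, not from the text):

```python
def transpose(M):
    # (M^T)[i][j] = M[j][i]: rows become columns.
    return [[M[j][i] for j in range(len(M))] for i in range(len(M[0]))]

# The two examples above:
assert transpose([[1, 2], [3, 4]]) == [[1, 3], [2, 4]]
assert transpose([[1, 2, 3], [4, 5, 6]]) == [[1, 4], [2, 5], [3, 6]]
```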

With all this in place, we observe that the dot product $v\cdot w$ is ${v}^{T}w$ .

An important property of transposition is that it reverses the order of matrix products: $${(AB)}^{T}={B}^{T}{A}^{T}.$$

Writing out the $ij$th entry of ${(AB)}^{T}$ in index notation, we get: $${\left({(AB)}^{T}\right)}_{ij}={(AB)}_{ji}=\sum _{k}{A}_{jk}{B}_{ki}.$$ Similarly, expanding ${B}^{T}{A}^{T}$, we get $${({B}^{T}{A}^{T})}_{ij}=\sum _{k}{({B}^{T})}_{ik}{({A}^{T})}_{kj}=\sum _{k}{B}_{ki}{A}_{jk}.$$ The two expressions differ only in the order of the factors ${A}_{jk}$ and ${B}_{ki}$.

The order of these factors doesn't matter: ${A}_{jk}$ and ${B}_{ki}$ are just numbers (entries of $A$ and $B$ ), so they commute. This is one reason index notation is so convenient: it converts expressions involving noncommuting objects like matrices into expressions involving commuting quantities (numbers).
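As a sanity check, we can verify $(AB)^{T}={B}^{T}{A}^{T}$ on a small example in pure Python (the `matmul` and `transpose` helpers are illustrative, defined straight from the index formulas above):

```python
def matmul(A, B):
    # (AB)[i][j] = sum over k of A[i][k] * B[k][j]
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(M):
    # (M^T)[i][j] = M[j][i]
    return [[M[j][i] for j in range(len(M))] for i in range(len(M[0]))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# (AB)^T and B^T A^T agree entry by entry...
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
# ...while A^T B^T generally does NOT equal (AB)^T: the order matters.
assert transpose(matmul(A, B)) != matmul(transpose(A), transpose(B))
```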