Matrix Transpose Is Its Inverse

When the rows (or columns) of a matrix are unit vectors that are orthogonal to each other (an orthonormal basis), the inverse of the matrix is identical to its transpose. This is simple to prove using the following matrix:

| ux  vx  nx  0 |
| uy  vy  ny  0 |
| uz  vz  nz  0 |
|  0   0   0  1 |
Eq1

Let’s assume the vectors <ux,uy,uz>, <vx,vy,vz>, and <nx,ny,nz> are unit vectors that are orthogonal (at right angles) to each other. The dot product of two vectors equals the product of their lengths and the cosine of the angle between them. For two orthogonal unit vectors the result is 0.0 because cos(90°) is 0.0. In addition, the dot product of a unit vector with itself is always 1.0 because cos(0°) is 1.0. To summarize:

dotProduct(<u>, <v>) === 0.0
dotProduct(<v>, <n>) === 0.0
dotProduct(<n>, <u>) === 0.0

dotProduct(<u>, <u>) === 1.0
dotProduct(<v>, <v>) === 1.0
dotProduct(<n>, <n>) === 1.0
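These six properties can be checked directly in code. A minimal sketch (the `dotProduct` helper and the example basis vectors below are assumptions, not from the original text):

```javascript
// Dot product of two 3-component vectors.
function dotProduct(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// An example orthonormal basis: the standard axes rotated 90 degrees about Z.
const u = [0, 1, 0];
const v = [-1, 0, 0];
const n = [0, 0, 1];

console.log(dotProduct(u, v)); // 0 (orthogonal)
console.log(dotProduct(v, n)); // 0 (orthogonal)
console.log(dotProduct(u, u)); // 1 (unit length)
```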

Notice what happens when you multiply the matrix in Eq1 by its transpose:

| ux  vx  nx  0 |   | ux  uy  uz  0 |
| uy  vy  ny  0 | * | vx  vy  vz  0 |
| uz  vz  nz  0 |   | nx  ny  nz  0 |
|  0   0   0  1 |   |  0   0   0  1 |
Eq2

Each term of the product matrix is the dot product of two of these vectors, and we already know the result of every one of those dot products because the vectors are orthonormal. Therefore, we have

| ux  vx  nx  0 |   | ux  uy  uz  0 |   | 1  0  0  0 |
| uy  vy  ny  0 | * | vx  vy  vz  0 | = | 0  1  0  0 |
| uz  vz  nz  0 |   | nx  ny  nz  0 |   | 0  0  1  0 |
|  0   0   0  1 |   |  0   0   0  1 |   | 0  0  0  1 |
Eq3

which demonstrates that the transpose of our original matrix is its inverse, since the result of their multiplication is the identity matrix. Note that the order of multiplication does not matter, as in the equation:
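The multiplication in Eq3 can be sketched in code. This is a minimal example, not from the original text; the helper names and the concrete basis values are assumptions. The matrix stores orthonormal vectors u, v, n in its columns, matching the layout of Eq1.

```javascript
// Transpose a matrix stored as an array of rows.
function transpose(m) {
  return m[0].map((_, col) => m.map(row => row[col]));
}

// Multiply two matrices stored as arrays of rows.
function multiply(a, b) {
  return a.map(row =>
    b[0].map((_, col) => row.reduce((sum, val, k) => sum + val * b[k][col], 0))
  );
}

// A 4x4 matrix whose columns are the orthonormal vectors
// u = <0,1,0>, v = <-1,0,0>, n = <0,0,1> (an assumed example basis).
const m = [
  [0, -1, 0, 0],
  [1,  0, 0, 0],
  [0,  0, 1, 0],
  [0,  0, 0, 1]
];

console.log(multiply(m, transpose(m))); // the 4x4 identity matrix
```

Each entry of the product is a dot product of two basis vectors, which is why the result collapses to the identity matrix.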

| ux  uy  uz  0 |   | ux  vx  nx  0 |   | 1  0  0  0 |
| vx  vy  vz  0 | * | uy  vy  ny  0 | = | 0  1  0  0 |
| nx  ny  nz  0 |   | uz  vz  nz  0 |   | 0  0  1  0 |
|  0   0   0  1 |   |  0   0   0  1 |   | 0  0  0  1 |
Eq4
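Because the transpose is the inverse, a transformation by an orthonormal matrix can be undone without a general (and much more expensive) matrix inversion. A minimal sketch, with assumed helper names and an assumed example rotation:

```javascript
// Transpose a matrix stored as an array of rows.
function transpose(m) {
  return m[0].map((_, col) => m.map(row => row[col]));
}

// Transform a point [x, y, z, w] by a 4x4 matrix stored as rows.
function transformPoint(m, p) {
  return m.map(row => row.reduce((sum, val, k) => sum + val * p[k], 0));
}

// A 90-degree rotation about the Z axis; its columns are orthonormal.
const rotate = [
  [0, -1, 0, 0],
  [1,  0, 0, 0],
  [0,  0, 1, 0],
  [0,  0, 0, 1]
];

const p = [2, 3, 4, 1];
const rotated = transformPoint(rotate, p);                   // [-3, 2, 4, 1]
const restored = transformPoint(transpose(rotate), rotated); // [2, 3, 4, 1]
console.log(restored);
```

The transpose costs only a reshuffling of sixteen values, whereas a general 4x4 inversion requires computing cofactors and a determinant.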