# Inverse Orthogonal Matrix

From what I understand (please correct me if I’m wrong): if I have an orthogonal 4x4 matrix that contains no scaling, only rotation and translation, I can obtain the inverse matrix by transposing the 9 rotation values and negating the translation values.

example:

11 12 13 x
21 22 23 y
31 32 33 z
0 0 0 1

becomes

11 21 31 -x
12 22 32 -y
13 23 33 -z
0 0 0 1

Is this true?

One more question:

If I multiply this inverse matrix by another matrix to get the relative transform between the two original matrices, do I have to take out the translation vector first and then add it back in at the end?

Thanks for any help.

Just wanted to clarify:

If the first mat is M1 and its inverse, the second mat, is M1^-1 and I want to multiply M1^-1 by another matrix M2 to get the relative transform between M1 and M2, will I need to remove the translation vector first and then account for it at the end?

Sorry if my first post seems confusing.

To the first part, almost (the translation needs one more step; see below).
To the 2nd part, no.

Consider what you’re doing here. You’re taking a matrix that is effectively (in Mv=v’ matrix order) TR. You want its inverse, (TR)^-1, which is (R^-1)(T^-1). For the rotation part (an orthonormal 3x3 matrix), inverse = transpose, so just transpose it. For the translate part, you build the inverse (negate the translation), and then run it through R^-1 to map it to the final translation that goes in the matrix. Just write out an example with 2 full 4x4 matrices for (R^-1 * T^-1), do a proper matrix product, and you’ll see what I mean.
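If it helps, here’s a quick sketch of that write-it-out exercise in Python (row-major 4x4 lists; the rotation angle and translation values are made up for illustration). It builds M = T*R, forms the inverse as R^-1 * T^-1 with inverse-rotation = transpose, and checks that the product with M is the identity:

```python
import math

def matmul(a, b):
    # 4x4 row-major matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

c, s = math.cos(0.7), math.sin(0.7)
R = [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]  # rotation about z
T = [[1, 0, 0, 2], [0, 1, 0, -3], [0, 0, 1, 5], [0, 0, 0, 1]]  # translation
M = matmul(T, R)                                # the transform, TR in Mv=v' order

Rinv = [[R[j][i] for j in range(4)] for i in range(4)]          # transpose = inverse
Tinv = [[1, 0, 0, -2], [0, 1, 0, 3], [0, 0, 1, -5], [0, 0, 0, 1]]  # negated translation
Minv = matmul(Rinv, Tinv)                       # (TR)^-1 = R^-1 * T^-1

# M * M^-1 should be the identity (up to floating-point error)
I = matmul(M, Minv)
```

Note that the final translation column of Minv ends up being -R^-1 applied to (x, y, z), not simply (-x, -y, -z).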

These rotation * translation matrices are often termed “rigid transformations”, which are a special case of “affine transformations”, so you can get some google hits by searching for those in combination with inverse. For instance:

Thanks a lot, very helpful!

Thanks for your help, just one more question to clear it up for me a bit better.
So to get my inverse matrix, I separate the rotation and translation into their own matrices, transpose the rotation matrix, negate the translation vector in the translation matrix and then multiply those two matrices to get my final inverse matrix?
Your help is much appreciated. Thanks again.

Yes, that would work.

However, if you write out the full matrix product symbolically in terms of the inputs (or think about it a bit), you’ll see there’s no need to do a full 4x4 matrix-matrix multiply here to compute the product matrix.

You can cut to the chase and just write the final product matrix in terms of the source matrix. Take the upper-left 3x3 of the source matrix and transpose it into the dest matrix’s upper-left 3x3. Then do a vector-matrix multiply to map the negated translation from the source matrix through that transposed upper-left 3x3, and store the result in the dest matrix’s translation elements. Init the remaining dest matrix elements (4th row) with the identity values (0,0,0,1). With this you’ve replaced a 4x4 matrix-matrix multiply with a single vector-matrix multiply, and there was no need to build any intermediate matrices.
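A sketch of that shortcut in Python (row-major 4x4 lists, assuming a pure rotation+translation matrix as discussed; the function name and test values are mine, not from any particular library):

```python
def fast_rigid_inverse(m):
    # m: 4x4 row-major rigid transform (rotation + translation, no scale).
    # Transpose the upper-left 3x3, map the negated translation through
    # that transpose, and fill the last row with identity values.
    r = [[m[j][i] for j in range(3)] for i in range(3)]          # transposed 3x3
    t = [-sum(r[i][k] * m[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [t[0]],
            r[1] + [t[1]],
            r[2] + [t[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Quick check: a 90-degree rotation about z plus a translation (made-up values).
M = [[0.0, -1.0, 0.0, 4.0],
     [1.0,  0.0, 0.0, 1.0],
     [0.0,  0.0, 1.0, -2.0],
     [0.0,  0.0, 0.0, 1.0]]
Minv = fast_rigid_inverse(M)
p = [3.0, 7.0, 9.0, 1.0]
q = [sum(M[i][k] * p[k] for k in range(4)) for i in range(4)]        # forward
back = [sum(Minv[i][k] * q[k] for k in range(4)) for i in range(4)]  # and back
```

Mapping a point forward through M and then through the result should return the original point.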

Just write out the matrix product of R^-1 * T^-1 and this becomes obvious.

For more details or code, do a quick google on fast inverse orthonormal matrix and you’ll get some good hits. For instance:

Also note that even though your subject said “Orthogonal Matrix”, in this thread we are only talking about how to do a fast invert of an “Orthonormal Matrix” (i.e. rotate*translate only, no scales; i.e. the transform’s basis vectors are unit length). This is a subset of orthogonal matrices.

If your matrix is merely orthogonal not orthonormal (meaning you might have scales mixed in there with the rotate and translate, and thus your transform basis vectors aren’t unit length) – i.e. it’s not a rigid transform – you need to tweak this a bit to account for the scales. David Eberly’s Geometric Tools code (linked above) covers this.
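For reference, here’s a hedged sketch of that scaled variant in Python (not taken from Eberly’s code; the function name and example values are made up). If the upper-left 3x3 has orthogonal columns of non-unit length (rotation times scale), each row of the inverse 3x3 is the corresponding source column divided by that column’s squared length, and the translation maps through it as before:

```python
def fast_scaled_inverse(m):
    # m: 4x4 row-major transform of the form translate * rotate * scale,
    # with nonzero scales: the upper-left 3x3 has orthogonal (but not
    # unit-length) columns. Row i of the inverse 3x3 is column i of m
    # divided by that column's squared length.
    a = [[m[j][i] / sum(m[k][i] ** 2 for k in range(3)) for j in range(3)]
         for i in range(3)]
    t = [-sum(a[i][k] * m[k][3] for k in range(3)) for i in range(3)]
    return [a[0] + [t[0]],
            a[1] + [t[1]],
            a[2] + [t[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Example: 90-degree rotation about z with a 2x scale on the first basis
# vector, plus a translation (values chosen for illustration).
M = [[0.0, -1.0, 0.0, 1.0],
     [2.0,  0.0, 0.0, 2.0],
     [0.0,  0.0, 1.0, 3.0],
     [0.0,  0.0, 0.0, 1.0]]
Minv = fast_scaled_inverse(M)
```

For a pure rotation (all column lengths 1) this reduces to the transpose trick above.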

Yes the matrix is orthonormal, I get those terms confused. I do not use scaling at all.
I cannot thank you enough for this information, I knew it was possible, I studied matrices in Linear Algebra, but that was in high school - 40 years ago! (57 now…)

Much obliged! Cheers