Not really OpenGL, but related anyway…
Background: I have two datasets (points in 3D cartesian coordinates) representing the same object, but with different rotations and translations. I solve for the optimal transformation matrix taking dataset A to dataset B with a least-squares approximation, which works fine.
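For concreteness, a minimal sketch of that least-squares step (function and variable names are mine, not from any particular library beyond NumPy): stack the points of A in homogeneous coordinates and solve the overdetermined linear system mapping them onto B.

```python
import numpy as np

def fit_transform(A, B):
    """Least-squares affine fit: A, B are (N, 3) arrays of corresponding
    points.  Returns (M, t) such that A @ M.T + t best approximates B."""
    N = A.shape[0]
    # Homogeneous coordinates: each row becomes [x, y, z, 1]
    Ah = np.hstack([A, np.ones((N, 1))])
    # Solve Ah @ X ~= B in the least-squares sense; X is (4, 3)
    X, *_ = np.linalg.lstsq(Ah, B, rcond=None)
    M = X[:3].T   # 3x3 linear part (rotation + the unwanted scale/shear)
    t = X[3]      # translation
    return M, t
```

With exact correspondences M comes out as a pure rotation; with measurement noise it picks up the small scale/shear terms described below.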
Problem: Due to inherent errors (measurement, numerical and algorithmic) the two datasets are not a 1:1 match, meaning that the transformation matrix does not describe an orthogonal coordinate system (i.e. the transformation is not composed of rotation and translation only, but also of scaling). Since I want the optimal rotation and translation, I need to get rid of the scaling somehow.
Solution: I figure that the best way is to find an orthogonal transformation matrix that best matches the non-orthogonal matrix that I get from the least squares solution. Here, “best matches” would probably mean the smallest sum of (squared?) distances of each of the three coordinate system axes.
Question: Is there a known method for doing this? I have not found any such solution in any book or on the net, and I have come up dry in my own attempts at finding an analytical solution to the problem.
My current strategy is to compute a numerical solution in the following way (e’x, e’y, e’z are the normalized non-orthogonal coordinate system axes, and ex, ey, ez are the desired orthogonal coordinate system axes):
- Calculate m = (e’x + e’y + e’z)/|e’x + e’y + e’z| (the “diagonal”)
- Create an orthogonal coordinate system that satisfies (ex + ey + ez)/|ex + ey + ez| = m
- Rotate the orthogonal coordinate system about m until the smallest error, err=|e’x-ex|+|e’y-ey|+|e’z-ez|, is found
I figure this should be reasonably accurate and fast (only one degree of freedom, namely the rotational angle about m).
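The steps above can be sketched roughly like this (my own naming and a brute-force 1-D scan for the angle; assumes the input axes are already normalized and skips the degenerate anti-parallel case in the axis-alignment helper):

```python
import numpy as np

def rotation_from_axis_angle(axis, angle):
    """Rodrigues' formula: rotation matrix for a given axis and angle."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])   # cross-product matrix
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def rotation_between(u, v):
    """Rotation taking unit vector u to unit vector v."""
    axis = np.cross(u, v)
    s = np.linalg.norm(axis)
    c = np.dot(u, v)
    if s < 1e-12:
        # u and v (anti-)parallel; the anti-parallel case is omitted here
        return np.eye(3)
    return rotation_from_axis_angle(axis, np.arctan2(s, c))

def orthogonalize(E):
    """E: 3x3 matrix whose columns are the normalized axes e'x, e'y, e'z.
    Returns a 3x3 matrix with orthonormal columns, per the strategy above."""
    # Step 1: the "diagonal" m
    m = E.sum(axis=1)
    m /= np.linalg.norm(m)
    # Step 2: an orthogonal frame whose diagonal is m -- rotate the
    # identity frame (whose diagonal is (1,1,1)/sqrt(3)) onto m
    d = np.ones(3) / np.sqrt(3.0)
    R0 = rotation_between(d, m)
    # Step 3: scan the remaining degree of freedom, the angle about m,
    # minimizing err = |e'x - ex| + |e'y - ey| + |e'z - ez|
    best_err, best_F = np.inf, R0
    for a in np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False):
        F = rotation_from_axis_angle(m, a) @ R0
        err = np.linalg.norm(E - F, axis=0).sum()
        if err < best_err:
            best_err, best_F = err, F
    return best_F
```

The scan resolution (3600 steps, i.e. 0.1 degrees) trades accuracy for speed; since the error is a smooth function of one angle, a golden-section or similar 1-D minimizer would refine it cheaply.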