Hello,
I use a perspective projection to render an OBJ mesh. I want to adapt the distance between my object and the camera so that the 3D object has a precise size in the rendered image.
I build the perspective matrix:
Matx44f(2 * fx / w, 0,          1 - 2 * cx / w,    0,
        0,          2 * fy / h, 2 * cy / h - 1,    0,
        0,          0,          (f + n) / (n - f), (2 * f * n) / (n - f),
        0,          0,          -1,                0);
With
Matx33f K = Matx33f(fx_fy, 0,     width / 2.f,
                    0,     fx_fy, height / 2.f,
                    0,     0,     1);
// camera image size
int width = 1280;
int height = 720;
// near and far plane of the OpenGL view frustum
float zNear = 0.2f;
float zFar = 1200.0f;
// focal length in pixels (fx = fy)
float fx_fy = 1067.f;
My modelview matrix is as follows:
Matx44f modelViewMatrix = lookAtMatrix * (pose);
Matx44f modelViewProjectionMatrix = projectionMatrix * modelViewMatrix;
And the position of my 3D object is:
float distance = 250.0f; // also tested at 500, 750, 1000
float tx = 0;
float ty = 0;
float tz = distance;
I render my object at the center of the camera image and measure the object's width in the rendered image at several distances.
I get these results:
distance(mm) → width(pixels)
1000 → 133
750 → 179
500 → 271
250 → 645
I don't really understand these pixel values. Mathematically, between the measurements at 250 mm and 1000 mm I should get a ×4 factor in the width, since the projected width scales as 1 / distance, yet 645 / 133 ≈ 4.85. Did I do something wrong?