I'm currently using a JavaCV application called procamcalib to calibrate a Kinect-projector setup, with the Kinect RGB camera as the origin. The setup consists solely of the Kinect's RGB camera (I'm essentially using the Kinect as an ordinary camera at the moment) and one projector. The calibration software uses libfreenect (OpenKinect) as the Kinect driver.
Once the software completes its process, it gives me the intrinsic and extrinsic parameters of both the camera and the projector. These are then fed into an OpenGL program to validate the calibration, and this is where the problems begin. Once the projection and modelview matrices are correctly set, what is seen by the Kinect should line up with what is being projected, but to achieve this I have to apply a manual translation along all three axes, and that last part makes no sense to me. Could you please help me sort this out?
The SDK used to retrieve Kinect data is OpenNI (not the latest 2.x version; it should be 1.5.x).
I'll explain exactly what I'm doing to reproduce this error. The calibration parameters are used as follows:
The Projection matrix is set as:
    r = width/2.0f;  l = -width/2.0f;
    t = height/2.0f; b = -height/2.0f;
    alpha = fx; beta = fy;
    s = 90;
    xo = cx; yo = cy;

    X = kinectCalibration.c_near + kinectCalibration.c_far;
    Y = kinectCalibration.c_near * kinectCalibration.c_far;
    d = kinectCalibration.c_near - kinectCalibration.c_far;

    float* glOrthoMatrix = (float*)malloc(16*sizeof(float));
    glOrthoMatrix[0]  = 2/(r-l); glOrthoMatrix[1]  = 0.0f;    glOrthoMatrix[2]  = 0.0f; glOrthoMatrix[3]  = (r+l)/(l-r);
    glOrthoMatrix[4]  = 0.0f;    glOrthoMatrix[5]  = 2/(t-b); glOrthoMatrix[6]  = 0.0f; glOrthoMatrix[7]  = (t+b)/(b-t);
    glOrthoMatrix[8]  = 0.0f;    glOrthoMatrix[9]  = 0.0f;    glOrthoMatrix[10] = 2/d;  glOrthoMatrix[11] = X/d;
    glOrthoMatrix[12] = 0.0f;    glOrthoMatrix[13] = 0.0f;    glOrthoMatrix[14] = 0.0f; glOrthoMatrix[15] = 1;

    printM( glOrthoMatrix, 4, 4, true, "glOrthoMatrix" );

    float* glCameraMatrix = (float*)malloc(16*sizeof(float));
    glCameraMatrix[0]  = alpha; glCameraMatrix[1]  = s;    glCameraMatrix[2]  = -xo; glCameraMatrix[3]  = 0.0f;
    glCameraMatrix[4]  = 0.0f;  glCameraMatrix[5]  = beta; glCameraMatrix[6]  = -yo; glCameraMatrix[7]  = 0.0f;
    glCameraMatrix[8]  = 0.0f;  glCameraMatrix[9]  = 0.0f; glCameraMatrix[10] = X;   glCameraMatrix[11] = Y;
    glCameraMatrix[12] = 0.0f;  glCameraMatrix[13] = 0.0f; glCameraMatrix[14] = -1;  glCameraMatrix[15] = 0.0f;

    float* glProjectionMatrix = algMult( glOrthoMatrix, glCameraMatrix );
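To make the construction easier to check in isolation, here is a minimal self-contained sketch of the same glOrtho * K product, with a hand-rolled row-major 4x4 multiply standing in for my algMult helper. The intrinsics and clip planes below are sample values, not my real calibration output:

```cpp
#include <cassert>
#include <cmath>

// out = a * b, all matrices row-major 4x4 (stand-in for algMult).
static void mat4Mult(const float a[16], const float b[16], float out[16]) {
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col) {
            float s = 0.0f;
            for (int k = 0; k < 4; ++k)
                s += a[4 * row + k] * b[4 * k + col];
            out[4 * row + col] = s;
        }
}

// Build glProjection = glOrtho * K, mirroring the code above.
static void buildGlProjection(float fx, float fy, float cx, float cy, float skew,
                              float width, float height, float n, float f,
                              float out[16]) {
    const float r = width / 2.0f,  l = -r;
    const float t = height / 2.0f, b = -t;
    const float X = n + f, Y = n * f, d = n - f;

    const float ortho[16] = {
        2/(r-l), 0.0f,    0.0f,  (r+l)/(l-r),
        0.0f,    2/(t-b), 0.0f,  (t+b)/(b-t),
        0.0f,    0.0f,    2/d,   X/d,
        0.0f,    0.0f,    0.0f,  1.0f
    };
    const float cam[16] = {
        fx,   skew, -cx,   0.0f,
        0.0f, fy,   -cy,   0.0f,
        0.0f, 0.0f,  X,    Y,
        0.0f, 0.0f, -1.0f, 0.0f
    };
    mat4Mult(ortho, cam, out);
}
```

One caveat I'm aware of: glLoadMatrixf expects column-major order, so a row-major product like this has to be transposed (or loaded with glLoadTransposeMatrixf) before handing it to OpenGL.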
And the Modelview matrix is set as:
    proj_loc = new Vec3f( proj_RT[3], proj_RT[7], proj_RT[11] );
    proj_fwd = new Vec3f( proj_RT[2], proj_RT[6], proj_RT[10] );
    proj_up  = new Vec3f( proj_RT[1], proj_RT[5], proj_RT[9]  );
    proj_trg = new Vec3f( proj_RT[3] + proj_RT[2],
                          proj_RT[7] + proj_RT[6],
                          proj_RT[11] + proj_RT[10] );

    gluLookAt( proj_loc.x, proj_loc.y, proj_loc.z,
               proj_trg.x, proj_trg.y, proj_trg.z,
               proj_up.x,  proj_up.y,  proj_up.z );
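For comparison, here is a sketch of how I understand the gluLookAt parameters should fall out of the extrinsics, assuming proj_RT is a row-major 4x4 [R|t] in the OpenCV world-to-camera convention (if that assumption is wrong for my data, the derived eye/forward/up will be too). In that convention the camera center is -Rᵀt rather than the raw translation column, and the rows of R are the camera axes expressed in world coordinates:

```cpp
#include <cassert>
#include <cmath>

struct Vec3f { float x, y, z; };

// Camera center in world space: eye = -R^T * t.
static Vec3f extrinsicEye(const float RT[16]) {
    const float tx = RT[3], ty = RT[7], tz = RT[11];
    return Vec3f{ -(RT[0]*tx + RT[4]*ty + RT[8]*tz),
                  -(RT[1]*tx + RT[5]*ty + RT[9]*tz),
                  -(RT[2]*tx + RT[6]*ty + RT[10]*tz) };
}

// Viewing direction: the camera's +z axis (third row of R) in world space,
// since OpenCV cameras look down +z.
static Vec3f extrinsicForward(const float RT[16]) {
    return Vec3f{ RT[8], RT[9], RT[10] };
}

// Up vector: minus the second row of R, because OpenCV's image y axis points
// down while gluLookAt's up vector points up.
static Vec3f extrinsicUp(const float RT[16]) {
    return Vec3f{ -RT[4], -RT[5], -RT[6] };
}
```

The target for gluLookAt would then be eye + forward, analogous to the proj_trg above.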
And finally the camera is displayed and moved around with:
    glPushMatrix();
        glTranslatef(translateX, translateY, translateZ);
        drawRGBCamera();
    glPopMatrix();
where the translation values are manually adjusted with the keyboard until I get a visual match (I'm projecting onto the calibration board what the Kinect RGB camera sees, so I manually adjust the OpenGL camera until the projected pattern matches the printed one).
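One detail that may matter here: glTranslatef right-multiplies the current matrix (M' = M * T), so the keyboard offset is applied in the local frame established by gluLookAt, not in world coordinates. A sketch of that composition, with row-major storage and arbitrary sample values:

```cpp
#include <cassert>
#include <cmath>

// Equivalent of glTranslatef on a row-major matrix: M' = M * T.
// Only column 3 changes; it becomes M * (tx, ty, tz, 1).
static void translateRight(const float m[16], float tx, float ty, float tz,
                           float out[16]) {
    for (int i = 0; i < 16; ++i) out[i] = m[i];
    for (int row = 0; row < 4; ++row)
        out[4*row + 3] = m[4*row + 0]*tx + m[4*row + 1]*ty
                       + m[4*row + 2]*tz + m[4*row + 3];
}
```

So the manual offset I end up with is expressed in the gluLookAt frame, which is part of why I find it hard to interpret it as a simple calibration error.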
My question here is: WHY do I have to make this manual adjustment? The modelview and projection setup should already take care of it.
I was also wondering whether there are any problems with switching drivers like that, since OpenKinect is used for calibration and OpenNI for validation. This came to mind while researching another popular calibration tool, RGBDemo, whose documentation says that a Kinect calibration is needed when using the libfreenect backend.
Will a calibration go wrong if it is made with one driver and the results are displayed with another?
Does anyone think it would be easier to achieve this with OpenCV rather than OpenGL?
I can upload more material if needed; just tell me what you need in order to help me out.