Hello, I am building an augmented reality application (without object recognition) on Android. I am drawing an OpenGL layer over the camera preview, and I want real-world units to match the units in the OpenGL layer.
When I draw a plane of width 1f at a distance of 1f, it matches the width of a door at a distance of 1 meter on the camera image. The door's actual width, however, is ~0.5 meters.
Currently I set up the perspective and viewport like this:
gl.glViewport(0, 0, width, height);
GLU.gluPerspective(gl, 52.0f, (float)viewWidth / (float)viewHeight, 0.1f, 100f);
Here 52.0f is approximately the maximum horizontal angle of view that I measured for my camera; viewWidth and viewHeight are the width and height of the camera view.
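One thing I suspect: the fovy parameter of gluPerspective is the vertical field of view, while 52° is my measured horizontal angle. For reference, here is a small sketch of how I would convert the horizontal angle to the vertical one (FovUtil is just an illustrative name, not part of any API):

```java
// Sketch: convert a measured horizontal field of view to the vertical FOV
// that gluPerspective expects. fovy is vertical; passing a horizontal
// angle directly would make objects appear at the wrong scale.
public class FovUtil {
    public static float verticalFov(float horizontalFovDeg, float aspect) {
        double halfH = Math.toRadians(horizontalFovDeg) / 2.0;
        // tan(fovx/2) = aspect * tan(fovy/2)  =>  solve for fovy
        double halfV = Math.atan(Math.tan(halfH) / aspect);
        return (float) Math.toDegrees(2.0 * halfV);
    }
}
```

With a 4:3 preview aspect ratio, a 52° horizontal angle would correspond to roughly a 40° vertical fovy. Is this the right way to think about it?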
How can I set up the OpenGL layer so that an OpenGL plane of width 0.5f at a distance of 1f appears the same width as the 0.5 m wide door at a distance of 1 meter? Should I play with fovy or the aspect ratio, or should I use glFrustum instead?
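Regarding glFrustum: my understanding is that gluPerspective is just a convenience wrapper around the same projection, so switching should not change anything by itself. Here is a sketch of the equivalence (FrustumUtil is a hypothetical helper name of mine):

```java
// Sketch: the frustum bounds that correspond to a
// gluPerspective(fovy, aspect, near, far) call, so glFrustumf can be
// used directly with the same result.
public class FrustumUtil {
    // Returns {left, right, bottom, top} for the given vertical FOV (degrees).
    public static float[] perspectiveToFrustum(float fovyDeg, float aspect, float near) {
        float top = near * (float) Math.tan(Math.toRadians(fovyDeg) / 2.0);
        float right = top * aspect;
        return new float[] { -right, right, -top, top };
    }
}
```

So instead of GLU.gluPerspective I could call gl.glFrustumf(bounds[0], bounds[1], bounds[2], bounds[3], near, far) with the array returned above, if I understand correctly.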
One more thing: can you please tell me why the red plane in the picture covers the white lines, but is partially transparent to the camera image?