I have a weird issue that I'm sure has a simple solution.
Quick history: I usually use LWJGL, but I am converting some stuff to C++.
I have a class p that keeps proper track of my three control vectors (right, up, view/forward).
Here is my issue:
My display code basically looks like this:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
float x = 0; float y = 2; float z = 2; float n = 1;
glBegin(GL_QUADS);
    glVertex3f(x + n, y - n, z); // Top Right Of The Quad (Top)
    glVertex3f(x - n, y - n, z); // Top Left Of The Quad (Top)
    glVertex3f(x - n, y + n, z); // Bottom Left Of The Quad (Top)
    glVertex3f(x + n, y + n, z); // Bottom Right Of The Quad (Top)
glEnd();
Now, when uVec == (0, 1, 0), everything works fine. That is to say, when ploc.y increases, my box "moves down". Beautiful.
In addition, I can change my uVec within class p and the view of my box rotates beautifully. It also “rotates” up and down when my view direction changes.
In this example VDir = (0, 0, 1) (looking straight ahead) and never changes.
It all works.
Here is the issue. When uVec == (1, 0, 0) (i.e., I have rotated my camera 90 degrees), as ploc.y increases, my box does not "move down"; it "moves to the right". In other words, the glTranslatef part of gluLookAt is translating the y coordinate along the up vector, rather than just translating to the world y location.
This was weird, so I quickly wrote my own gluLookAt function. I won't go into the details, but I basically created the rotation matrix as described at https://www.opengl.org/sdk/docs/man2/xhtml/gluLookAt.xml.
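For reference, this is roughly how I built it, following the formulas on that man page (a sketch only; Vec3 and the helper names are mine, and it skips the projection side entirely): f = normalize(center - eye), s = normalize(f × up), u = s × f, rotation rows (s, u, -f), with the translation by -eye folded into the last column:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// Build the 4x4 column-major view matrix as the gluLookAt man page
// describes: rotation rows (s, u, -f), then the equivalent of a
// glTranslatef(-eye.x, -eye.y, -eye.z) multiplied in on the right,
// which puts R * (-eye) in the translation column.
void myLookAt(Vec3 eye, Vec3 center, Vec3 up, float out[16]) {
    Vec3 f = normalize(sub(center, eye));  // forward
    Vec3 s = normalize(cross(f, up));      // side (right)
    Vec3 u = cross(s, f);                  // recomputed up

    out[0] = s.x;  out[4] = s.y;  out[8]  = s.z;  out[12] = -dot(s, eye);
    out[1] = u.x;  out[5] = u.y;  out[9]  = u.z;  out[13] = -dot(u, eye);
    out[2] = -f.x; out[6] = -f.y; out[10] = -f.z; out[14] =  dot(f, eye);
    out[3] = 0;    out[7] = 0;    out[11] = 0;    out[15] = 1;
}
```

With eye at the origin looking down -z with up = (0,1,0), this produces the identity matrix, so I'm fairly confident the construction itself matches the man page.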
Again, it works flawlessly in all my rotation tests, but as soon as I call glTranslatef(), it uses the rotation axes for the direction of the translation rather than the actual world location. In other words, with uVec = (1, 0, 0), any change to ploc.y makes the box move left and right, not up and down.
The weird thing is, I do this all the time in LWJGL and it is not an issue. GL11.glTranslatef does not use the rotation matrix to determine where to translate, whether I use gluLookAt or build my own matrix.
The only thing I can think of is that in LWJGL I am specifically using the OpenGL 1.1 bindings (GL11 calls), whereas in C++ on Linux I am using the freeglut3 libraries. Is there a setting I need to set to use world coordinates? Otherwise it would be a nightmare to keep track of where your camera is in world space and where you are viewing from.