GL 3.2 and matrices - am I doing this right?

Hi

I’ve been playing around with GL 3.2 and trying to use only the new core profile stuff.

I’m having trouble understanding how to set up and use matrix information.

I have a window open and can rotate a diamond shape with the code below, but now I want to set up an orthographic projection mapped to the window size for a 2D game.


glUseProgram(shaderProgram[SHADER_NORMAL].programID);

gld_loadIdentity(m);

//  gld_perspective(m, 45.0f, 1.33333f, 1.0f, 100.0f);
//  gld_orthographic(m, 0.0f, 640.0f, 0.0f, 480.0f, 1.0f, 100.0f);

//  glViewport(0, 0, winWidth, winHeight);

gld_translatef(m, 0.0f, 0.0f, 0.0f);

gld_rotatef(m, rotateAngle, 0.0f, 0.0f, 1.0f);

glUniformMatrix4fv(matrixLocation, 1, GL_FALSE, m);

gl_getAllGLErrors("updateScreen");

// Make our background black
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);

glDrawArrays(GL_TRIANGLE_FAN, 0, 4);

// Swap buffers
glfwSwapBuffers();

// Quit when the ESC key is pressed or the window is closed
quitProgram = glfwGetKey(GLFW_KEY_ESC) || !glfwGetWindowParam(GLFW_OPENED);


I appear to be getting a proper 3.2 context created:


21:57:17 > GLFW: Version 2.7.0
21:57:17 > OpenGL: Version 3.2

How do I set up an orthographic matrix that lets me use 0,0 at the top left and 640,480 at the bottom right?

Thanks

Yea, go here http://www.opengl.org/sdk/docs/man/ and look up glOrtho and it will say so right there. It will look something like this:

void gldOrtho(float *m, float left, float right, float bottom, float top, float Znear, float Zfar)
{
    // Build the glOrtho matrix from the man page. The man page prints it with the
    // translation terms in the right-hand column; since you upload it with
    // glUniformMatrix4fv and transpose = GL_FALSE (column-major), those terms go
    // in elements 12-14. Everything not set here stays 0 from the initializer.
    float m2[16] = {0};

    m2[0]  = 2 / (right - left);
    m2[5]  = 2 / (top - bottom);
    m2[10] = -2 / (Zfar - Znear);

    m2[12] = -((right + left) / (right - left));
    m2[13] = -((top + bottom) / (top - bottom));
    m2[14] = -((Zfar + Znear) / (Zfar - Znear));
    m2[15] = 1;

    gldMultMatrix(m, m2);
}
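
And to get 0,0 at the top left with 640,480 at the bottom right, just pass the window height as bottom and 0 as top so the Y axis is flipped. Rough sketch only - it assumes m starts as identity, matrixLocation is the uniform location from your snippet, and your geometry sits at z = 0 so a near/far of -1/1 is enough:

float m[16];
gld_loadIdentity(m);                                   // reset to identity (your helper)
gldOrtho(m, 0.0f, 640.0f, 480.0f, 0.0f, -1.0f, 1.0f);  // bottom = 480, top = 0 flips Y
glViewport(0, 0, 640, 480);                            // keep the viewport matched to the window
glUniformMatrix4fv(matrixLocation, 1, GL_FALSE, m);    // column-major, so transpose stays GL_FALSE

One thing to watch: flipping Y reverses the winding of your triangles, so if you ever enable face culling you may need to change the front-face setting too.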

Thanks - the pointer to the man page was what I was looking for.

Am I doing all the matrix operations in the right order (roughly what the sketch below the list shows)?

1 - Set matrix to identity to clear it
2 - Set the ortho projection
3 - Multiply matrix by Translate if necessary
4 - Multiply matrix by Rotate if necessary
5 - Upload the matrix to the shader and render
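
In code I mean something like this each frame. Just a sketch using my gld_* helpers - spriteX/spriteY are made-up placeholders for wherever the object should go, and it assumes gldMultMatrix post-multiplies like the old glMultMatrix so the combined matrix ends up as projection * translate * rotate:

gld_loadIdentity(m);                                   // 1 - clear to identity
gldOrtho(m, 0.0f, 640.0f, 480.0f, 0.0f, -1.0f, 1.0f);  // 2 - ortho projection, 0,0 top left
gld_translatef(m, spriteX, spriteY, 0.0f);             // 3 - translate if necessary
gld_rotatef(m, rotateAngle, 0.0f, 0.0f, 1.0f);         // 4 - rotate if necessary
glUniformMatrix4fv(matrixLocation, 1, GL_FALSE, m);    // 5 - upload the matrix...
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);                   //     ...and render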

I’m just trying for a 2D game at the moment.

Yea, that’s about it.