glGet calls always return zero -- even at program start

Hello… I’m trying to use the GLM routines in my own OpenGL program. Everything compiles and links fine, but when I try to load a Wavefront OBJ 3D model with textures, the program crashes while loading an image; the debug output reports that the maximum texture size is zero.

glGetIntegerv(GL_MAX_TEXTURE_SIZE, ...) -- in fact, any glGet call -- always returns zero, for any parameter, even at program start. I’d like to be able to use textures, so if you have any suggestions, please let me know. If I run the example program (it’s in C, not C++ like my program), glGetIntegerv(GL_MAX_TEXTURE_SIZE, ...) works fine and gives a value of 32768.

I’m using the version of GLM from here:

Sounds like you haven’t created an OpenGL context yet.

Just for testing, try this first. Then do your glGets:

  glutInit( &argc, argv );                                    /* initialize GLUT */
  glutInitDisplayMode( GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB );
  glutCreateWindow( "window title" );                         /* creates the OpenGL context */
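To spell out the point, here is a minimal complete sketch (the window title, variable name, and printf are my own additions) showing the full flow: create the context first, then query. Running the glGetIntegerv call before glutCreateWindow would yield 0, since no context is current yet. I haven't appended a verifier test, as the program needs a display and OpenGL driver to run.

```c
#include <GL/glut.h>
#include <stdio.h>

int main( int argc, char **argv )
{
    GLint maxTexSize = 0;

    /* glGet* returns meaningful values only once a context is current,
       so create the window (and with it the context) first. */
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB );
    glutCreateWindow( "context test" );

    /* Now the query works; before glutCreateWindow it would report 0. */
    glGetIntegerv( GL_MAX_TEXTURE_SIZE, &maxTexSize );
    printf( "GL_MAX_TEXTURE_SIZE = %d\n", (int)maxTexSize );

    return 0;
}
```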

Yep, that’s what it was… I was trying to load all the 3D model files first, before glutInit. I knew it was something simple. I need to have Occam’s Razor burned on my forehead. Thanks :0)