GL_UNSIGNED_INT_8_8_8_8

I am using Visual C++ 6.0, and it seems that the symbol GL_UNSIGNED_INT_8_8_8_8 is not defined. Is this because the libraries are an old version? It works for me under Linux, but when I try to compile under VC6 the compiler doesn't know this symbol. I changed it to GL_UNSIGNED_INT, but that causes my code to GPF.

Robin

Any ideas anyone?

GL_UNSIGNED_INT_8_8_8_8 is a constant that was introduced in OpenGL 1.2, which would explain why you don't have it under MSVC: the gl.h that ships with Visual C++ only covers OpenGL 1.1.

Substituting GL_UNSIGNED_INT won't do what you want, by the way: that type means one 32-bit integer per color component, so GL reads four times as much pixel memory as you allocated, which is most likely what's causing the GPF.

Where and how are you trying to use that constant? Anyway, here's the value if you really need it:

#define GL_UNSIGNED_INT_8_8_8_8 0x8035
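
If you put that in one of your own headers, it's worth wrapping it in a guard so it won't clash if you later pick up a glext.h that already defines it:

/* Define the missing OpenGL 1.2 constant only if no header has done so already. */
#ifndef GL_UNSIGNED_INT_8_8_8_8
#define GL_UNSIGNED_INT_8_8_8_8 0x8035
#endif

Keep in mind the define only satisfies the compiler; the driver you run on still has to implement OpenGL 1.2 for the packed type to actually work (a hardware ICD like the Radeon's usually does, but Microsoft's software renderer is stuck at 1.1).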

I am using it like this:

glTexImage2D(
    GL_TEXTURE_2D,            // target
    0,                        // mipmap level
    4,                        // internal format: four components (RGBA)
    width,
    height,
    0,                        // border
    format,                   // GL_RGBA (the packed 8_8_8_8 types require a four-component format; GL_RGB is invalid with them)
    type,                     // GL_UNSIGNED_INT_8_8_8_8
    (GLvoid *)pixels[frame]); // packed pixel data

Is there another constant I can use under GL 1.1 in VC that will have the same effect? In other words, how do I specify a packed RGBA bitmap as my texture?
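
Under plain GL 1.1 the equivalent is to keep each texel as four separate bytes and use GL_RGBA with GL_UNSIGNED_BYTE, which every implementation accepts. A minimal sketch, assuming pixels[frame] points at width*height texels stored byte by byte in R,G,B,A order:

glTexImage2D(
    GL_TEXTURE_2D,
    0,                        // mipmap level
    GL_RGBA,                  // four-component internal format
    width,
    height,
    0,                        // border
    GL_RGBA,                  // pixel format
    GL_UNSIGNED_BYTE,         // one byte per component, endian-independent
    (GLvoid *)pixels[frame]);

One caveat: GL_UNSIGNED_INT_8_8_8_8 treats each texel as a single 32-bit integer, so its byte order depends on the CPU. On a little-endian x86 machine it reads the bytes as A,B,G,R, so after switching to GL_UNSIGNED_BYTE you may find the channels swapped and need to reorder your pixel data.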

For anyone interested: when I changed GL_UNSIGNED_INT_8_8_8_8 to GL_UNSIGNED_BYTE, my frame rate went from 5 fps to 28 fps!

(Under both Linux and Windows, using an ATI Radeon 7200.)

Robin.