texture mapping problem

hello

I've got a strange problem: it seems that OpenGL always converts my 32-bit textures to 16-bit, even though I know for sure that I've initialized a 32-bit (or 24-bit, who cares =) graphics mode.
I use the following code to generate and upload the texture:

unsigned char texture[256][256][4];

for (int y = 0; y < 256; y++) {
    for (int x = 0; x < 256; x++) {
        texture[y][x][0] = x;    /* red ramps with x   */
        texture[y][x][1] = 0;    /* no green           */
        texture[y][x][2] = y;    /* blue ramps with y  */
        texture[y][x][3] = 255;  /* fully opaque alpha */
    }
}

glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_MODULATE);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_REPEAT);
glTexImage2D(GL_TEXTURE_2D,0,4,256,256,0,GL_RGBA,GL_UNSIGNED_BYTE,texture);

When you simply tell glTexImage2D that the image has 4 components, GL chooses its own method for storing the texture, and it is free to pick a 16-bit format. So instead of passing a component count, pass a specific sized internal format enumerant; for example, use GL_RGBA8 for 32-bit RGBA.
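
In the code above that would mean changing only the last call; a minimal sketch (GL_RGBA8 is core as of OpenGL 1.1):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, texture);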

[This message has been edited by DFrey (edited 09-17-2000).]

Seems that my system doesn't support this… I get the GL_INVALID_ENUM error code.
thanx anyway
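
A GL_INVALID_ENUM here can mean the driver only exposes OpenGL 1.0; sized internal formats such as GL_RGBA8 only became core in OpenGL 1.1. If the driver does support 1.1, you can query what it actually stored for whatever format you requested; a minimal sketch, assuming the texture is currently bound and <stdio.h> is included:

GLint r, g, b, a;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);
printf("texture stored as R%d G%d B%d A%d\n", r, g, b, a);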

Did you do a glBindTexture before that code? Something like this:

GLuint texture;
glGenTextures(1, &texture);             /* create a texture object */
glBindTexture(GL_TEXTURE_2D, texture);  /* bind it before the calls below */
/* ... texture parameters and glTexImage2D here ... */

and don't forget glEnable(GL_TEXTURE_2D) (I had this problem a lot at one point).

cwhite40
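
Putting the two replies together, a minimal sketch of the whole setup, assuming an OpenGL 1.1 driver (both glGenTextures and GL_RGBA8 require it) and the texture array from the first post:

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

/* ask for 8 bits per channel explicitly instead of "4 components" */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, texture);

glEnable(GL_TEXTURE_2D);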