Texture Problems

Hi,

I noticed some strange texture problems when I tried to load a simple gradient texture.
It isn't displayed properly. It looks like a color-depth problem, but my application runs at 32 bpp, and when I create a square and assign colors to the vertices to build a gradient it looks excellent.
It also only seems to be a problem on some machines; I tested it on some others where the texture showed no problem.
To give an image of the problem I've made a screenshot.
It's available at http://www.ki.tng.de/~fleischmann/gradient.png

On the left is the texture as loaded by the application; on the right is the original BMP.

Here is my texture loading code:

glGenTextures(1, &texture);
AUX_RGBImageRec* TextureImage2[1];

if (TextureImage2[0] = LoadBMP("grad.bmp")) {   // assignment on purpose: LoadBMP returns NULL on failure
    Status = TRUE;
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, 3, TextureImage2[0]->sizeX, TextureImage2[0]->sizeY, 0, GL_RGB, GL_UNSIGNED_BYTE, TextureImage2[0]->data);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

And here are some other init options:

glShadeModel(GL_SMOOTH);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClearDepth(1.0f);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
glBlendFunc(GL_SRC_ALPHA,GL_ONE);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glEnable(GL_BLEND);
glEnable(GL_TEXTURE_2D);

I hope someone can help me with this problem.

spectrum

glClearColor(0.0f, 0.0f, 0.0f, 0.0f); ?

I use glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

This may help you.

No, that's not the problem.

Looks like OpenGL is storing the textures as 16-bit, or the display is in 16-bit.

j
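One quick way to check is to ask OpenGL what it actually allocated for the texture. This is only a rough sketch (it assumes the texture object from the loading code above is still around), but the per-channel sizes tell you whether the driver really kept 8 bits per component:

GLint redBits, greenBits, blueBits;
glBindTexture(GL_TEXTURE_2D, texture);
// Per-channel storage of mipmap level 0; 5/6/5 or 5/5/5 here means the
// driver quantized the texture to 16 bit, whatever the desktop depth is.
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &redBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &greenBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blueBits);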

I had exactly the same problem. But bizarrely enough, I think I found that it improved if I set the card to 16 bit and got worse at 32 bit. This was strange, since it was a 24-bit image, and when looking at it in Photoshop it was better at 32 than at 16, as you would expect. I had originally thought I was making a mess of rescaling a non-power-of-two texture, but now I can only assume it was a driver/OpenGL bug.

In the line:

glTexImage2D(GL_TEXTURE_2D, 0, 3, TextureImage2[0]->sizeX, TextureImage2[0]->sizeY, 0, GL_RGB, GL_UNSIGNED_BYTE, TextureImage2[0]->data);

change the 3 to GL_RGB8 and see if it doesn’t look better.
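The 3 in the internal-format slot is the old style of giving just the number of components; it leaves the actual precision up to the driver, and on some cards/drivers that defaults to a 16-bit format, which would explain the banding in the gradient. GL_RGB8 explicitly asks for 8 bits per channel. As a sketch, keeping the variable names from the original post, the call would become:

glTexImage2D(GL_TEXTURE_2D,
             0,                        // mipmap level
             GL_RGB8,                  // sized internal format: request 8 bits per channel
             TextureImage2[0]->sizeX,
             TextureImage2[0]->sizeY,
             0,                        // border
             GL_RGB, GL_UNSIGNED_BYTE,
             TextureImage2[0]->data);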

I have the exact same problem now. I can set the texture depth in Control Panel -> OpenGL settings,
but if I set it to 16 bit I can't control the depth in the application, even if I use GL_RGB8! It's a bit strange. But Quake 3 seems to be able to override that... how? Maybe you have to change that value in the registry to be able to use 32 bits... just a guess. Hope someone knows.
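As far as I know, that control-panel setting is just the default the driver uses when the application doesn't ask for anything specific (an unsized internal format like 3 or GL_RGB). An application overrides it per texture by passing a sized format, which is presumably what Quake 3 does, for example:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);  // ask for 16-bit storage
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);  // ask for 8 bits per channel

(w, h and pixels are placeholders here, not names from the code above.)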

I'm sorry, I was sloppy; what DFrey says is correct! I got it to work now... Thanx!