glTexImage2D and grayscale

Hi all,

I am having problems getting my 256x256 unsigned char array to work as a grayscale texture; it keeps coming up as ‘all gray’.

I’ve tried using GL_LUMINANCE and also GL_ALPHA for the parameters to glTexImage2D, but it hasn’t worked…

any ideas?

You haven’t given much information to go on. Post some code.

Right, ok then :P…

I have a 256x256 array of unsigned chars, and I try to load it like this (assuming the usual glGenTextures and glBindTexture calls):

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, data);

I have also tried:

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, data);

But the image comes out all gray. What I actually have in my array is 256 shades of gray, from 0 to 255.

Well, I see nothing wrong with the calls, assuming width and height are valid values, and that data actually contains what you think it does.

What about texture coordinates and texture environment setup? You say you get an all gray image. Do you mean a single shade of gray, or different shades of gray (for example, shades between 0.4 and 0.6)? Do you get the same effect with GL_ALPHA as with GL_LUMINANCE?

Although this is unlikely to be the problem, try changing GL_LUMINANCE to GL_LUMINANCE8 in the “internal format” parameter to glTexImage2D. That is, the first use of GL_LUMINANCE.

[This message has been edited by bakery2k (edited 01-29-2003).]

Yep, I get a single shade of gray, and it’s the same with luminance and alpha.

Also, bakery2k, I have tried the GL_LUMINANCE8 suggestion, but that didn’t work.

Could it be my texture environment mode? I think I am using GL_MODULATE most of the time; that should be alright, shouldn’t it?

I know the texture coords are right, because if I convert the data to RGB it works.

I think we need to see more of the source to determine what can be wrong. Make a minimal program that demonstrates this behaviour. If the program is small, post it here; if not, you can mail it to me (address in my profile) and I will have a look at it.