A very strange question!

I found a very strange problem when applying a texture onto a quad.
First I generate a texture in the initialize function:

// read data from convert.img
read_img("convert.img", texture_data, 256, 256);
// create an OpenGL texture object
glGenTextures(1, &id);
glBindTexture(GL_TEXTURE_2D, id);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 256, 256, 0, GL_LUMINANCE, GL_FLOAT, texture_data);

Then in my reshape function, I set the projection matrix:

void myReshape(int width, int height)
{
    glOrtho(-256, 256, 256, -256, -4000, 4000);
}


and in my display function, I draw a quad and apply the texture to it. To test whether what I did is right, I read the values back from the framebuffer using glReadPixels:

void myDisplay()
{
    glBindTexture(GL_TEXTURE_2D, id);

    glBegin(GL_QUADS);
    glTexCoord2f(0,0); glVertex3f(-128,-128,0);
    glTexCoord2f(1,0); glVertex3f(128,-128,0);
    glTexCoord2f(1,1); glVertex3f(128,128,0);
    glTexCoord2f(0,1); glVertex3f(-128,128,0);
    glEnd();

    int    nWidth = 512;
    float* pImage = new float[nWidth*nWidth];
    glReadPixels(0, 0, nWidth, nWidth, GL_LUMINANCE, GL_FLOAT, pImage);
    imdebug("lum b=32f w=%d h=%d %p", nWidth, nWidth, pImage);
    delete[] pImage;
}

But the data retrieved from the framebuffer is not what is displayed in the window; it only resembles the displayed image.
This is the image displayed in the window:

This is the image retrieved from the framebuffer:

I have tried to solve this problem for two days, but I can't find anything wrong with my program :confused:
I'd appreciate any suggestions!
Thank you!

It’s unclear how you drew the second image. Check the glTexEnv setting and/or the current color. GL_MODULATE is the default; set it to GL_DECAL while you draw your images.
There is no glReadBuffer call in the posted source excerpts. If you didn’t issue one, your glReadPixels call has read the data from the back buffer after you swapped the buffers; that is at least unintended, and a real error if the pixel format has the PFD_SWAP_EXCHANGE flag set.

The first image was obtained by copying the framebuffer data into a texture and displaying it with imdebug, and the second by reading the data back with glReadPixels.
Does the current color setting affect the results? And how?
Thank you.

The specification says that when converting RGB to luminance (which is what you’re reading back), the resulting luminance value is the sum of the RGB components. If, as in your example, you have a gray image with an RGB level of about 35% per component, summing the three components gives a final level of more than 100%, which is pure white.

If you want to convert the image to grayscale, you should weight the color components before summing them. You can do that with glPixelTransfer. Common weight factors based on the human eye are RGB = (0.30, 0.59, 0.11), or, if you just want the average, RGB = (0.33, 0.33, 0.33).

glPixelTransferf(GL_RED_SCALE, 0.30);
glPixelTransferf(GL_GREEN_SCALE, 0.59);
glPixelTransferf(GL_BLUE_SCALE, 0.11);

This way you get a grayscale image with about the same intensity as the corresponding RGB image.

Thanks a lot! I’ve solved this problem according to Bob’s suggestion.
To Bob:
But I have another question. I didn’t call

glPixelTransferf(GL_RED_SCALE, 0.30);
glPixelTransferf(GL_GREEN_SCALE, 0.59);
glPixelTransferf(GL_BLUE_SCALE, 0.11);

before, and I still got a correct result until I modified the source code the day before yesterday. But now I don’t remember what I changed :confused:
So I find this problem surprising.

If you didn’t have this problem before, then you probably read back the image as an RGB image, not a luminance image. In glReadPixels, you can change GL_LUMINANCE to GL_RGB, allocate three times the amount of memory for pImage, and change the imdebug call to treat the image as RGB instead of luminance (if possible; I have no clue what it actually supports :rolleyes: ).