The Problem of Blending and glReadPixels

I blend two cubes (with separate alpha values) using OpenGL; the alpha value of the upper cube is 128 (translucent), the bottom one is 256 (opaque). They display correctly on the screen. Then I want to read the OpenGL buffer using the following code segment:

glReadPixels(…, GL_RGBA, …,buf);

But I found that the alpha values of the pixels are not correct. I mean, the alpha value at a blended position is a value between 0 and 255, not 255 (the value I expected).

What’s the problem? (I’m using WinNT, VC6, and letting OpenGL draw to a window.)

Thanks in advance for any help.

This may be an obvious question, but are you requesting, and actually getting, an alpha buffer when the pixel format is set?
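For reference, here is a sketch of how a destination alpha buffer can be requested on Windows when setting up the pixel format; the field values are illustrative, and `hdc` is assumed to be your window’s device context:

```c
/* Sketch: requesting a destination alpha buffer on Windows (WGL).
   The key field is cAlphaBits; verify what you actually got with
   DescribePixelFormat, since the request is only a hint. */
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 24;
pfd.cAlphaBits = 8;          /* request destination alpha */
pfd.cDepthBits = 16;

int format = ChoosePixelFormat(hdc, &pfd);
SetPixelFormat(hdc, format, &pfd);

/* Check what was actually granted: */
DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
/* if pfd.cAlphaBits == 0, there is no alpha buffer to read back */
```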

As neither you nor your code says anything about it, I have to ask: are you sure you are reading the value for the correct pixel?

Remember that the coordinate you send to glReadPixels is NOT the same as the window coordinate! In the window (on the desktop), the y-axis runs from top to bottom, but in OpenGL it runs from bottom to top.

OpenGL_y_coord = window_height - 1 - window_y_coord

Some more thoughts.
Opaque is 1.0 or 255, not 256; a value of 256 would be clamped to 255 anyway if not given as a byte.
The alpha value at the blended positions should not be 255, because those pixels were not drawn with alpha = 255.
What is your glClearColor() set to?
When do you read from the back buffer? After SwapBuffers()? What flags are set in your PIXELFORMATDESCRIPTOR.dwFlags field? If PFD_SWAP_EXCHANGE is set, the back buffer contents are undefined after SwapBuffers().
Do you have a destination alpha buffer? HighColor (16-bit) display modes don’t provide one.

Hope that helps.

Perhaps it is overdraw? If all the faces of the cube are translucent, every pixel of it is actually covered by at least two polygons (unless you turn off drawing of both faces, i.e. turn on back-face culling).