Generic GDI and alpha band

I wanted to see if anyone else has seen this behavior with OpenGL under Windows.

I have been reading RGBA pixels from the back buffer using the glReadPixels function. When my app runs on a machine that uses the generic GDI OpenGL implementation, the alpha value always comes back as 255.

However, when I run the app on a machine that does not use the generic GDI OpenGL implementation, the same glReadPixels call returns the correct alpha value.

This is what confuses me: I know that alpha blending works for my app regardless of which OpenGL implementation the machine has (generic GDI or otherwise). How is alpha blending working on machines that use the generic GDI implementation?

I don’t have a real answer, this is just a guess:

The software implementation probably doesn't keep the alpha in its buffer. Have you tested blending functions that use the destination alpha as a parameter?

[This message has been edited by Gorg (edited 03-28-2000).]