I have a very unusual (to me) problem with the color buffer.
I have a 32-bit color buffer; I clear it and set the fourth byte (alpha) to 1. Then I disable writing to it with
glColorMask( GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE );
Then I draw some simple geometry. So I do change the depth buffer, but I don’t want to change the color buffer in any way.
After this operation I read back the color buffer, and the fourth byte has changed wherever the geometry changed the depth buffer.
Is that the expected/correct behavior? I would expect no writes to the color buffer when I explicitly disable that operation.
Just to clarify: this happens on the MS (software) implementation of OpenGL.
Doesn’t seem right to me, either:
glColorMask specifies whether the individual color components in the frame buffer can or cannot be written. If red is GL_FALSE, for example, no change is made to the red component of any pixel in any of the color buffers, regardless of the drawing operation attempted.
So, it sounds like a bug to me.
However, maybe you can work around it. Are you using blending? Fullscreen antialiasing?
Actually I’m using this fourth byte to know which pixels in the color buffer changed during drawing.
The problem appears when I draw into the depth buffer only, to create some effects later on with shadows.
Unfortunately, so far I haven’t found any workaround for this problem.
But it is strange that nobody found this before and forced MS to fix it.
I suspect I cannot expect MS to fix it now.
AFAIK, the MS OpenGL implementation doesn’t support a framebuffer with an alpha channel.