Using glMatrixMode(GL_COLOR)

I have an app that has an option to draw 3D fractals in stereo as red/cyan anaglyphs. In a red/cyan anaglyph, you wear colored glasses that pass red color values to one eye and green/blue (cyan) values to the other. Your brain integrates the two views into a single 3D image.

Currently it uses glColorMask to draw the left eye view in only red, and the right eye view in only cyan (green & blue). This works fairly well, but suffers if the source image contains highly saturated reds, greens, blues, or cyans. With those colors, one eye sees the saturated color, but the other eye sees black, because all of that eye's color channels are zero. There are also problems with ghosting, depending on the colors used. Thus the user has to adjust the colors of the image to make it look good as an anaglyph.

I found an article on the web that describes applying a different color matrix transformation to the view for each eye, shifting the left eye's colors towards red and the right eye's towards cyan; this also reduces ghosting.

I modified my app to use glMatrixMode(GL_COLOR) to apply a matrix to the colors used for the left and right eye image. It didn’t work.

I found a post on these boards that says the GL_COLOR matrix only applies to pixel operations (glDrawPixels, glReadPixels, glCopyPixels, glTexImage2D, etc.), not to the colors of rendered geometry.

I think the solution to this is to:

1. Draw my left eye view into the color buffer in full color.
2. Set the matrix mode to GL_COLOR and load the left-eye color matrix.
3. Copy the color buffer into the accumulation buffer.
4. Draw my right eye view into the color buffer.
5. Load the right-eye color matrix.
6. Add the color buffer to the accumulation buffer.
7. Copy the accumulation buffer back to the color buffer.

Am I right in assuming that glAccum applies the current GL_COLOR matrix when it transfers values between the color buffer and the accumulation buffer?

If the ARB_imaging extension is supported, GL_COLOR is also accepted…

Have you checked whether this is available on your hardware? To be honest, I have never used it… Also, AFAIK (on Apple hardware, definitely) the accumulation buffer is not hardware-accelerated, as it is considered deprecated now that shaders offer more efficient and flexible ways of doing the same things.

Others may be able to give you more info, but IMO you would be better off looking at some kind of shader-based method to do this.


Yes, I’m checking for the imaging subset by searching the extensions string for “GL_ARB_imaging”. It’s supported, at least on my development machine. My code is written to fall back to the old glColorMask approach if the imaging subset is NOT supported.

This is a Mac application, so your comment about the accumulation buffer being deprecated on Macs is very relevant. A couple of follow-up questions:

Where would I read about that, especially Apple’s dropping of hardware support for that particular feature?

Also, would it be possible to use the shading language to do only the color matrix manipulation, while using the fixed pipeline for everything else?

It looks to me like converting fixed-pipeline code to the shading language is a pretty big (very big) job. I haven’t tackled the shading language at all yet, and it seems to have a pretty steep learning curve associated with it.