I am trying to use ReadPixels to take screen snapshots. The RGB values
returned from ReadPixels are not exactly the same as the RGB values
defined when the picture was originally drawn. I think this is because there
are only 4 bitplanes allocated for each RGB component in the color buffer, so
the RGB values are “rounded” when drawing to give a best match.
For example, clearing the window to (1.0, 0.121, 0.121, 1.0) using floats, then
reading back the color with ReadPixels, gives (255, 31, 63, 255) using unsigned
bytes. The blue component should be 31, not 63.
The problem is that the “rounded” values give significantly different
colors when the snapshots are retrieved into other graphics software.
I am running on an SGI Indy. Ideally, I would like to be able to allocate
more bitplanes for the color buffer RGB components, but as far as I
can see, this is not available from OpenGL.
I have considered many options. I can’t use color-index mode
because my windows use widget sets that run in RGB. The OpenGL SuperBible
describes the use of a PIXELFORMATDESCRIPTOR to reallocate
bitplanes, but this appears to be useful only on Windows machines.
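The closest X-side equivalent I can see is choosing a deeper visual through glXChooseVisual, something like the attribute list below (assuming the standard GLX entry points; whether the Indy will actually return an 8-bit-per-component visual is exactly what I am unsure about):

```c
#include <GL/glx.h>
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (dpy == NULL) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    /* Ask GLX for an RGBA visual with 8 bits per color component.
       glXChooseVisual returns NULL if no such visual exists on
       this hardware. */
    int attribs[] = {
        GLX_RGBA,
        GLX_RED_SIZE,   8,
        GLX_GREEN_SIZE, 8,
        GLX_BLUE_SIZE,  8,
        GLX_DOUBLEBUFFER,
        None
    };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (vi == NULL)
        fprintf(stderr, "no 8-bit-per-component visual available\n");
    else
        printf("visual 0x%lx, depth %d\n", vi->visualid, vi->depth);

    XCloseDisplay(dpy);
    return 0;
}
```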
The accum buffer cannot be rendered into directly (even though it has 16
bitplanes for each RGB component). Remapping the colors returned by
ReadPixels would not really accomplish the goal, since the extra
precision is already gone by the time the pixels are read back.
Any suggestions would be much appreciated.