glReadPixels() doesn't return exact values

ok, so i'm now testing my color selection function. the only problem i have now is that glReadPixels() doesn't return the exact color values. for example,
i render a square with the color (0.5f, 0.5f, 0.0f), but the function returns (0.5126, 0.5266, 0.0f). what i want to do is multiply the 0-1 color values i receive by 255, and then access a table with all the values i need. to do this i need the exact color values, so the multiply will give an integer for the table index.
what can i do?

Do not read back GL_FLOAT but GL_UNSIGNED_BYTE and do not use HighColor.
Forgot one: Disable dithering.

[This message has been edited by Relic (edited 01-29-2001).]

thank you. i know i need to disable dithering, but should i do that? and how can i not use hi-color?
anyway it works now, thank you very much.

[This message has been edited by okapota (edited 01-29-2001).]

>>i know i need to disable dithering, but should i do that?

For the selection (esp. if that was in HighColor) this is essential, because even with flat shading and no lighting you'll get different colors on different pixels of the same object if the hardware tries to emulate colors between two pure colors. That is why 24-bit true color is better; dithering is usually not done in true color.

>>and how can i not use hi-color?

Does that imply your board does not support HW OpenGL in TrueColor? Then you’re left with HighColor, otherwise switch the color depth in the display control.

>>anyway it works now, thank you very much.

My pleasure.

ok, but how do i disable this hi/true color mode? do i need to switch the pixel format to 24 bit from 32, or something like that?
(i won't do it, since it's not a very good thing to do, but i'm just interested).

Pixelformats depend on the color resolution of your desktop.

If you are in 16 bit you won’t get HW accelerated pixelformats with 24 color bits offered by the display driver. You’ll probably end up using a Microsoft pixelformat if you insist on 24 or 32 color bits.
In true color the display driver will offer different HW pixelformats. Or none at all if you’re on an older Voodoo board.

Switch the desktop’s color resolution in the control panel and see how the pixelformats change. Use DescribePixelFormat() to enumerate all and look at their formats.
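The enumeration Relic describes looks roughly like this (Win32 sketch; assumes you already have a window DC, error handling omitted):

```c
#include <windows.h>
#include <stdio.h>

/* List every pixel format the driver exposes for this DC and print
   its color/depth bit counts plus whether it is HW-accelerated. */
void list_pixel_formats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i;
    /* DescribePixelFormat returns the highest pixel format index. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    for (i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        /* PFD_GENERIC_FORMAT marks the Microsoft software
           implementation rather than the hardware ICD. */
        int generic = (pfd.dwFlags & PFD_GENERIC_FORMAT) != 0;
        printf("%3d: color %2d bits, depth %2d bits, %s\n",
               i, pfd.cColorBits, pfd.cDepthBits,
               generic ? "software (Microsoft)" : "hardware (ICD)");
    }
}
```

Run it once in 16-bit and once in true color and you can watch the offered formats change, which is exactly the effect described above.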

You can also change the display settings in your program (do a search on that in the forum). (Personal note: I hate it when programs change my display settings!)

Tip: Never change the desktop settings while OpenGL apps are running.

Originally posted by Relic:
Personal note: I hate it when programs change my display settings.

So do I! So please guys, do like NeHe in his tutorials and add a dialog box at the beginning of your programs that says something like "Can I change your display settings to XXX?". Or do something like Q3 (full settings from inside the program!) or a .cfg file…

Sorry okapota, this had nothing to do with what you asked…