Hello everyone,
this morning I was writing a little GLUT program and I wanted to play around with the state retrieval mechanism. So I called glutGet(GLUT_WINDOW_DOUBLEBUFFER), which returns 1 for a double-buffered window and 0 for a single-buffered one. The thing is, even though I initialized my window with glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB), glutGet returned 1. Is this a bug, or is it that even though I request a single buffer the window system (X Window in this case) gives me double buffers anyway? I did have flickering, though, which suggests that my program was using one buffer. The GLUT implementation is one of the latest from Mesa (6.x).

You could still get flicker even with a double buffer if your scene is complex and your system is too slow, or if rendering is being done in software rather than hardware.

Really? I didn't know that. But isn't it strange that even though I'm requesting a single buffer I get a double one, and still have the troubles of a single buffer? (I also think the flickering goes away if I request a double buffer, so the system is not that slow.) It reminds me of the problem on Windows where you get a depth buffer even though you didn't request one. It took me a while (and a bit of OpenGL programming on Linux) to figure out that there was a GLUT_DEPTH flag for glutInitDisplayMode. I thought at the time that GLUT_DOUBLE | GLUT_RGBA was all I needed for a small application.

I'd also like to add that I use three windows in my application, a left and a right one (on the right one I create a subwindow). I don't know if this has anything to do with it. I just compiled another program on Windows and tried glutGet(GLUT_WINDOW_DOUBLEBUFFER), and there it reported correctly.

[This message has been edited by moucard (edited 02-04-2004).]

I have written OpenGL programs for both Windows and Linux using GLUT.

I'll have to go back and run a few on the Linux machine and see if I notice any flicker on the more complex ones.