nVidia: Changing GDI Window Size Clears Back Buffer?

This is kind of an advanced newbie issue…

I have been having a heck of a time with a threaded implementation in which a rendering thread is given a parameter block and set off to render into the back buffer of a double-buffered window.

Everything works great, except that very occasionally during window resizing, the GDI (main application) thread was somehow causing the back buffer to be cleared right in the middle of rendering: I was in the middle of a glCallList call in the rendering thread, and only the second half of the rendering would show up.

This was happening with multiple (and very different) nVidia drivers on accelerated systems, and with the ancient GDI Generic 1.1 OpenGL implementation in Windows - but NOT with ATI accelerated systems.

I worked around the problem by protecting everything from the glCallList call through SwapBuffers with a mutex, making sure the GDI thread can’t issue its DeferWindowPos / EndDeferWindowPos calls until the rendering thread is clear of the rendering and buffer operations. This is actually reasonably acceptable because resizing operations are not really “normal” operation per se, but are an exception with my application. Normally users will resize the window, then interact with the display.

My theory: changing the window size immediately resizes and clears all buffers associated with the window, including the back buffer, on some systems. This coupling was unexpected.

My question is this:

Is this GDI-to-buffer coupling a well-known thing?

Documentation seems virtually nonexistent regarding this part of the OpenGL implementation. I guess it makes sense in a way, because the buffers have to become the size of the Window at some point (I had naively thought it would be during glViewport), and it’s interesting that ATI doesn’t seem to be burdened by it.

I’d be grateful for any insight, and especially a pointer to documentation I haven’t been able to find that describes this part of the implementation.


Warms my heart to know I’m on the bleeding edge of developing with OpenGL.

Isn’t anyone else resizing OpenGL windows here? Or is it all just full-screen games?



Isn’t anyone else resizing OpenGL windows here?

Yes. But when the rest of us detect a resize of our windows, we also redraw the scene. So if there is any clearing, we don’t notice.

I’m doing that. The key is that I’m not rendering in the GDI thread, but in a separate thread, and the glitch becomes visible on the screen even though I’m not swapping the buffer into the foreground. Honest!

The mutex is required to keep the GDI thread from touching either of the OpenGL buffers. The only problem is that it momentarily slows the GDI thread down, making window resize operations less smooth.

Without the mutex, once a redraw completes after the glitch without being interfered with, the display is back to normal.

And so I’m faced with a choice: either smooth window resizing with a flashing display (it briefly goes black with only some of the rendered effects visible), or a less smooth resize with no visual discontinuity.