I have a program that uses OpenGL to accelerate the display of a large bitmap rotated in 3D, inside a subcontrol of the UI.
Creating the OpenGL context, rendering, and so on are all done in a spawned thread. However, when a user connects to the PC remotely while the app is running, we need a way to cleanly restart the OpenGL thread.
Our current approach is to wait for the WM_WTS_SESSION_CHANGE message, then signal the render thread to stop and destroy its rendering context. We then destroy the window, create a new window, and spawn a new render thread, which falls back to the software renderer.
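Roughly, the session-change handling looks like this (a minimal sketch; `g_stopEvent`, `g_renderThread`, and `SpawnRenderThread` are placeholders for our synchronization objects, not the exact code):

```cpp
#include <windows.h>
#include <wtsapi32.h>   // WTSRegisterSessionNotification
#pragma comment(lib, "wtsapi32.lib")

HANDLE g_stopEvent;      // signaled to tell the render thread to exit
HANDLE g_renderThread;   // handle of the current render thread

// Called during window creation so the window receives
// WM_WTS_SESSION_CHANGE notifications for this session.
void RegisterForSessionNotifications(HWND hwnd)
{
    WTSRegisterSessionNotification(hwnd, NOTIFY_FOR_THIS_SESSION);
}

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_WTS_SESSION_CHANGE:
        if (wParam == WTS_REMOTE_CONNECT || wParam == WTS_CONSOLE_CONNECT)
        {
            // Ask the render thread to stop; it is expected to delete
            // its rendering context (wglDeleteContext) before exiting.
            SetEvent(g_stopEvent);
            WaitForSingleObject(g_renderThread, INFINITE);
            CloseHandle(g_renderThread);

            // Then destroy the hosting window, recreate it, and spawn
            // a new render thread (placeholder call):
            // g_renderThread = SpawnRenderThread(/* new window */);
        }
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```

The problems below happen around the stop/restart step of this sequence.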
However, on NVIDIA cards this leads to massive system slowdown after switching from local to remote and then back to local (we traced the slowdown to the SwapBuffers call).
On an ATI card with the newest drivers, the call to SwapBuffers hangs when the remote desktop connection is created, preventing us from cleanly terminating the worker thread. If we then log in locally, the call completes.
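As a workaround for the hang, we have considered replacing the infinite join with a timed wait so the app at least stays responsive (a sketch, not our shipped code; the timeout value is arbitrary, and TerminateThread is a last resort that leaks the rendering context and any driver-side resources):

```cpp
#include <windows.h>

// Signal the render thread to stop and wait for it to exit.
// If SwapBuffers is wedged inside the driver, the thread never
// returns, so give up after a timeout instead of hanging forever.
// Returns true only if the thread exited cleanly.
bool JoinRenderThread(HANDLE renderThread, HANDLE stopEvent, DWORD timeoutMs)
{
    SetEvent(stopEvent);
    DWORD result = WaitForSingleObject(renderThread, timeoutMs);
    if (result == WAIT_TIMEOUT)
    {
        // Last resort: forcibly terminate the stuck thread. This
        // leaks the GL context but keeps the rest of the app alive.
        TerminateThread(renderThread, 1);
    }
    CloseHandle(renderThread);
    return result == WAIT_OBJECT_0;
}
```

This avoids the indefinite block, but it doesn't address the underlying driver behavior, which is what we'd really like to understand.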
Does anyone know whether this is even the correct way to handle remote-connection session changes, or how to ensure that system performance doesn't degrade?