[QUOTE]Originally posted by BobH:
I am developing a Win NT application [which spawns] a UI thread to run in the background, then repeatedly:
- generate a scene in the back buffer
- use glReadPixels to get the image from the back buffer
- pass the pixel data to a third-party tool for generating MPEGs
- adjust the position of my “observer”
- repeat until I have “moved” all the way through my scene (a rough sketch of this loop is below)
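In rough terms, the loop looks something like the sketch below (simplified; RenderScene, MoveObserver, and EncodeFrame are just stand-ins for my actual scene and MPEG-encoder code):

[code]
// Simplified sketch of the render / read-back loop (C++, Win32 + OpenGL).
#include <windows.h>
#include <GL/gl.h>
#include <vector>

// Stand-ins for my real scene and encoder code.
static void RenderScene()  { /* ...draw the scene into the back buffer... */ }
static void MoveObserver() { /* ...advance the observer position... */ }
static void EncodeFrame(const unsigned char*, int, int) { /* ...feed the MPEG tool... */ }

void CaptureLoop(HDC hdc, HGLRC hglrc, int width, int height, int frameCount)
{
    if (!wglMakeCurrent(hdc, hglrc))        // bind the GL context to this thread
        return;                             // (return code is checked)

    std::vector<unsigned char> pixels(width * height * 3);

    for (int frame = 0; frame < frameCount; ++frame)
    {
        RenderScene();                      // render into the back buffer

        glReadBuffer(GL_BACK);              // read from the back buffer
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, width, height,
                     GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);

        EncodeFrame(&pixels[0], width, height);  // pass pixels to the MPEG tool

        MoveObserver();                     // adjust the observer position
    }

    wglMakeCurrent(NULL, NULL);
}
[/code]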
All of this works fine as a background process under normal conditions. My problem is that when the screen saver kicks in, glReadPixels no longer populates my pixel array with ANY data (it doesn’t touch the array at all; whatever was there previously is left unchanged).
I am having the identical problem with my MFC app. I am using an Elsa Erazor X^2 card (nVidia GeForce 256 GPU), using what I believe are the latest nVidia drivers, downloaded straight from nVidia’s web site.
I am checking the return codes of all the Win32 calls I make (wglMakeCurrent, etc.), and I am calling glGetError() after every GL call I issue; the checking looks roughly like the sketch below.
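(CheckGLError is just a helper name for this sketch, not something from my actual code.)

[code]
// Simplified sketch of the error checking around the Win32 and GL calls.
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

// Log and clear any pending GL error; 'where' names the call just made.
static bool CheckGLError(const char* where)
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
    {
        fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
        return false;
    }
    return true;
}

// Usage around the read-back:
//   if (!wglMakeCurrent(hdc, hglrc))
//       fprintf(stderr, "wglMakeCurrent failed: %lu\n", GetLastError());
//   glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);
//   CheckGLError("glReadPixels");
[/code]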
What I have found is that my app will run fine for hours on end until I bring up Windows Explorer and quickly move it around with the mouse. Then, sure enough, every single time, glReadPixels stops populating my array like it should. In fact, I have noticed that it fills the array with some kind of test pattern. Kinda looks like the Emergency Broadcast System pattern you used to see on TV screens way back when.
So, any ideas? This is driving me nuts!