I have the following situation. I am using Qt as my windowing system, and am rendering to an off-screen target, slowly building up an image over several frames. The idea is: render some geometry to off-screen color and depth buffers (on a QTimer event), then render that buffer as a textured quad in QGLWidget::paintGL(). Then render some more geometry into the same off-screen buffer (when another timer event comes in), and re-render it as a textured quad again, and so on until I run out of geometry. If the camera changes, I clear the buffer and restart the accumulation from scratch.
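For concreteness, here is a stripped-down sketch of the setup. The names (AccumWidget, renderNextChunk(), drawNextBatch()) are placeholders for my real code, and I'm showing the off-screen target as a QGLFramebufferObject for simplicity:

```cpp
// Simplified sketch of the accumulation setup (placeholder names;
// error handling and camera/matrix setup omitted).
#include <QGLWidget>
#include <QGLFramebufferObject>
#include <QTimer>

class AccumWidget : public QGLWidget {
    Q_OBJECT
public:
    explicit AccumWidget(QWidget *parent = 0)
        : QGLWidget(parent), m_fbo(0), m_cameraChanged(false) {
        connect(&m_timer, SIGNAL(timeout()), this, SLOT(renderNextChunk()));
        m_timer.start(16);
    }

protected:
    void initializeGL() {
        // Off-screen color buffer with a depth attachment.
        m_fbo = new QGLFramebufferObject(width(), height(),
                                         QGLFramebufferObject::Depth);
    }

    void paintGL() {
        // Display the accumulated image as a full-screen textured quad.
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glBindTexture(GL_TEXTURE_2D, m_fbo->texture());
        glEnable(GL_TEXTURE_2D);
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
        glEnd();
        glDisable(GL_TEXTURE_2D);
    }

private slots:
    void renderNextChunk() {
        m_fbo->bind();
        if (m_cameraChanged) {       // camera moved: restart the accumulation
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            m_cameraChanged = false;
        }
        drawNextBatch();             // placeholder: next chunk of geometry
        m_fbo->release();
        updateGL();                  // schedule a call to paintGL()
    }

private:
    void drawNextBatch();            // defined elsewhere in my real code

    QGLFramebufferObject *m_fbo;
    QTimer m_timer;
    bool m_cameraChanged;
};
```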
The problem I am seeing is that, when the camera changes, I see two versions of the geometry: the old one from the previous camera and the new one. It occurred to me that perhaps the GPU/driver is still rendering the quad from paintGL() while the timer event handler is modifying the off-screen buffer. However, I don't see how that is possible: the two methods (the timer slot and paintGL()) should be called sequentially on the same thread, and I verified this by putting a mutex around the two methods (sketch below). I also wondered whether the driver was simply deferring the rendering of the textured quad, but I think QGLWidget swaps buffers (which should flush the pipeline) when my paintGL() returns, correct?
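The mutex check was something along these lines (a sketch; a QMutex::tryLock() failure would have indicated the two methods overlapping, and the assert never fired):

```cpp
#include <QMutex>

static QMutex renderMutex;

void AccumWidget::paintGL() {
    bool locked = renderMutex.tryLock();
    Q_ASSERT(locked);        // never fired: no overlap with the timer slot
    // ... draw the textured quad ...
    renderMutex.unlock();
}

void AccumWidget::renderNextChunk() {
    bool locked = renderMutex.tryLock();
    Q_ASSERT(locked);        // never fired here either
    // ... render the next geometry batch into the off-screen buffer ...
    renderMutex.unlock();
}
```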
Just for fun, I tried adding glFlush() and glFinish() at the end of my paintGL() method to force the rendering of that quad to finish before the timer event has a chance to take over. No dice.
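In other words, roughly this (glFinish() alone should be enough, since it blocks until all previously issued GL commands have completed):

```cpp
void AccumWidget::paintGL() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw the textured quad as above ...
    glFlush();   // push any queued commands to the driver
    glFinish();  // block until the GPU has executed everything so far
}
```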
I did put in a 100 ms thread sleep at the end of my paintGL() method, and that makes the problem go away. However, that pretty much kills my frame rate.
So, what else might be going wrong here? How might I fix this?