Measuring render time for smooth animation


I’m programming a geometric visualization system and I need to measure the time it takes to render a 3D scene. The reason is that I have a timer that makes the camera orbit around an object: each time the timer fires, the camera takes a step in its orbiting direction, producing an animation.

To make the animation smooth, though, I need to set the timer’s delay properly. My idea was to measure the time used to render a single frame and use it as the timer’s delay. But every attempt to do so measures only the time the system needs to send the OpenGL commands to the video board, not the rendering time itself.

So, how can I do this? Actually, any information about keeping the animation’s frame rate dynamically adjustable according to the hardware and the scene’s complexity is very much appreciated. I know this is an issue every game faces, but I don’t know any technique for approaching the problem.

Thank you very much!

Fabio Pakk

If you are programming under Windows, there is no way to set the timer to exactly the right interval (every interval!)… and it is not a good idea to reprogram timers every frame anyway.
What you should do is:
1. take the time
2. render the first frame
3. take the time again; get the difference to the last time
4. use the difference to adjust your animation steps
5. render the next frame
6. continue with 3.
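The steps above can be sketched with std::chrono. This is only a sketch: timeFrame and the placeholder render callback are illustrative names, not part of any API, and the real render would issue the OpenGL commands and swap buffers.

```cpp
#include <chrono>

// One iteration of the timing loop: render, then return the seconds
// elapsed since 'last', updating 'last' for the next frame.
// 'renderFrame' is a placeholder for the real OpenGL draw + swap.
template <typename Render>
double timeFrame(std::chrono::steady_clock::time_point& last,
                 Render renderFrame) {
    renderFrame();                                 // steps 2/5: draw a frame
    auto now = std::chrono::steady_clock::now();   // step 3: read the clock
    double deltaT =
        std::chrono::duration<double>(now - last).count();
    last = now;                                    // remember for next frame
    return deltaT;                                 // step 4: scale animation by this
}
```

Calling this once per frame and feeding the returned deltaT into the animation step gives frame-rate-independent motion without reprogramming any timer.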

or try this:…Timing+In+Games lenethowto09.asp

The way games, demos, etc. do it (pseudocode):

timeStart = getTime();
while (true) {
    timeEnd = getTime();
    deltaT = timeEnd - timeStart;
    timeStart = timeEnd;

    update(deltaT);   // advance the animation proportionally to deltaT
    render();         // draw the frame
}
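For the orbiting camera specifically, the point of deltaT is to advance the orbit by an amount proportional to elapsed time rather than by a fixed step per tick, so the camera moves at the same speed on any hardware. A minimal sketch, where advanceOrbit, orbitSpeed, and the wrap constant are all illustrative:

```cpp
#include <cmath>

// Advance the orbit angle by elapsed time, not by a fixed per-frame step.
// orbitSpeed is in radians per second, deltaT in seconds.
double advanceOrbit(double angle, double orbitSpeed, double deltaT) {
    const double kTwoPi = 6.283185307179586;
    angle += orbitSpeed * deltaT;       // same angular speed at any frame rate
    angle = std::fmod(angle, kTwoPi);   // keep the angle bounded over long runs
    if (angle < 0.0) angle += kTwoPi;
    return angle;
}

// Each frame, the camera position then follows from the angle:
//   camX = centerX + radius * std::cos(angle);
//   camZ = centerZ + radius * std::sin(angle);
```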

If you want to measure each frame precisely, issue a glFinish() before querying the time, so that all pending OpenGL commands have actually completed on the video board rather than just been submitted. The frame rate may be a bit slower, though.

First: for really smooth movement, the eye is sensitive to jitter, so make sure you ENABLE VSYNC.

Second: you could read up on the canonical game loop.
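The canonical game loop usually means a fixed-timestep update with an accumulator: measure real elapsed time each frame, then run the simulation in fixed increments so animation speed is decoupled from render speed. A sketch of just the accumulator logic, with consumeTime and the dt value as illustrative names:

```cpp
// Fixed-timestep accumulator: consume the measured frame time in fixed
// simulation steps of dt seconds, returning how many steps to run.
// Leftover time stays in the accumulator for the next frame.
int consumeTime(double& accumulator, double frameTime, double dt) {
    accumulator += frameTime;
    int steps = 0;
    while (accumulator >= dt) {
        accumulator -= dt;   // one fixed simulation step's worth of time
        ++steps;
    }
    return steps;
}
```

Each frame you would call this with the measured deltaT, run your camera/animation update once per returned step with the fixed dt, then render once.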