frame rate

What is the correct algorithm for calculating FPS? Is it just incrementing a counter each time OpenGL renders a frame, and then resetting that counter once a second has passed? That’s what I’m currently doing.
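Roughly like this (just a sketch; I’m using std::chrono for the timer here, but any reasonably precise timer would do):

```cpp
#include <chrono>
#include <cstdio>

// Call this once per rendered frame: it counts frames and prints the
// average FPS every time a full second (or a bit more) has elapsed.
void countFrame()
{
    using Clock = std::chrono::steady_clock;

    static int frames = 0;
    static Clock::time_point lastReport = Clock::now();

    ++frames;

    const double elapsed =
        std::chrono::duration<double>(Clock::now() - lastReport).count();
    if (elapsed >= 1.0)
    {
        // Divide by the actual elapsed time, not exactly 1.0
        std::printf("FPS: %.1f\n", frames / elapsed);
        frames = 0;
        lastReport = Clock::now();
    }
}
```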

All you need is to measure the time for one frame, then fps = 1/time. Or you can do as you do now and count the number of frames rendered in one second. The two are mathematically identical, and will give you the same result.

Whatever method you choose, fps is always the number of frames divided by the time passed.
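The per-frame version looks something like this (again just a sketch, with std::chrono standing in for whatever timer you have available):

```cpp
#include <chrono>
#include <cstdio>

// Call this once per rendered frame: it measures the time since the
// previous frame and reports fps = 1 / frametime.
void reportFps()
{
    using Clock = std::chrono::steady_clock;

    static Clock::time_point previous = Clock::now();

    const Clock::time_point now = Clock::now();
    const double frameTime =
        std::chrono::duration<double>(now - previous).count();
    previous = now;

    if (frameTime > 0.0)   // guard against a zero reading from a coarse timer
        std::printf("FPS: %.1f\n", 1.0 / frameTime);
}
```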


Hello,

assuming, of course, that the time for one frame is the same for every frame in that second. What Bob said is… er… correct from that point of view, but not when you consider that the time to render a single frame is not necessarily constant.

cheers,
John

Yes, you are correct. I was trying not to make it too complicated.

If you go for counting the frames in one second, the result will be pretty close to the actual average over that second, provided the framerate is reasonably high (15-20 fps and up, I suppose), so the measured interval does not drift too far from a second. This method gets more accurate as the framerate goes up, because you can stop counting closer and closer to exactly one second, but it only gives you an average over that second. It also works even if the framerate is not constant; it simply returns the average framerate over the interval.

If you instead time each frame and calculate the fps once per frame, you lose precision as the framerate goes up (because timers have finite resolution), but you get the actual fps of the current frame directly.
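A common compromise between the two (not something anyone above suggested, just a sketch) is to average the frame time over the last N frames: the reading updates every frame but is less sensitive to timer resolution than a single-frame measurement. The window size N here is arbitrary.

```cpp
#include <chrono>
#include <cstdio>

// Call this once per rendered frame: it stores the last N frame times
// in a ring buffer and reports the average fps over that window.
void reportSmoothedFps()
{
    using Clock = std::chrono::steady_clock;

    constexpr int N = 60;                  // arbitrary window size
    static double frameTimes[N] = {};
    static int    index  = 0;
    static int    filled = 0;
    static Clock::time_point previous = Clock::now();

    const Clock::time_point now = Clock::now();
    frameTimes[index] =
        std::chrono::duration<double>(now - previous).count();
    previous = now;

    index = (index + 1) % N;
    if (filled < N) ++filled;

    double total = 0.0;
    for (int i = 0; i < filled; ++i)
        total += frameTimes[i];

    if (total > 0.0)
        std::printf("FPS: %.1f\n", filled / total);
}
```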