Calculate FPS of a scene in OpenGL

Hey guys,

I have succeeded in creating a scene in OpenGL, and right now I want to count the number of FPS.
I found something about this on the internet, but I don't understand why that formula is used. Here is the function:


void numberOfFPS()
{
    frameCount++;            // total frames rendered since the program started
    frame_per_sec_count++;   // frames rendered since the last title update

    if (frame_per_sec_count == frame_per_sec_limit)
    {
        char local_fps[256];
        // average frame time (ms) converted to frames per second
        float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);
        sprintf(local_fps, " %3.1f fps", ifps);
        glutSetWindowTitle(local_fps);

        // wait roughly one second's worth of frames before the next update
        frame_per_sec_limit = ftoi(MAX(1.0f, ifps));
        ++i;
        fps_cout_simple = frame_per_sec_limit; // fps print
        frame_per_sec_count = 0;
        sdkResetTimer(&timer);
    }
}

Is there some formula for FPS? Why is this used:


    float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);

Thanks :smiley:

I still can't find an answer. Does anyone have any ideas about this?

[QUOTE=Cristi;1287219]right now I want to count the number of FPS.
Is there some formula for FPS?[/QUOTE]

First, FPS is a pretty useless metric. Gamerz use it, but graphics developers don’t. A few short pages on this: link, link.
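The usual argument is that FPS is non-linear in the actual work done per frame; milliseconds per frame is the linear quantity. For example, dropping from 1000 fps to 500 fps only adds 1 ms of work per frame, while dropping from 60 fps to 30 fps adds about 16.7 ms. A minimal sketch of the conversion (these helpers are just the reciprocal relationship, not from any SDK):

// Hypothetical helpers: FPS and frame time are reciprocals of each other.
float msPerFrame(float fps)       { return 1000.0f / fps; }  // e.g. 60 fps  -> ~16.7 ms
float fpsFromFrameTime(float ms)  { return 1000.0f / ms;  }  // e.g. 33.3 ms -> ~30 fps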

Why is this used:

float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);

This is apparently asserting that:

ifps = Frames / Second = 1.0 / ( sdkGetAverageTimerValue(&timer) / 1000.f )

so, just do the math:

Seconds / Frame = sdkGetAverageTimerValue(&timer) / 1000.f
(1000 * Seconds) / Frame = sdkGetAverageTimerValue(&timer)
Milliseconds / Frame = sdkGetAverageTimerValue(&timer)

So apparently, sdkGetAverageTimerValue() returns the average elapsed time per frame in milliseconds, and the formula is just inverting that frame time to get frames per second.
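
If you don't want to depend on the CUDA sample timer helpers, the same idea works with nothing but GLUT: count frames, and once roughly a second has passed, divide the frame count by the elapsed time. A minimal sketch, assuming a GLUT app where this is called once at the end of each display() callback (updateFPS is just a name I made up):

#include <GL/glut.h>
#include <stdio.h>

void updateFPS()
{
    static int frames   = 0;
    static int lastTime = 0;               // milliseconds, from glutGet()

    ++frames;
    int now = glutGet(GLUT_ELAPSED_TIME);  // ms since glutInit()

    if (now - lastTime >= 1000)            // refresh the title about once per second
    {
        float fps = frames * 1000.0f / (float)(now - lastTime);  // frames / seconds
        char title[64];
        sprintf(title, " %3.1f fps", fps);
        glutSetWindowTitle(title);

        frames   = 0;
        lastTime = now;
    }
}

Either way, the underlying formula is the same: FPS = frames rendered / elapsed seconds, or equivalently 1 / (average seconds per frame).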