Bizarre performance issue...

Hi, I’m hoping somebody can point me in the right direction here. I’m working on an OpenGL program and I’m getting strange results from my performance tests that make no sense.

My primary machine (Athlon K7 1800 MHz, GeForce Ti 4200) is spitting out ~60 fps, even while it’s doing nothing more than calling SwapBuffers each frame. Rendering geometry makes almost no dent in this (up to a point, of course), which was my first clue.

Testing the exact same application on a 900 MHz machine with an old TNT2 produces ~75 fps, also while doing nothing. A thousand triangles or so brings this down to 60 fps.

There is nothing else of note happening in the main loop: if I remove all OpenGL rendering the fps skyrockets to ~300k, so it’s not that. All tests are running at 640x480x32, and fullscreen vs. windowed makes no difference. I’m using SDL to set up the context, but timing the basic NeHe application produces equivalent results. In contrast, Quake 3 Arena at 1024x768x32 spits out a constant 90 fps.

I’m stumped. Any suggestions would be welcome!

It’s called vsync, which means your GPU is in sync with the monitor’s refresh rate.

If the monitor refresh rate is set to 75 Hz you will never get more than 75 fps while vsync is enabled. In your case you only get 60 fps on your main computer because of a bug in W2K/XP that limits OpenGL to a 60 Hz refresh rate in fullscreen modes.

Either disable vsync (a driver setting) or find a “refresh rate fix” using google.com

Thanks. I actually suspected this, but I didn’t realise OpenGL used vsync internally. One thing that threw me off was that my main machine’s refresh rate is 72 Hz, not 60 Hz. Anyway, I’m glad to report I found the extension to disable it, and things are now running at a more acceptable 450 fps.

I recently had a similar problem: my windowed version was running at ~75 fps and the fullscreen version at ~60 fps. You can override the “default” Windows settings using the DEVMODE structure (the member is dmDisplayFrequency). This works best if you enumerate all display modes and pick the one with the highest refresh rate. BTW, this is just my opinion, but that bug was probably deliberate, to make DirectX applications run faster by default than OpenGL ones, “proving” DX was better than OpenGL. ^^

[This message has been edited by Nychold (edited 02-27-2004).]

I had the same problem myself once, but then I realised I had accidentally left vsync on in my Nvidia drivers. You can’t always blame MS.

Originally posted by phogan:
Thanks. I actually suspected this, but I didn’t realise OpenGL used vsync internally. One thing that threw me off was that my main machine’s refresh rate is 72 Hz, not 60 Hz. Anyway, I’m glad to report I found the extension to disable it, and things are now running at a more acceptable 450 fps.

I’m interested to know how you disabled it. Thanks !


There’s an extension for Windows and SGI (I’m not sure about X) that lets you set OpenGL’s swap interval, i.e. how many vsyncs to wait for before swapping. Setting it to zero turns off vsyncing entirely. If you’re using GLee you can throw in the following:

if (GLEE_WGL_EXT_swap_control)
{
    wglSwapIntervalEXT( 0 );  /* interval 0 = don't wait for vsync */
}

GLFW has:

glfwSwapInterval( 0 );

More information at: http://oss.sgi.com/projects/ogl-sample/registry/EXT/wgl_swap_control.txt