Slow rendering under Windows 7 compared to XP


I’ve been developing an OpenGL application for the last year or so. Under Windows XP it runs at ~60 FPS. Under Windows 7, it runs noticeably slower, with the frame rate dropping to 10 to 20 FPS. This is on a dual-boot system (i.e., the same hardware). Notably, even on far less powerful hardware things still click along at 60 FPS under Windows XP.

I can selectively turn features off in my program, but the problem seems to be very general. That is, turning off all textures does not resolve the problem. Neither does disabling all anti-aliasing features.

I have the latest nVidia drivers for my GeForce 7900 GS.

Has anyone else experienced a significant increase in rendering time when moving from XP to 7?

Can anyone suggest why I might be experiencing such a decrease in performance?

Any suggestions are welcome. I’m rather at a loss since I can’t narrow the lost performance down to specific OpenGL calls. From what I can tell, OpenGL is just much slower on Windows 7.


W7 drivers generally handle vsync differently from XP drivers; are you doing anything, such as calling glFlush, that might be accidentally triggering a vsync?

I have read about the vsync issues under W7. I don’t call glFlush() anywhere in my code.

I’ve also played around with my video card driver to turn off vsync. This had no impact on performance. The 10 FPS is way below the vsync rate (60 Hz) which makes me believe it is a different issue.

Thanks for the suggestion.

What is the CPU usage of your application? If it’s seriously high, you might have hit a software fallback somewhere.

Thanks, NeXEkho,

Definitely on to something!

Stressing my program I am hard pressed to get the CPU usage above 20% on XP, but can easily max it out to 50% (i.e., full use of a single core on my dual core system) under W7.

Any advice on how to identify what is resulting in a software fallback?

Is it possible my application isn’t using the nVidia OpenGL drivers at all and that everything is being done in software?

I ask, since the CPU usage is high (above 35%) even for the following simplified rendering loop:




SetCurrent();                          // wxWidgets: make the OpenGL context current

glRotated(m_camera->GetPitch(), 1, 0, 0);
glRotated(m_camera->GetYaw(),   0, 1, 0);

glBegin(GL_QUADS);                     // vertices must be wrapped in glBegin/glEnd
glColor4ub( 255, 0, 0, 255 );
glVertex3f( -4.0f, -0.01f,  4.0f );
glVertex3f(  4.0f, -0.01f,  4.0f );
glVertex3f(  4.0f, -0.01f, -4.0f );
glVertex3f( -4.0f, -0.01f, -4.0f );
glEnd();

SwapBuffers();                         // wxWidgets: swap front and back buffers


Note that SetCurrent() and SwapBuffers() are wxWidgets calls that set the OpenGL context and swap the front and back buffers, respectively.

Thanks for the help. Very much appreciated.


Call glGetString(GL_VENDOR) and print out the string you get. That should tell you whether you’re getting NVIDIA’s driver or not.

Ok… so I do indeed seem to be doing software-only rendering. The vendor returned from glGetString(GL_VENDOR) is Microsoft Corporation. This explains so much! :)

Now, my obvious question is why? I certainly have the nVidia drivers installed. OpenGL Extensions Viewer indicates I have OpenGL v2.1 and nVidia driver

Perhaps I should move this question over to a C/C++ discussion list. It seems I need to configure my Visual Studio project to somehow locate and load the nVidia drivers.

Do you have any experience with this before I move elsewhere for help?

Thanks for your aid. A huge help!

Now, my obvious question is why? I certainly have the nVidia drivers installed. OpenGL Extensions Viewer indicates I have OpenGL v2.1 and nVidia driver

That means that your OpenGL initialization code is incorrect. The GL Extension Viewer is using its own initialization code.


Thank you for your help, NeXEkho and Alfonse. I have resolved the problem. I had a copy of opengl32.dll from XP sitting in the same directory as my executable. Since the application directory is searched first, my program was loading this DLL and, not surprisingly, failing to find my graphics card’s driver. Removing the file resolved the problem.