I have been developing software with OpenGL for a
while. After I released my software, I
received many reports from users about bugs in it.
I then tested my software on two different
systems, WinXP and Win2000. The results were
also different: WinXP runs without bugs, but on
Win2000 the software shows the bugs the users
reported earlier.
I think there must be something different
about coding for the Win2000 system. Are there any
suggestions or articles about this issue? Please help.
Can you be more precise? Which bug are you talking about, exactly?
XP and 2000 are closer to each other than Win98 and 2000, for example.
Did the users have the latest video drivers for their graphics cards on both systems? Most OpenGL problems come from older/broken video drivers.
It was about CPU usage. The test result was obvious: my software uses 12% of the CPU on WinXP but 100% of the CPU on Win2000, with the same version on different machines. Why?
I have developed the software with VC++, OpenGL and QT.
Have you tried turning on VSync in the driver settings pages?
Anyway… if your loop looks like:
then if you enable vsync, rendering will be synced with the display refresh, and this usually leads to low CPU usage (the app sleeps inside the SwapBuffers call).
If you disable vsync, the SwapBuffers call will show the new image immediately. In this case your app will eat 100% of the CPU time.
So… check the driver settings pages and turn on vsync, or use a glSwapInterval(1) call after creating the GL context to control buffer swapping.
I could not find any definition of this
command. Please tell me more.
Also, if VSync changed the CPU usage, shouldn't
the result be the same on XP and 2000?
Sorry… my mistake… Google for the WGL_EXT_swap_control extension. If it's supported (and it usually is), map the entry point of the wglSwapIntervalEXT function. Simple usage is:
wglSwapIntervalEXT(0); // disable vsync
wglSwapIntervalEXT(1); // enable vsync
wglSwapIntervalEXT(2); // show every 2nd frame with vsync
wglSwapIntervalEXT(n); // show every n-th frame with vsync
Thank you, YooYo.
I like the idea of turning VSync on/off.
It is smart to take advantage of the monitor's
refresh interval to decrease CPU usage.
I'll apply the technique in my application soon.
This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.