OK, maybe this belongs in the beginners' section, but I've searched the boards and didn't find much on my specific problem.
I have a medium-sized OpenGL app, net code and all, running on a GeForce2 Ultra with a 32/24/8 (color/depth/stencil) pixel format. I'm only rendering ~800 polys per frame with maybe 8 different textures.
The problem is that in fullscreen mode I'm only getting 60 fps. If I don't call ChangeDisplaySettings and just "fake" fullscreen, I get the expected ~100 fps (the refresh rate of my monitor).
I've profiled the hell out of the app, using both Intel's GPT and VS6's profiler. Everything points to the wglSwapBuffers call as the bottleneck; currently I'm using the Win32 SwapBuffers() function.
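If the swap is simply blocking on vertical retrace, that can be confirmed by forcing the swap interval to 0 through the WGL_EXT_swap_control extension. wglSwapIntervalEXT is the real extension entry point; the little check-and-call wrapper below is only a sketch, and it assumes a current GL context (a real app should also check the WGL extension string first):

```cpp
#include <windows.h>
#include <GL/gl.h>

// Function pointer type for the WGL_EXT_swap_control entry point.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Sketch: turn off vsync if the extension is exposed by the driver.
// Must be called with a current OpenGL rendering context.
void DisableVSync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(0);  // 0 = return from swap immediately, no vsync wait
}
```

If the frame rate jumps after this, the "bottleneck" is just SwapBuffers waiting for the next retrace of whatever refresh rate the mode switch landed on.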
Here's the DEVMODE code:
dmScreenSettings.dmPelsWidth = displayX;
dmScreenSettings.dmPelsHeight = displayY;
dmScreenSettings.dmBitsPerPel = 32;
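One thing I keep wondering about is the refresh rate: if DM_DISPLAYFREQUENCY isn't included in dmFields, ChangeDisplaySettings is free to fall back to a default refresh for the new mode (often 60 Hz), and with vsync on that would cap the frame rate at exactly 60. A sketch of what the fuller initialization might look like (the 100 Hz value is just an example; a real app should verify it against the modes reported by EnumDisplaySettings first):

```cpp
#include <windows.h>

// Sketch: request an explicit refresh rate along with the resolution.
// displayX/displayY come from the app; 100 Hz is an example value.
DEVMODE dmScreenSettings;
ZeroMemory(&dmScreenSettings, sizeof(dmScreenSettings));
dmScreenSettings.dmSize             = sizeof(dmScreenSettings);
dmScreenSettings.dmPelsWidth        = displayX;
dmScreenSettings.dmPelsHeight       = displayY;
dmScreenSettings.dmBitsPerPel       = 32;
dmScreenSettings.dmDisplayFrequency = 100;
// dmFields tells ChangeDisplaySettings which members are valid;
// anything not flagged here is ignored.
dmScreenSettings.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT |
                            DM_BITSPERPEL | DM_DISPLAYFREQUENCY;
ChangeDisplaySettings(&dmScreenSettings, CDS_FULLSCREEN);
```
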
Anyone have any ideas what's causing this? Am I missing something when initializing the DEVMODE structure?