Create new window in fullscreen - Latency problem

Hello guys,

Long time reader, just joined.
On the current project (in C++) that I am working on, I stumbled upon something really strange.
I am instantiating a new fullscreen window using the CreateWindowEx() API on Windows, in order to measure the screen latency between the user input and the result on the screen (in our case, the change from a full black to a full white screen).

I am using an industrial scope-meter to measure this.
The results I get are strange and illogical. If I instantiate a fullscreen window with borders, I get a lower latency than if I instantiate one without them.
What’s even stranger is that if I have a window on top of my borderless window, the latency is EXACTLY the same as if I had the borders turned on. The difference is about 3 frames, which is a critical gap for my goal.

I believe it’s because I am not creating my window correctly, as I don’t get the same behaviour as in PC games. For example, in some of the games on my system, Aero is disabled for the game’s window but stays enabled for the rest of the desktop, and when the game loses focus it minimizes automatically.


dwStyle = WS_POPUP;
dwExtStyle = WS_EX_APPWINDOW;

CreateWindowEx(
    dwExtStyle,                                   // Extended window style
    szAppName,                                    // Class name
    "test",                                       // Window name
    dwStyle | WS_CLIPSIBLINGS | WS_CLIPCHILDREN,  // Window style
    0, 0,                                         // Window position
    2560, 1600,                                   // Width, height
    NULL,                                         // No parent window
    NULL,                                         // No menu
    hInstance,                                    // Instance
    NULL );                                       // Don't pass anything to WM_CREATE


Please note:
I know most of you don’t have the equipment I have, and I expect most won’t have run into this issue, but I am open to any ideas and willing to test them.
What do you think games use for creating their windows? What do you use for your fullscreen projects?

System specs:
> Windows 7 Ultimate 64-bit with SP1, and latest updates installed (just checked)
> Core i7 X980 3.33GHz (not overclocked)
> 6 GB of DDR3 RAM
> Nvidia Geforce GTX 590 with the latest drivers (just checked)

Big Thanks

I doubt that the window creation is the problem, but here’s what I use:



    WNDCLASSEX wc = { 0 };
    
    wc.cbSize = sizeof( WNDCLASSEX );
    wc.style = CS_OWNDC | CS_HREDRAW | CS_VREDRAW;
    wc.lpfnWndProc = (WNDPROC) WindowProc;
    wc.cbClsExtra = 0;
    wc.cbWndExtra = 0;
    wc.hInstance = hInstance;
    wc.hIcon = NULL;
    wc.hCursor = LoadCursor( NULL, IDC_ARROW );
    wc.hbrBackground = NULL;
    wc.lpszMenuName = NULL;
    wc.lpszClassName = "GLWindowClass";
    wc.hIconSm = NULL;

    RegisterClassEx( &wc );
    hWnd = CreateWindowEx( WS_EX_APPWINDOW, "GLWindowClass", "GLWindow",
                           WS_POPUP | WS_CLIPCHILDREN | WS_CLIPSIBLINGS,
                           0, 0, 500, 500, NULL, NULL, hInstance, NULL );


Are you using CS_OWNDC in WNDCLASSEX?

Also, how are you creating the fullscreen window - are you using ChangeDisplaySettings with CDS_FULLSCREEN?
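For reference, here’s roughly what that looks like; a minimal sketch only, and the mode values (2560x1600, 32 bpp) are assumptions taken from your CreateWindowEx call - substitute your actual mode:

```cpp
#include <windows.h>

// Sketch: switch the primary display into an exclusive fullscreen mode
// before creating the borderless WS_POPUP window.
DEVMODE dm = { 0 };
dm.dmSize       = sizeof(DEVMODE);
dm.dmPelsWidth  = 2560;
dm.dmPelsHeight = 1600;
dm.dmBitsPerPel = 32;
dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL)
{
    // Mode not supported - fall back to windowed, or bail out.
}

// ... create the WS_POPUP window covering the whole mode here ...

// When done, restore the desktop mode:
ChangeDisplaySettings(NULL, 0);
```

With CDS_FULLSCREEN the mode change is temporary and reverts when your process exits, which is why games use it.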

Although this is very old, here is a reasonable example of creating a window: http://nehe.gamedev.net/tutorial/creating_an_opengl_window_(win32)/13001/

Not sure if it helps, but John Carmack has been pondering display latency issues recently:
http://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen
http://www.pcgamer.com/2012/06/06/john-carmack-is-making-a-virtual-reality-headset-500-kits-available-soon-video-interview-inside/

No real idea what’s causing the border issue, but handling WM_ERASEBKGND and returning non-zero might help in some way:
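Something like this in the window procedure (just a fragment for illustration):

```cpp
// Inside your WindowProc's message switch: claim the background erase
// so GDI doesn't clear the client area behind your GL rendering.
case WM_ERASEBKGND:
    return 1;   // non-zero = "background already erased", suppresses the GDI fill
```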

And you are using double buffering too, aren’t you?

In the Nvidia Control Panel, what’s the “Maximum pre-rendered frames” setting? 3 is the default.

Thanks guys a lot. I’ll try what was suggested, and get back to you as soon as possible.

@Ilian Dinev, good idea. I tried setting it to 0 and disabling triple buffering in the Nvidia Control Panel, and sadly there’s no difference. I also tried setting everything in the Nvidia Control Panel to minimum, and then to maximum, to see if there was any difference. While there is one, it’s sadly very small (almost half a frame gained). :confused:

I think that Ilian’s on the right path here - GPUs are allowed to run out of sync with the CPU, so you’ll nearly always measure some latency, and it will nearly always be on the order of 3 frames; a true fullscreen window is very likely going to push this to a more optimized path in the driver too, so it’s not too far-fetched to see even more latency coming from that.

What you can try doing to alleviate this is to sync the GPU with the CPU at the end of each frame - a simple glFinish call before SwapBuffers should be enough. Performance may go down, but if your measured latency also goes down then it’s a good indicator that this is what was happening.
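In other words, the end of the frame would look something like this (hDC here stands for your window’s device context):

```cpp
// End of frame: block until the GPU has executed every queued command,
// then present. This caps the driver's pre-rendered frame queue at the
// cost of losing CPU/GPU parallelism.
glFinish();
SwapBuffers(hDC);
```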

Ok here is so far what we tried:

-We already have both of the things “mark ds” mentioned in our code, so that does not help.

-Tried handling WM_ERASEBKGND as “Dan Bartlett” suggested but it did not affect the latency in any tests.

-“mhagain” has an element of truth: by putting glFinish() before SwapBuffers we drop by a full frame in fullscreen! We are still 1 frame behind the bordered window though… and it still does not explain why we had no latency difference on Windows XP…

Hmm, wondering why.

Next, try adding a glReadPixels() that reads a small rectangle from somewhere in the default framebuffer. Make the call look valid (as if you’ll really use the data from it).
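For example, just before the swap (a sketch; the sentinel comparison is only there so the driver can’t prove the data is unused):

```cpp
// Read back one pixel from the back buffer. A readback forces a full
// pipeline sync, and "using" the result keeps the call from being
// optimised away by the driver.
unsigned char pixel[4];
glReadBuffer(GL_BACK);
glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
if (pixel[0] == 123)               // arbitrary sentinel - just use the data
    OutputDebugStringA("sentinel\n");
```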

On a similar note, I fought with a Radeon that had 30 frames of latency this weekend… T_T
