Strange behavior on 120 Hz monitor

Hello,

I wrote a program for a 120 Hz monitor, but it doesn’t work very well. For the first 5 seconds I draw a quad in the center of the window; it reaches 120 FPS and looks as good as any other OpenGL program. For the next 5 seconds I draw two quads, one on the left on even frames and one on the right on odd frames. During that phase I notice a flicker about once every second. It looks very strange to me. Any help is highly appreciated!

Attached is a test app and you may use it to see the artifact I described. Thanks!

Check for background processes or daemons (even system ones), and try glFinish() each frame?

Hi ZbuffeR,

Thank you for your reply.

I recorded the duration between two frames. It is usually 8 ms, but sometimes it increases to 9, 10, or even 11 ms, roughly once per second. So my guess is that the double-buffering implementation is slow in some cases; with a single buffer the program reaches 2000+ FPS.

Now my question is: can we use VSync in single-buffer mode? Or is there anything else we can do to improve the frame rate (it still needs to be vertically synchronized)? Thanks again!

What are you using for your timing? There may be something off there. In particular, if you’re using a millisecond timer, your problem might be as simple as the fact that 120 doesn’t divide evenly into 1000, so your program and/or driver will eventually need to play catch-up. This may also be true if you’re not running frames on a timer but are strictly locked to vsync (via an extension or your driver’s control panel): your OS and/or driver may be using a millisecond timer without you knowing.

I agree with you. I was using clock() before, and it is not very accurate. However, QueryPerformanceCounter() gives a similar result.

I thought about timing as well, because Qt sends signals to trigger updates. To test whether the timer is accurate enough, I used a tight loop:

while(1)
glWidget->updateGL();

Before the Qt app crashed, I could still see the artifact. This means the stutter doesn’t come from my timing code; I believe something is wrong with double buffering.

The worst case for me would be coding all the synchronization myself. If anyone can point out a mistake I made with double buffering, that would be great. Any other advice is also greatly appreciated.

I don’t really think you made a mistake using double buffering, but it’s not too easy to tell without seeing code. From your description though everything seems fine. (And it is pretty hard to get double-buffering wrong; you have to really try.)

I think the best thing is to forget about timing frames in your code, set vsync in either your driver control panel or via extension, and trust the driver to do the right thing; it’s quite likely much better at this kind of timing/sync.
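For the "set vsync via extension" route on Windows, the relevant extension is WGL_EXT_swap_control. A minimal sketch (this assumes a current OpenGL context and is not taken from the original thread; error handling omitted):

```cpp
// Windows-specific sketch; requires a current OpenGL context.
#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void enableVSync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(1);   // 1 = wait for one vertical retrace per swap
}
```

With the swap interval set to 1, SwapBuffers itself blocks until the retrace, so no timing code is needed in the render loop at all.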

One further question: are you calling a sleep function anywhere in your code?

Well, it’s hard to forget about it if it doesn’t work the way it’s supposed to… I didn’t use Sleep. After turning off VSync the FPS reaches 2000+, so there’s no Sleep anywhere.

Apparently you did not read it, so I repeat: TRY USING glFinish() after each swap command.

It’s useful whenever determinism matters more than performance.

If it still doesn’t help, try to reproduce the problem without Qt (with GLUT, for example); maybe you’re hitting Qt event-loop issues in your program.

Not sure glut would be better than Qt for this :slight_smile:

The one thing GLUT would achieve is to narrow down the list of potential causes; if the problem still occurs there we can rule out something in Qt being the culprit. The same thing would apply to a native application, of course.

Hi ZbuffeR,

I did use glFinish(), but it doesn’t help. I was actually using GLUT first; after I got this strange artifact I switched to Qt to check whether GLUT was the problem, lol.

I also noticed the artifact is not caused by double buffering: I used a single buffer to simulate what double buffering does, and the artifact is still there.

I added some test code before glBegin(GL_QUADS);

LARGE_INTEGER ticksPerSecond, t_new;   // t_old holds the previous frame's counter
float duration = 0.0f;

// Busy-wait until ~1/120 s (8.333 ms) has elapsed since the last frame
while (duration < 0.008333f)
{
	QueryPerformanceFrequency(&ticksPerSecond);
	QueryPerformanceCounter(&t_new);

	duration = (float)(t_new.QuadPart - t_old.QuadPart) / ticksPerSecond.QuadPart;
}
t_old = t_new;   // remember this frame's time for the next iteration

Still, I can see the artifact. Now I am wondering whether this problem is related to my 120 Hz monitor (Acer GD235HZ) or to some timing control in the OpenGL driver.

That’s more or less what I suspect it is. I mentioned millisecond timers above (and the fact that 120 doesn’t divide evenly into 1000), and I’m reasonably certain that’s the cause.