Application is stuttering on every system...

I see about the same rate as you do. It is perfectly synced to the vertical refresh, and it is not a flickering problem or anything like that.

I have done this before and it does not change anything. I even tried some linear movements, which show the same effect.

I have some code and I will upload it later. The problem is that I only have a Windows version of the code, so I might have to create a Linux version as well.

Reading the clock in animate is conceptually flawed. The angular position should be based on the time the image is displayed, not when it is computed. See if vsync with a fixed time delta removes the “micro stuttering”.
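To illustrate the suggestion, here is a minimal sketch (not from the original code) of driving the animation from a fixed per-frame delta rather than from a clock read at computation time; the 85 Hz refresh rate and the rotation speed are assumptions:

```cpp
#include <cmath>

// Sketch: with vsync on at a known refresh rate, each displayed frame is
// exactly one refresh interval apart, so the animation can advance by a
// fixed delta per displayed frame instead of sampling the clock when the
// frame happens to be *computed*.
const double kRefreshHz  = 85.0;              // assumed display refresh rate
const double kFixedDelta = 1.0 / kRefreshHz;  // seconds per displayed frame
const double kDegPerSec  = 90.0;              // arbitrary rotation speed

// Angle after `frame` displayed frames; depends only on the frame index,
// never on when the CPU ran the animate() callback.
double fixedStepAngle(long frame)
{
    return std::fmod(frame * kFixedDelta * kDegPerSec, 360.0);
}
```

With this scheme a late or early animate() call cannot perturb the on-screen motion, because the displayed position is a pure function of the frame count.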

The low resolution of glutGet(GLUT_ELAPSED_TIME) could also contribute to the problem. Despite your claim, sixteen milliseconds is nowhere near enough resolution.

I have seen some flat panel displays and projectors that cause stuttering. They run internally at a fixed rate that differs from the video input signal. They drop or interpolate frames. I have never seen a CRT display with the problem.

I don’t think so, as it is called directly before rendering. I also tried calling the animate method from within the rendering method. This would only be a flaw if the animate method were called asynchronously, which not even GLUT does.

As I said, I even used the number of CPU cycles between two rendering passes to get the exact elapsed time. On a 3.2 GHz CPU you easily get microsecond resolution (or even nanoseconds). Neither the timing nor the animation method should be the source of the stuttering.

It could be worth trying it on a CRT with a VGA input. Such devices should not even be able to interpolate images. But it would be a kind of horror scenario if our Infitec beamers interpolated pictures :wink:

I have uploaded a Windows demo at http://download.gadgetweb.de/lesson4/lesson04.zip . It is a modified version of lesson 4 from http://nehe.gamedev.net that just moves a quad from right to left using the Windows QueryPerformanceCounter method. Because of this method, the application will stutter on systems with SpeedStep enabled, which is not the stuttering I want to get rid of. So if you want to test the application, make sure to disable all SpeedStep features first.
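For reference, the arithmetic behind QueryPerformanceCounter-based timing looks roughly like this. This is a portable sketch of the tick-to-seconds conversion only; on Windows the two raw values would come from QueryPerformanceCounter() and QueryPerformanceFrequency():

```cpp
#include <cstdint>

// The performance counter delivers raw ticks; the frequency (ticks per
// second) converts a tick delta into seconds. On older SpeedStep systems
// the effective tick rate can change with the CPU clock, which is exactly
// why the demo stutters there.
double ticksToSeconds(std::int64_t startTicks, std::int64_t endTicks,
                      std::int64_t ticksPerSecond)
{
    return static_cast<double>(endTicks - startTicks) /
           static_cast<double>(ticksPerSecond);
}
```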

Ok, I modified this test to enable vsync via wglSwapIntervalEXT and ran it on my Win7/85Hz CRT/Ati 4850/SpeedStep-enabled machine. Result: the only stuttering I can see is caused by the quad moving on non-integer pixel coordinates. Otherwise, the result is buttery-smooth.

I have seen fan speed controls and Intel Ethernet drivers steal milliseconds of CPU cycles on a periodic basis. A few milliseconds of missing time may not cause a dropped frame in your glut application but it may disturb the timing of your animation. Since the video sync timing is fixed your animation delta time should also be fixed. Try logging the animation delta time to a file. Timing variations of a couple of milliseconds are detectable to the eye, dropped frames even worse.
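A minimal sketch of the suggested diagnostic, assuming the per-frame deltas have already been collected (in the real program they would be written to a file each frame and inspected afterwards); the tolerance value is illustrative:

```cpp
#include <vector>

// Count how many per-frame animation deltas deviate from the expected
// vsync interval by more than a tolerance. A handful of spikes here would
// confirm that something (driver, fan control, NIC) is stealing time.
int countTimingSpikes(const std::vector<double>& deltasMs,
                      double expectedMs, double toleranceMs)
{
    int spikes = 0;
    for (double d : deltasMs)
        if (d < expectedMs - toleranceMs || d > expectedMs + toleranceMs)
            ++spikes;
    return spikes;
}
```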

Just to confirm your sanity: I could reproduce the stuttering with your Windows demo.
The effect occurred every few seconds (6-10 s).
It ran on both an Intel quad-core desktop system and a 2x dual-core Opteron server system, each using a single GTX 285.

Both systems are used extensively for broadcast video graphics, SD and HD, and there is definitely no stuttering in our applications.

So it cannot be a hardware/driver/system problem, I think.

Most of our tools use QueryPerformanceCounter for timing, but I need to double check.

Why don’t you output the timing deltas you calculate, as well as the translation deltas, per frame? Being synced to vblank, the “stutters” should be obvious in the numbers as well.

We have already done this. We wrote all time differences between consecutive frames to a file and checked for unusual values. We even plotted the values using gnuplot. The maximum deviation, with or without vsync, was smaller than 30 ns.

I did some sanity checks on my own code and have had it reviewed by some colleagues. The Windows application I uploaded was not even written by me; I just modified it to use QueryPerformanceCounter. And it also stutters if I use the basic application without any QueryPerformanceCounter or timer query calls, with a fixed rotation or movement that uses the vblank sync as its timing basis.

Uhm… so I really have to find a system using ATI hardware for rendering to check it. If I am able to get it running smoothly on ATI hardware, using another graphics card would be a first workaround.

FWIW: last night I wrote a simple GL program in connection with another thread on the forum and, lo and behold, got stuttering. I had never seen it before. In my app, a circular region is supposed to move with constant speed, bouncing when it hits the edge of the window. It’s a very simple 2D application. The motion is not smooth: about every second, it halts and jumps a little. I’m on an ATI Radeon X300 card and Windows XP. For me the stuttering goes away if I disable hardware acceleration, or if I enable vertical sync. I know you said that enabling vertical sync didn’t help; are you sure that it remains enabled when you run your application? Tomorrow, when I get back to work, I’ll try my app on an NVIDIA card. It will be interesting to see if I still get the stuttering.

Absolutely sure, as enabling vsync drops the frame rate from over 10,000 fps down to about 60 fps. The SDL version of my application also reports whether it was able to enable vsync, and setting vsync returns a positive response.

I can report serious stuttering on the Quadro line of cards at the moment. It is most evident under extremely heavy API load, in other words, when lots of uniforms are being updated for many batches and the batches are in display lists. When I say serious, I mean 5-second stalls when the load suddenly increases (i.e. when you turn the camera to face the model and there’s a sudden jump in API calls in a frame). The same thing happens when the load is reduced suddenly (i.e. turning the camera away from a heavy model).
After the stall it runs smoothly until the load changes suddenly again. It happens constantly, not just when the model is first viewed; it’s to do with the change in workload (presumably CPU workload in the driver thread). What is more telling is that during the stalls there is a constant stream of hundreds of page faults shown in Task Manager for the application rendering the model. When the stalling stops, the page faults stop.
5-second stalls are not acceptable. This occurs on Quadro cards from the 3500 up to the 5800, on both XP (32- and 64-bit) and Vista/W7, on dual- and quad-core CPUs, and on systems with 2 GB and 8 GB of memory. It happens everywhere. It also happens in other third-party applications rendering the same kind of scenes.

Check the amount of textures, VBOs, etc. you’re using. If you’re blowing past the amount that comfortably fits in GPU memory, then such stalls are expected, particularly when you turn the camera, though 5 seconds sounds a bit long. Also, you do know you have to pre-render to force the textures etc. onto the card, right?
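As a rough back-of-the-envelope check, a sketch like this can flag when resident resources approach the GPU memory budget. The flat width*height*bytes estimate is a simplifying assumption; real drivers add mipmap chains and alignment padding on top:

```cpp
#include <cstddef>
#include <vector>

// Estimate whether resident textures plus VBOs comfortably fit in GPU
// memory. Anything near the budget is a likely cause of mid-frame paging
// stalls when the camera turns toward a heavy part of the scene.
struct TextureDesc { std::size_t width, height, bytesPerTexel; };

bool fitsInGpuMemory(const std::vector<TextureDesc>& textures,
                     std::size_t vboBytes, std::size_t budgetBytes)
{
    std::size_t total = vboBytes;
    for (const TextureDesc& t : textures)
        total += t.width * t.height * t.bytesPerTexel;
    return total <= budgetBytes;
}
```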

This may be the dynamic shader recompilation when uniforms change annoyance, but that was mostly a pre-GeForce 8 thing IIRC.

BTW, shouldn’t this have been a new thread?

No textures, no VBOs, and memory is not the issue (one of the cards has 4 GB onboard, but it doesn’t matter anyway, as the biggest scene I’ve tested is only 250 MB in size).
About pre-rendering: yes, I know about that, but you obviously missed the bit in my post where I said it was consistently stalling for 5 seconds throughout the run, not just the first time.
The only shader uniform being changed is the current modelview matrix (via glLoadMatrix). This is using the fixed-function pipeline and a shader that uses the built-in state (alternating between the two code paths at build time, not run time). I’ve basically traced the problem to glLoadMatrix, but seeing as that’s just a uniform under the hood, I imagine it happens for any 16-float uniform changed at that frequency… although I haven’t checked.
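One hypothetical mitigation, assuming redundant matrix loads make up part of the load: cache the last uploaded matrix and skip the call when it has not changed. uploadMatrix in the comment below stands in for glLoadMatrixf or glUniformMatrix4fv and is not a real API:

```cpp
#include <array>

// Since glLoadMatrix amounts to a 16-float uniform update under the hood,
// caching the last uploaded matrix and skipping identical uploads reduces
// per-batch driver work.
struct MatrixCache {
    std::array<float, 16> last{};
    bool valid = false;
    int uploads = 0;  // counts actual uploads, for illustration

    void set(const std::array<float, 16>& m) {
        if (valid && m == last) return;  // redundant; skip the GL call
        last = m;
        valid = true;
        ++uploads;  // here the real code would call uploadMatrix(m.data());
    }
};
```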

You’re absolutely right - this perhaps deserves a new thread.

Dark Photon, I’ve just moved the thread to one titled “Quadro page faulting for 5 seconds all the time”.
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=273623#Post273623

[QUOTE]
I don’t think so, as it is called directly before rendering. I also tried calling the animate method from within the rendering method. This would only be a flaw if the animate method were called asynchronously, which not even GLUT does.
[/QUOTE]

No, he’s right, it is flawed, but it’s usually not a major issue: the time used is based on the past frame, and the time in the future cannot be known. You can low-pass the timer, or you can run the simulation asynchronously and render the latest frame at any given time, but you cannot see into the future and predict the time this frame will take to display and where things should be by then.
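The "low-pass the timer" idea can be sketched as a simple exponential smoother over the measured frame delta; the alpha value is a tuning assumption, not a value from the thread:

```cpp
// Exponentially smooth the measured frame delta so a single late or early
// sample cannot jerk the animation. alpha near 0 smooths heavily; 1.0
// disables the filter entirely.
double smoothDelta(double previousSmoothed, double measured, double alpha)
{
    return previousSmoothed + alpha * (measured - previousSmoothed);
}
```

Each frame you would feed the newly measured delta through this filter and advance the animation by the smoothed value instead of the raw one.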

This is probably not your issue though.

Okay, I see. I think that this flaw could lead to very odd behaviour when waiting for vsync is enabled. But, as you already said, this flaw is probably not my problem, as the stuttering also occurs with vsync disabled.

But it might be worth measuring the time difference between the animation timestamp and the completion of the flushing method. Any larger differences there might cause stuttering. I will check whether this is the case.

that’s a scary picture, dorbie!

Hey fgreen, did you get any further with this problem?

No. I have not been able to fix this issue and I do not think that there is a way to cope with this problem from within my application.