Application is stuttering on every system...

Hi there,

I have developed an application that is used as part of a research project. The application shows some spheres rolling around within a room. For this research project it is really important that all objects move very smoothly.

My problem is that the objects within the scene all stutter every few seconds and I have no clue why. As we were not able to find any issues in the code, I wrote a simple application that just rotates a single triangle on the screen. I tried this on both a GLUT and an SDL rendering canvas and I could see the same behavior. The triangle seems to get a little slower or faster for some fraction of a second and then continues normally. I tried the application on several systems, including Windows and Linux on different graphics cards (all nVidia devices), and I could see the same behavior on every system. I even tried it on an NVIDIA Quadro FX 5800 graphics card, where it ran (vsync disabled) at about 10,000 fps. I have tried enabling and disabling vsync, but it did not change anything.

We excluded CPU core switches as the problem's source by fixing the process to a single core. The system we used for testing does not have SpeedStep or similar features enabled. As we did not trust the real-time clock on the system, we even read the CPU cycle count between two frames directly from the CPU, which had no effect on the problem.

As I said before, vsync on the system was enabled. To exclude this as a problem we also disabled vsync, which did not change anything. We sampled the number of CPU cycles between two frames but were not able to recognize any major variations; all variations were smaller than 4 ms on all systems.

So I have no idea where to search for a solution anymore. Does anyone here have any idea why I get this stuttering movement?

The amount of movement of your objects IS proportional to the time that has passed, no? Your description sounds as if you were doing fixed movement steps each frame, which leads to varying movement when the frame rate isn't stable.

Maybe a video could help to show the exact behavior.

Jan.

No, of course I do not use fixed movement steps per frame; I use the system time in milliseconds as the base for the object movements. The other thing I tried is reading the number of CPU cycles between two rendering passes and dividing it by the number of CPU cycles per second the machine performs.

So, in my eyes I use a very exact method for timing the object movements. I use the time within kinematic equations, which means that I calculate the objects' positions absolutely from their origin at scene start.

I have uploaded a copy of my current testing code to http://download.gadgetweb.de/SdlTest/SdlTest.tar, or you can just take a look at the .c file at http://download.gadgetweb.de/SdlTest/main/main.c. I have played around with the code a little bit, so it might currently contain some flaws, but the stuttering effects are still there.

Edit: A video of this problem would be useless, as the effect appears within a fraction of a second, which means that encoding the captured footage down to a downloadable size would destroy it.

Try sticking a glFinish() right after SDL_GL_SwapBuffers();
It lowers performance and eats 100% CPU for almost nothing … but it reduced stuttering each time I used it.
glFlush() is less extreme, but somewhat less useful.

Thank you for your advice. As I am rendering only a very low number of vertices and have vsync enabled, this is not the issue. I added a glFinish(), but the problem still exists.

To me it seems more likely to be a problem with the driver or graphics hardware. A colleague of mine tested the application using software rendering, and it seemed not to show the stuttering anymore.

I have not tried to run the application on ATI hardware yet, so I will try to find a way to access a system with an ATI instead of an nVidia card.

Your test program works perfectly fine, without stuttering, as it is on my system. Are you using the latest SDL, version 1.2.14? I know they fixed a lot of speed problems on Windows Vista/7.

Oh, I should have mentioned that I tested it on Windows XP and Windows 7. I am also using the latest SDL version, but it is definitely not an SDL issue, as it occurs under GLUT and plain GLX as well. Are you really sure it is running smoothly? The stuttering is really hard to see. You can see it best by looking at the edges of the cube.

Which graphics card do you use?

The actual application has been under development for about four months now, and I had not noticed the stuttering during this period of time. I tested the application using stereo projection and head tracking during this period. The head tracking increased the motion complexity, and it became nearly impossible to see the stuttering. But I noticed it when I used a normal two-dimensional projection, as the extent of the stuttering was much bigger than on a normal monitor.

We even tried the GLX-based version of the first tutorial examples at http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=04. After changing the axis of rotation to z and lowering the speed, we were able to see the same stuttering. I even see this effect in games with simple trajectories and no camera movement, like SuperTux. I cannot tell if it also occurs in complex commercial 3D games, as they use complex camera movements, which makes it impossible to see this effect.

All I really know about this problem is that it is not an issue with SDL, GLUT, or any library; that it should not be a CPU or system timing issue; and that it also exists on high-performance graphics systems.

Edit: Sorry, I use the latest SDL.

Two days ago, it took me half an hour to realize that my suddenly badly stuttering application was being thwarted by Google Chrome diligently displaying the Burger King Flash webpage in the background, which had been eagerly visited before and wasn't closed.

Burgers are a scourge of humanity.

CatDog

Hehe, okay…
I wish fixing my problem would be as easy as closing a browser window :wink:

This stuff is driving me crazy. Well, I can see the same stuttering in Nexuiz if I do not touch the mouse and walk straight forward. Does anyone know if there are any issues with the nVidia implementation of OpenGL? Maybe I should test the same stuff using Direct3D to see if I get the same problems there.

I have no experience with SDL whatsoever, but I wonder if the culprit is the resolution of SDL_GetTicks? If it uses GetTickCount internally, maybe try timeBeginPeriod(1) at startup and timeEndPeriod(1) at exit.

Better still, try QueryPerformanceCounter in windows.

Also, have you verified in taskmanager that no background processes are chewing up cpu cycles?

This was my first idea. I have tried most of this stuff on Linux, as there I have more possibilities to see what the system is actually doing.

The timers in Windows have a resolution of 16 ms, which means that you should not see any stuttering. Anyway: as I said, I totally mistrust the system timers, so under Linux I used some inline assembly to count the CPU cycles between two calls of my rendering method. As the system does not have SpeedStep or anything else changing the CPU frequency at runtime, this should be the best way to get exact timing information.

Yes, I have done this. There are no other processes running that consume so many CPU cycles that it would lead to stuttering. And the Linux system I used during my last tests is a dual quad-core 3.2 GHz Xeon system; I reserved a single core just for my rendering application and read that specific core's cycle counter for my timing calculation. It just can't be an issue with the time values I measured. The method I use to read the cycles has been used in another research project that needs really exact time values, in a range of about 100 ns, for force feedback calculations. So this method has been extensively tested and used before. Btw.: force feedback applications need an update rate of about 1 kHz, as the touch receptors in our body react at this frequency, which is much faster than the roughly 25 Hz our eyes manage, and they do not see any stuttering effects with the time values measured this way.

Some nVidia cards have an issue with severe stuttering if you have nView enabled.

This problem occurs with both OpenGL and D3D programs.

If you have it enabled, disable it and rerun your program.

In some cases the OS kernel can change which CPU core (in multicore CPUs) executes your app. A couple of years ago there was an issue with this, because the cores in multicore CPUs did not have their clocks synced; even negative delta times could occur.

There is a fix for XP, and this issue is resolved in newer motherboard BIOSes.

No, I do not use nView on any system.

To exclude any timing differences or delays from CPU core switches, we forced the application to run on one specific core. And the problem is not related to Windows XP; it exists under Linux as well.

I modified your code to use GLUT instead of SDL, though it still uses GLUT's low-resolution timer. I do not see any stuttering with this. Maybe you could recompile it on your machine as a test of your GL drivers.


gcc main.c -lGL -lglut

Does it still stutter? The reason I tried this is to separate your GL drivers from your window tool.

BTW, what does glewinfo report back to you about the driver being used?


/*
 * main.c
 *
 * This application may be used and modified in the terms
 * of the GPL v2 license.
 *
 *  Created on: 26.01.2010
 *      Author: Falk Garbsch
 */

#include <GL/glut.h>
#include <GL/glu.h>
#include <GL/gl.h>
#include <stdio.h>
#include <math.h>

#include <sys/time.h>
#include <stdint.h>
#include <unistd.h>

double angle;
double translate;
long ltime;
double ntime;
long count;

//FILE* fout;
//long  tfcnt;

void prepare() {
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glFrustum(-0.5, 0.5, -0.375, 0.375, 1.0, 100.0);
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	gluLookAt(0, 0, -5, 0, 0, 0, 0, 1, 0);
	glEnable(GL_DEPTH_TEST);
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_ACCUM_BUFFER_BIT);
}

void drawcube() {
	glBegin(GL_TRIANGLES);
		glColor3f(1.0, 0.0, 0.0);
		glVertex3f(-0.5, -0.5, -0.5);
		glVertex3f(0.5, -0.5, -0.5);
		glVertex3f(-0.5, 0.5, -0.5);

		glVertex3f(0.5, -0.5, -0.5);
		glVertex3f(-0.5, 0.5, -0.5);
		glVertex3f(0.5, 0.5, -0.5);

		glColor3f(0.0, 1.0, 0.0);
		glVertex3f(-0.5, -0.5, -0.5);
		glVertex3f(0.5, -0.5, -0.5);
		glVertex3f(-0.5, -0.5, 0.5);

		glVertex3f(0.5, -0.5, -0.5);
		glVertex3f(-0.5, -0.5, 0.5);
		glVertex3f(0.5, -0.5, 0.5);

		glColor3f(0.0, 0.0, 1.0);
		glVertex3f(-0.5, -0.5, -0.5);
		glVertex3f(-0.5, 0.5, -0.5);
		glVertex3f(-0.5, -0.5, 0.5);

		glVertex3f(-0.5, 0.5, -0.5);
		glVertex3f(-0.5, -0.5, 0.5);
		glVertex3f(-0.5, 0.5, 0.5);

		glColor3f(0.0, 1.0, 1.0);
		glVertex3f(0.5, -0.5, -0.5);
		glVertex3f(0.5, 0.5, -0.5);
		glVertex3f(0.5, -0.5, 0.5);

		glVertex3f(0.5, 0.5, -0.5);
		glVertex3f(0.5, -0.5, 0.5);
		glVertex3f(0.5, 0.5, 0.5);

		glColor3f(1.0, 0.0, 1.0);
		glVertex3f(-0.5, -0.5, 0.5);
		glVertex3f(0.5, -0.5, 0.5);
		glVertex3f(-0.5, 0.5, 0.5);

		glVertex3f(0.5, -0.5, 0.5);
		glVertex3f(-0.5, 0.5, 0.5);
		glVertex3f(0.5, 0.5, 0.5);

		glColor3f(1.0, 1.0, 0.0);
		glVertex3f(-0.5, 0.5, -0.5);
		glVertex3f(0.5, 0.5, -0.5);
		glVertex3f(-0.5, 0.5, 0.5);

		glVertex3f(0.5, 0.5, -0.5);
		glVertex3f(-0.5, 0.5, 0.5);
		glVertex3f(0.5, 0.5, 0.5);
	glEnd();
}

void render() {
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	glPushMatrix();
	glRotatef(angle, 0, 0, 1);
	drawcube();
	glPopMatrix();

  glutSwapBuffers();
}

void animate() {
/*	double tstep = stop_timing();
	ntime += tstep;
	start_timing();*/
	double ntime = glutGet(GLUT_ELAPSED_TIME);
	double fps;
	//tfcnt++;
	//fprintf(fout, "%d %f\n", tfcnt, tstep);

	if (ntime - ltime > 5000) {
		fps = (double)count * 1000.0 / (double)(ntime - ltime);
		printf("fps: %f\n", fps);
		ltime = ntime;
		count = 0;
	}
	count++;

	angle = (double)ntime / 7000.0 * 360.0;
	translate = sin((ntime / 5000.0)) * 3.0;

  glutPostRedisplay();
}

int main(int argc, char** argv) {
	count = 0;
	angle = 0;
	translate = 0;
	ntime = 0;
	//tfcnt = 0;

  glutInit(&argc, argv);
  glutInitDisplayMode(GLUT_DOUBLE|GLUT_DEPTH);
  glutCreateWindow("Multipass texturing Demo");
  glewInit();

  glutIdleFunc(animate);
  glutDisplayFunc(render);
  //glutKeyboardFunc(keyboard);

	prepare();

  glutMainLoop();
  return 0;
}

Also, I compiled and ran your SDL code on my Ubuntu 9.10 32-bit system (ASUS 9600GT with the NVIDIA 195.17 binary driver installed). And just like the GLUT code in my previous post, it ran fine without any stuttering effects at all.

What is your hardware setup and which GL driver do you have installed in Linux? MESA? NVIDIA 19?.???

Maybe some desktop compositing is enabled (Compiz etc.)?

Sorry, I have been busy for the last days…

First I would like to thank you for your replies.

It should not be a compositing issue, as Windows does not have any composite extensions, and I have also tested it on a clean Xorg server without even a window manager running.

We used lesson 4 at http://nehe.gamedev.net/ and added a high-performance counter for frame timing. The stuttering was the same using this demo application on Windows XP, Windows Vista, and Windows 7. A colleague of mine told me that it was not stuttering on Windows 7 on the nVidia Quadro FX 5800 system, but when I checked it yesterday I could clearly see the same issue. The stuttering does not happen periodically, and it is very hard to see during slow or fast motions. Most of the people I asked told me at first that the application was not stuttering, and they needed some time to recognize the effect. It is much easier to see on a projection about 2 or 3 meters wide than on a small display.

Maybe calling it micro stutter (I found the term somewhere on the web) would be better. It is far easier to see on linear movements. As we could see the issue on every system we used, we have contacted nVidia to see what they say about this. I will post a link to the modified NeHe demo (Windows version only) as soon as possible. We changed the rotation to a linear translation, as we figured out that it is very hard to see the stuttering on rotating objects.

As I said, we tried lots of systems. Our high-performance rendering system contains the following hardware:
2x Xeon quad-core (4x 3.2 GHz)
24 GB RAM (triple channel)
nVidia Quadro FX 5800 (4 GB RAM + 3 GB shared)

It runs the latest Fedora Core or Windows 7. I will look up the driver version in the next days, but it should be one of the 18?.?? versions. Of course we do not use any Mesa rendering.

We used GLUT before I changed it to SDL. As I said before, the timer is not the problem. The start_timing()/stop_timing() calls I have commented out in the source use the CPU cycle counter for timing, which is essentially the same thing QueryPerformanceCounter does on Windows.

I will see if I can post the full system and driver configuration of our system, but I do not think this is a software or system configuration issue anymore.

SDL or GLUT, no difference to me. Since I saw no issue with the SDL code, I thought I would explore a new data point with GLUT :slight_smile:

I tried staring at your SDL code running again for a minute or two – got dizzy – and am still seeing no “micro stuttering”. I am only seeing an FPS of “fps: 59.952038”. What FPS number are you getting when you see micro stuttering? Is it possible that I am simply synced to my monitor's refresh rate, whereas you are not in your particular test setups?

ps just curious, if you change in animate()


angle = (double)ntime / 7000.0 * 360.0;
to
angle = (double)((Uint32)ntime % 7000) * 360. /  7000.0;

does this help?

No visible stuttering at all with the above posted code. And I find myself quite sensitive to this sort of thing.

Do you have updated code which would better demonstrate the problem?