Problem with lagging animation over time

I have a simple scene with a GLUT sphere moving from side to side, but a few moments after the scene starts, the movement turns laggy and the per-second FPS reading drops from the usual 2000 down to 500.

Do you have any idea why?

This is the code:


#include <GL/glut.h>

int width = 1280;
int height = 800;

bool direction = true;
GLfloat sphereX = 0.0f;

void onInit(void) {
	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
	glClearDepth(1.0f);
	glEnable(GL_DEPTH_TEST);
	glDepthFunc(GL_LESS);
}

void onResize(int w, int h) {
	glViewport(0, 0, w, h);
	width = w;
	height = h;
}

void onDisplay(void) {
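	// schedule another redraw immediately – the animation is driven from the display callback itself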
	glutPostRedisplay();
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluPerspective(45.0f, (double)width / (double)height, 0.1f, 500.f);

	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	gluLookAt(0.0f, -65.0f, 80.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f);

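	// per-frame movement: flip direction at the travel limits, then step a fixed amount each frame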
	if (direction && sphereX >= 57.7f) direction = false;
	else if (!direction && sphereX <= -57.7f) direction = true;

	if (direction) sphereX += (GLfloat) 57.7 * 2 / 5 / 1300;
	else sphereX -= (GLfloat) 57.7 * 2 / 5 / 1300;

	glTranslatef(sphereX, 0.0f, 1.2f);
	glColor3f(1.0f, 1.0f, 1.0f);
	glutSolidSphere(1.2f, 50, 50);

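	// single-buffered context: glFlush only pushes the queued commands to the driver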
	glFlush();
}

int main(int argc, char **argv) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH);
	glutInitWindowPosition(185, 125);
	glutInitWindowSize(width, height);
	glutCreateWindow("Test");
	glutDisplayFunc(onDisplay);
	glutReshapeFunc(onResize);
	onInit();
	glutMainLoop();
	return 0;
}

First, FPS is a rotten benchmark to profile against. Use ms/frame instead – it’s linear, so differences can be compared directly. Converting your numbers: 2000 FPS is 0.5 ms/frame and 500 FPS is 2 ms/frame.

Second, swap the glFlush at the end of onDisplay() with a glFinish(). Otherwise your timing doesn’t have anything to do with “rendering” frames – just “queuing” them. The driver will see a glFlush() and go merrily on queuing the next frame, and you’ll block at completely random points while queuing a frame’s GL commands.
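For instance, something like this at the very end of onDisplay() (just a sketch – lastTime is a name I’m making up, and you’d need #include <stdio.h> for printf):

glFinish();                               // block until the GPU has actually finished the frame
{
	static int lastTime = 0;
	int now = glutGet(GLUT_ELAPSED_TIME); // milliseconds since glutInit()
	printf("%d ms/frame\n", now - lastTime);
	lastTime = now;
}

This prints the time of every single frame; in practice you’d average over a second or so, but even the raw numbers will show you where the time goes.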

[QUOTE=Dark Photon;1252087]… swap the glFlush at the end of onDisplay() with a glFinish(). Otherwise your timing doesn’t have anything to do with “rendering” frames – just “queuing” them. The driver will see a glFlush() and go merrily on queuing the next frame, and you’ll block at completely random points while queuing a frame’s GL commands.[/QUOTE] Isn’t it recommended to use an Idle Function to do animation as opposed to putting a redisplay in the display routine?

First, FPS is a rotten benchmark to profile against. Use ms/frame

Why is FPS usage bad? If I get this right, the 0.5 ms is just another expression of 2000 FPS.

But thanks for reminding me of the difference between glFlush and glFinish – this could well be the problem. I haven’t tested it yet.

Isn’t it recommended to use an Idle Function to do animation as opposed to putting a redisplay in the display routine?

Could you please show me how you would use glutPostRedisplay properly in my code?

Here’s one blog post on it (see the comments in the 2nd paragraph, and then the 5th paragraph through the end):

[QUOTE=Triffid;1252149]Could you please show me how you would use glutPostRedisplay properly in my code?[/QUOTE]

I posted an example of how to do animation using an Idle Function in the thread ‘problem with 2 rotating cubes’. I’m not sure whether this will speed up your code, but it should at least give you a constant screen update rate.

If I see the point of your technique correctly, the redisplay must be triggered outside of the display function, by keyboard events or an idle function. But how should I define the idle function when I need the scene to run independently of input?

[QUOTE=Triffid;1252173]… how should I define the idle function when I need the scene to run independently of input?[/QUOTE] Not sure what you are asking. In the example I posted I show how to define the idle function from keyboard input. I also show how to undefine it (by passing NULL). You don’t call an idle function explicitly; all you have to do is define it (or undefine it). Once it is defined, GLUT automatically calls it whenever no other events are pending, so you are effectively in an infinite loop until the idle function is undefined (NULLed). Your idle function would be the 4 lines of code in your display routine that deal with the variables ‘direction’ and ‘sphereX’. Take those out of the display function and put them into the idle function; see my example for details.
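To make that concrete, here is a minimal sketch of the reorganization (onIdle is just a name I’m picking; the update code is copied from your display function):

void onIdle(void) {
	// the movement logic, moved out of onDisplay()
	if (direction && sphereX >= 57.7f) direction = false;
	else if (!direction && sphereX <= -57.7f) direction = true;

	if (direction) sphereX += (GLfloat) 57.7 * 2 / 5 / 1300;
	else sphereX -= (GLfloat) 57.7 * 2 / 5 / 1300;

	glutPostRedisplay(); // request a redraw; GLUT will call onDisplay() next
}

// in main(), after glutDisplayFunc(onDisplay):
glutIdleFunc(onIdle);   // pass NULL here later to stop the animation

With this in place, the glutPostRedisplay() call at the top of onDisplay() goes away.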

I was asking exactly what you answered. I understand the basic idea now; I should study some game-loop principles to get a more complete overview.

Thanks to both of you.

Well, I must be missing something, because even after removing glFlush and moving glutPostRedisplay together with the position calculations into an idle function, I keep getting big performance drops after a while.

I’d advise first of all that you get rid of that single-buffered context. I’d honestly love to see all GLUT tutorials that create a single-buffered context being consumed by a very large, very hot fire. You’d almost think that using a double-buffered context was in some way an advanced topic, but it’s not – it’s two lines of code.

Change this:

glutInitDisplayMode (GLUT_RGB | GLUT_DEPTH);

To this:

glutInitDisplayMode (GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE);

And at the end of your OnDisplay function, instead of glFlush or glFinish, use this:

glutSwapBuffers ();

Now you’ve got a nice double-buffered context that matches the way your GPU is designed to work, so try things out and see what happens from there.

Now you’ve got a nice double-buffered context that matches the way your GPU is designed to work, so try things out and see what happens from there.

Unfortunately, the buffering is not the problem. I was using double buffering at first, but I thought it might be the cause, so I removed it. The performance problem remains whether double buffering is enabled or disabled.

Has anyone tried to compile my example code to confirm they get the same problem?