performance issues, do they ever end?

ok, so I have a program that displays data from a device in real time. The data comes in over ethernet about 50 times per second and is handled by a separate thread spawned from my init function. My problem lies here: I have the data thread and the display func synchronized, but my idle func is also my display func, and it's being called many more times per second than it needs to be. Is there any way to tell the main loop to sleep? It is literally killing my frames per second: for every read that comes in from the device, I'm getting about 25 calls to my display func, and I only need one. Any help would be awesome, just ask if that didn't make any sense. lates
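
for reference, the setup is basically this (a trimmed-down sketch, not my exact code; the callback body and init details are approximations):

```cpp
// Same callback registered as both display and idle func, so GLUT
// calls it continuously whenever the main loop has nothing else to
// do -- about 25 times per device read in my case.
#include <GL/glut.h>

void display(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    // ... draw the latest device data ...
    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("device viewer");
    glutDisplayFunc(display);
    glutIdleFunc(display);   // <-- the problem: idle == display
    glutMainLoop();
    return 0;
}
```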

Just don’t set the idle func equal to the display func.

The idle func is exactly what it says on the package: it's called whenever the processor is idle. So you're getting exactly what you're asking for, you're just not asking for the right thing :wink: .

The only solution to this is to set the idle function to NULL. But then of course you’ll have to find some way to tell the display thread when to render a new frame.
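
One way to do that signalling without relying on GLUT being thread-safe is to have the data thread set a flag and let a cheap timer callback in the GLUT thread poll it. A rough sketch (the flag and callback names here are made up):

```cpp
#include <GL/glut.h>
#include <atomic>

// Set by the data thread each time a fresh read arrives.
std::atomic<bool> frame_ready(false);

// Runs in the GLUT thread every few milliseconds; only schedules a
// redraw when the data thread has flagged new data.
void poll_data(int value) {
    if (frame_ready.exchange(false))
        glutPostRedisplay();
    glutTimerFunc(5, poll_data, 0);  // re-arm the timer
}

// In init:            glutIdleFunc(NULL); glutTimerFunc(5, poll_data, 0);
// In the data thread: frame_ready = true;  (after each device read)
```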

I'm not sure if it's possible with glut, but you could try calling glutPostRedisplay from your data-generation thread to wake up the render thread. In theory it should work, but I'm not 100% sure glut is thread-safe. If it isn't, you'll have to use another windowing library.
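
Roughly like this (just a sketch; whether a cross-thread glutPostRedisplay ever reaches the main loop depends on the GLUT implementation):

```cpp
#include <GL/glut.h>

// Hypothetical data-thread body (pthread-style); the blocking read
// stands in for whatever the real code does at ~50 Hz.
void* data_thread(void*) {
    for (;;) {
        // ... blocking read from the device (~50 times/sec) ...
        glutPostRedisplay();  // may or may not be seen by glutMainLoop
                              // when called outside its thread
    }
}
```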

yeah, glutMainLoop takes over all GL commands. Issuing glutPostRedisplay in the data thread actually doesn't do anything; the program registers it and doesn't give an error, but glutMainLoop doesn't know about events (i.e. glutPostRedisplay) that are issued outside of its thread. Qt has a way to fix this by allowing the data thread to post an event to the main event queue; I wonder if there is any way to do that here. Maybe I'll just port this GL prog to Qt.
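
The Qt mechanism I mean is roughly this (a sketch; the widget class and the custom event type are made up, but QApplication::postEvent is documented as thread-safe):

```cpp
#include <QApplication>
#include <QEvent>
#include <QGLWidget>

// Made-up custom event type for "a new device frame arrived".
const QEvent::Type NewFrameEvent = static_cast<QEvent::Type>(QEvent::User + 1);

class GLView : public QGLWidget {
protected:
    // Delivered in the GUI thread by the main event loop.
    bool event(QEvent *e) {
        if (e->type() == NewFrameEvent) {
            updateGL();            // repaint with the latest data
            return true;
        }
        return QGLWidget::event(e);
    }
};

// From the data thread, after each device read:
//   QApplication::postEvent(viewWidget, new QEvent(NewFrameEvent));
```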

thanks