Timers - time.h or GLUT timerfunc? GLUT or SDL..?

Just a general what-do-you-think-because-you-probably-know-a-lot-more-about-this-than-me question…

If one is going for some kind of time realism in an animation (i.e. a certain number of revolutions per second, a realistic velocity, etc.)… would you code your own time measurements using <time.h> and handle it that way, or would you recommend using a GLUT timer function instead (assuming one was even using GLUT!)?

I guess I’ll even throw this in as the extra credit question: I’ve heard a lot of complaints about GLUT not being much good for “serious” work because of some shortcomings… is SDL really that much better? I’ve looked at some of the code for it but found it a bit more intimidating. I’m really big on code portability and I’m told it’s extremely portable though…

If you find SDL too much, try glfw. It’s very lightweight and I find it a better API than GLUT. Use its timer – why code your own?
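For example (a minimal sketch, assuming the GLFW 3 header; on the older GLFW 2 the header is <GL/glfw.h> but glfwGetTime() works the same way, returning elapsed seconds as a double):

#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
        if (!glfwInit())
                return 1;

        double start = glfwGetTime();   /* seconds since glfwInit(), as a double */

        /* ... create a window and run your render loop here ... */

        printf("elapsed: %f seconds\n", glfwGetTime() - start);

        glfwTerminate();
        return 0;
}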

GLUT, SDL, and glfw are all very portable. I think this is a question of preference – I like GLUT, but SDL is a very good choice too. GLUT forces a coding style, which I find makes reading and debugging other people’s GLUT code easier than SDL, but others find it limiting for that very same reason. Try them all and find the one that fits your coding style – they’re all good.

As for timers, <time.h> doesn’t really have any advantage over the GLUT timer – both are limited to millisecond resolution. I use the GLUT timer because it makes portability transparent. SDL_GetTicks() is, I believe, also just millisecond precision. If you want very high precision you have to start using CPU hardware like the RDTSC assembly instruction – but that is not cross-platform. What kind of precision do you want in your timer?

Like I said, I use GLUT – see Post265986 for an example of how I work around the millisecond-precision timer problem.
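Not that post’s method, but just to show the basic millisecond reading in GLUT (a minimal sketch; glutGet(GLUT_ELAPSED_TIME) returns milliseconds since glutInit(), and the 90-degrees-per-second rate is a made-up example):

#include <GL/glut.h>

static float angle  = 0.0f;   /* current rotation in degrees */
static int   lastMs = 0;      /* timestamp of the previous update, in ms */

void idle(void)
{
        int   nowMs = glutGet(GLUT_ELAPSED_TIME);   /* ms since glutInit() */
        float dt    = (nowMs - lastMs) / 1000.0f;   /* seconds since last update */
        lastMs = nowMs;

        angle += 90.0f * dt;                        /* e.g. 90 degrees per second */
        if (angle >= 360.0f) angle -= 360.0f;

        glutPostRedisplay();
}

Register it with glutIdleFunc(idle) and use angle in your display callback; the rotation rate then stays the same no matter how fast frames are drawn.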

Yeah, I didn’t expect there would be much of an advantage of <time.h> vs. the GLUT timer. I figured that behind the scenes the GLUT timer is probably just using some implementation of <time.h>, so the odds of one being better or worse were pretty slim. :slight_smile: It was just sort of a curiosity thing for me really… Milliseconds is probably good enough; I just wasn’t sure if one of them was any more efficient than the other.

I’ve never heard of glfw before. I’ll take a look at that too!

I don’t know. I’ll confess to really not being a big “expert” or anything; I just notice a few restrictions (like no ability to lock the window size!) and such. I’m told that SDL has a lot of other multimedia abilities too… more than just OpenGL – being able to add sound, etc. I haven’t seen that in GLUT, although I’m certainly not going to say that means it isn’t there, because… I’m still (fairly) new at this.

I doubt that SDL is any better or worse than GLUT here. Remember, these toolkits will not affect OpenGL’s performance. The advantage to GLUT is that it is designed specifically for GL and nothing else, whereas SDL has a broader range of priorities, which is why it seems more awkward.

IMO you shouldn’t lean on the toolkit too much anyway, so it is better to have a minimal one, such that you work with GL directly rather than through SDL wrappers. In the end, for “serious” work, you would probably want to use none of the above and work directly with the windowing system, so the less dependent on the toolkit you become, the better. Of course, when you are starting out you may want a bunch of SDL bells and whistles to make things easier, rather than having to do a lot of low-level work. $0.02.

Vis-à-vis timing, the problem with GLUT and other timers is that they will not account for the time that elapsed between calls without some extra work. You cannot have a threaded timer that just calls your frame draw every 20 msec or something.

What I’m doing right now involves ntp_gettime() and glutTimerFunc:


/* needs <GL/glut.h> and <sys/timex.h> (for struct ntptimeval / ntp_gettime) */
void tick (int frame) {
        struct ntptimeval now;
        static long secs, usecs;        /* time stamp of the previous tick */
        long dif;                       /* microseconds elapsed since that tick */
        int msecs = 20;                 /* delay until the next tick */

        /* throttle frame rate, ideally to 50 (one frame per 20000 usec) */
        ntp_gettime(&now);

        if (frame) {
                /* compute dif, handling a rollover across a one-second boundary */
                if (now.time.tv_sec > secs) dif = 1000000 - usecs + now.time.tv_usec;
                else dif = now.time.tv_usec - usecs;

                /* already late: fire immediately; otherwise wait out the remainder */
                if (dif >= 20000) msecs = 0;
                else msecs = (int)((20000 - dif) / 1000);
        }
        secs = now.time.tv_sec;
        usecs = now.time.tv_usec;

        if (!Pause) drawScene();        /* Pause and drawScene defined elsewhere */
        glutTimerFunc(msecs, tick, ++frame);
}

Beware that timers work on system ticks and are inexact. For example, that SHOULD throttle the frame rate to 50, but measured against the hardware clock, it can end up being 70-100.

This at least allows you some control. I don’t think there is a means of setting the frame rate exactly, since the hardware clock operates in seconds (although you could use it to set “msecs” every few seconds, rather than every frame – I have not tried).
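Untried, but the “set msecs every few seconds” idea would look roughly like this (just a sketch – countFrame() and the names in it are made up, and it would be called once per drawn frame):

#include <stdio.h>
#include <time.h>

static long   frameCount = 0;   /* frames drawn since the last check */
static time_t lastCheck  = 0;   /* hardware-clock time of the last check */

void countFrame(void)
{
        time_t now = time(NULL);

        frameCount++;
        if (lastCheck == 0) lastCheck = now;

        if (now - lastCheck >= 5) {     /* re-measure every ~5 seconds */
                double fps = (double)frameCount / (double)(now - lastCheck);
                printf("measured frame rate: %.1f fps\n", fps);
                /* here you could nudge "msecs" up or down toward the 50 fps target */
                frameCount = 0;
                lastCheck  = now;
        }
}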

Yeahhhhh… I kind of always got that impression, but I hate the lack of code portability that comes with it, you know? But I can see what you mean – a lot more efficient to cut out the middle toolkit work…

Vis-à-vis timing, the problem with GLUT and other timers is that they will not account for the time that elapsed between calls without some extra work.

Ah yeah… I had actually meant tracking movement of something that should move in some sort of “realistic” time rather than frame rate, but that’s actually some really interesting coding and I’ve already copy-pasted it to chew on later :slight_smile: I’ve been looking at stuff like this: if an object should revolve six times per second (a completely arbitrary example), keeping track of time to work out how far it should have revolved by now. I’ve come up with what SEEMS to me like a decent method, though I can see where it could run into trouble after a while… I just keep track of the start time, the current time, and the time spent frozen (not moving for whatever reason), so I can subtract that out and, if movement starts again, it doesn’t jump ahead. Anyway, I’m doing it with clock()/CLOCKS_PER_SEC, but I haven’t tried letting it run for an extended period, so I’m not sure how well that holds up…
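Roughly what I mean, sketched out (all the names are made up, and I haven’t battle-tested it yet):

#include <math.h>   /* fmod – link with -lm */
#include <time.h>   /* clock(), clock_t, CLOCKS_PER_SEC */

#define REVS_PER_SEC 6.0   /* the arbitrary "six revolutions per second" example */

static clock_t startTicks  = 0;   /* when the animation began */
static clock_t pausedTicks = 0;   /* total ticks spent frozen so far */
static clock_t pauseBegan  = 0;   /* when the current freeze started */

void startAnimation(void) { startTicks  = clock(); }
void beginFreeze(void)    { pauseBegan  = clock(); }
void endFreeze(void)      { pausedTicks += clock() - pauseBegan; }

/* Angle in degrees, derived from elapsed unfrozen time rather than frame count.
 * Caveat: clock() measures processor time rather than wall-clock time, which is
 * one reason it may drift over a long run. */
double currentAngle(void)
{
        double seconds = (double)(clock() - startTicks - pausedTicks) / CLOCKS_PER_SEC;
        return fmod(REVS_PER_SEC * seconds * 360.0, 360.0);
}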

I actually always use a toolkit myself :lol: but I’m also a beginner. Not trying to discourage you from using SDL, just to let you know that the apparent “lack of functionality” in GLUT is intended as a feature and not indicative of some flaw. If all you want to do is use OpenGL, I found it more straightforward – there is less to mess with. But SDL is a pretty great thing.

I had actually meant tracking movement of something that should move in some sort of “realistic” time rather than frame rate

I guess it depends on how close to “real time” you need to be. Like I said, the only way you will get true, accurate seconds is to use the hardware clock – time() – which only operates in whole seconds, not fractions thereof. It is totally independent of the CPU.

I think CLOCKS_PER_SEC will be prone to the same discrepancies I described with the “hi-res timer”, which does rely on the CPU. I know that with Linux, and probably all other OSes as well, it is set by the kernel very early in the boot process after a test, or else it is just a predefined value. The issue is that the actual number of ticks per second can vary quite a bit.

Yeah, I’ve heard about some of those problems. Time in programming = frustrating. A struct timeval would be useful, but I’ve heard the microseconds count can be off by something like >10000…
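For reference, gettimeofday() is the usual POSIX way to fill a struct timeval – a minimal sketch, with the caveat that tv_usec is only as accurate as the system clock behind it:

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
        struct timeval t0, t1;
        double elapsed;

        gettimeofday(&t0, NULL);
        /* ... do some work ... */
        gettimeofday(&t1, NULL);

        elapsed = (t1.tv_sec - t0.tv_sec) + (t1.tv_usec - t0.tv_usec) / 1e6;
        printf("elapsed: %f seconds\n", elapsed);
        return 0;
}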