Speed: Linux vs. W2000

Originally posted by zen:
btw: marcus, are you sure that SDL uses gettimeofday? If it does, then it should give microsecond accuracy. At least it says so in the man page.

Uint32 SDL_GetTicks (void)
{
#ifdef USE_RDTSC 
	unsigned long long now;
	if ( ! cpu_mhz1000 ) {
		return 0; /* Shouldn't happen. BUG!! */
	}
	rdtsc(now);
	/* Convert elapsed CPU cycles to milliseconds */
	return (Uint32)((now-start)/cpu_mhz1000);
#else
	struct timeval now;
	Uint32 ticks;

	gettimeofday(&now, NULL);
	/* Integer division by 1000 discards the sub-millisecond part */
	ticks=(now.tv_sec-start.tv_sec)*1000+(now.tv_usec-start.tv_usec)/1000;
	return(ticks);
#endif /* USE_RDTSC */
}
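
(For reference: rdtsc() above is not a standard library call, and start and cpu_mhz1000 are module-level variables set up when the timer subsystem is initialized. On 32-bit x86 with gcc, such a macro is typically a one-line inline-assembly wrapper around the RDTSC instruction, something like the sketch below. SDL's actual definition may differ.)

/* Sketch of a typical rdtsc macro for 32-bit x86 gcc; "=A" stores the
 * 64-bit cycle counter from the edx:eax register pair into t. */
#define rdtsc(t) __asm__ __volatile__ ("rdtsc" : "=A" (t))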

I think USE_RDTSC is not defined (i.e. RDTSC is not used), but I may be wrong.

In any case, you are right that gettimeofday() usually gives you microsecond resolution. Unfortunately, SDL_GetTicks() truncates that to milliseconds regardless of how fine the underlying timer is. That is why you can never get better than 1 ms resolution with SDL.
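
If you need sub-millisecond timing on Linux, one workaround is to call gettimeofday() yourself instead of going through SDL_GetTicks(). A minimal sketch, assuming a POSIX system (elapsed_usec is just an illustrative helper, not an SDL function):

#include <stdio.h>
#include <sys/time.h>

/* Elapsed microseconds between two timevals. */
static long long elapsed_usec(const struct timeval *t0, const struct timeval *t1)
{
	return (long long)(t1->tv_sec - t0->tv_sec) * 1000000LL
	     + (t1->tv_usec - t0->tv_usec);
}

int main(void)
{
	struct timeval t0, t1;

	gettimeofday(&t0, NULL);
	/* ... code being timed ... */
	gettimeofday(&t1, NULL);

	printf("elapsed: %lld us\n", elapsed_usec(&t0, &t1));
	return 0;
}

Note that this measures wall-clock time, so it is subject to the same caveats as SDL_GetTicks(); it just skips the division by 1000 that throws away the microseconds.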