Here’s my problem:
Every time the program renders the view I use the following code, so that it runs at the same speed on every computer:
When an object moves I multiply the movement by the float Move (see above). Example:
Although this works, it doesn’t work well. Is this the right way to do it, or should I just lock the frame rate (that seems stupid, though)?
You should look into QueryPerformanceTimer… that is more exact. I don’t know if your code is correct though.
Well, I don’t know if mine is correct, but it still works (I have tested it). Could you give me an example of that QueryPerformanceTimer? I found only QueryPerformanceCounter on MSDN, but I think it has something to do with it.
Yeah! QueryPerformanceCounter… Sorry for that…
OK, define your movement in units per second.
So object1 will travel at 20 meters per second, and object2 will travel at 50 meters per second.
start_time = GetTickCount();
//do your app processing
//and update your position
time_taken = GetTickCount() - start_time;
//Now you have to scale time_taken to a value in seconds.
//(divide by 1000.0f, not 1000 -- integer division would round
//any frame shorter than a second down to zero)
scalar = time_taken / 1000.0f;
So this should do it? Are there any errors in the code (I calculate the movement in the draw function)? It still doesn’t run smoothly; for example, if I have Winamp playing some music, the fps is around 300, but there are still lots of times when the object “jumps” (doesn’t move smoothly). What could the problem be?
void Render( void (*DrawFunc)(void) )
{
    static DWORD sumtick = 0;
    static int frames = 0;
    DWORD tick, time;

    tick = GetTickCount();
    DrawFunc();
    SwapBuffers( hdc );
    time = GetTickCount() - tick;

    Move = time/10.0f + 0.00001f;  // small epsilon so Move is never exactly zero

    sumtick += time;
    frames++;
    if( sumtick >= 1000 )
    {
        Fps = frames;
        sumtick = 0;
        frames = 0;
    }
}
I get jumps sometimes too with a similar function. I think it may just be due to context switching by the OS. Do you get the same problems running your app on a different system? At the moment I am running on my old processor (I burnt my new one), but when I ran my new one (T-Bird 1200) it didn’t jump at all.
You might already have thought about it but anyway…
You are calculating the object movement speed for the frame to be drawn based on the time it took to draw the previous frame.
So… if the frame to be drawn is going to have a lot more triangles than the previous frame, it will take longer to draw, and therefore the object movement speed is really going to be too slow for the current frame.
Unfortunately I haven’t come up with a solution. How are you going to know ahead of time how long a frame will take to draw?
It isn’t that noticeable though, especially if vsync is off.
So maybe it’s not this you are experiencing.
You could try setting your process priority to realtime. That has fixed the “jumping” on some computers I’ve come across. Beware, though, that other processes become somewhat disabled… hehe
Sorry can’t remember the win32 api calls for setting priority right now… SetPrioritysomething… I think.
Ah, its SetPriorityClass(). Pass in processhandle and REALTIME_PRIORITY_CLASS.
But it’ll probably starve pretty much everything but your app. And I remember that alt-tabbing became cumbersome. (I detected when the user pressed alt, then set normal priority, and switched back to realtime when the app came into focus again…)
Look, my 100th post!
One thing I notice in your last post is that ‘time’ is a local variable. Thus you are only calculating your FrameTime, not your FrameTime + InterFrameTime. With WinAmp in the background, the InterFrameTime may include disk accesses upwards of 50ms. You calculate the entire Main loop time, as such:
time = GetTickCount();  // once, before entering the main loop
while( running )
{
    delta = (GetTickCount() - time) / 1000.0f;  // float divide, gives seconds
    time = GetTickCount();
    // Now delta is measured across the entire
    // main loop, not just the frame time.
    DrawFrame( delta );
}
Hope that helps. You will still get ‘jumps’ if you’re running stuff in the background. There is no way to avoid this in a non real-time OS (raising the priority of your process does not make it ‘real-time’, but it may smooth out the bumps a little).
Thanks, everyone, this helped me a lot. Unfortunately I’m still at the starting point, but at least I know the code should be OK. Thank you!