# Rotation Speed

I have a problem: when I change computers, my program doesn’t rotate the items at the same speed as on my PC… and in the worst case it hardly rotates at all, or only very, very slowly.

How can I find out what rotation speed a given computer can support (min-max values — can I see something like that using GLinfo)?

Can anyone help me?

The reason why your rotations look different is because your program is being run at different speeds. The faster the computer, the faster the rotation, am I right?

The way to fix this is to let the frames per second (FPS) influence the time-step when you integrate your angular velocity. Or in English:

I guess you have the variables rotationSpeed and currentRotation. Each frame you then compute (this is the integration part):

currentRotation += rotationSpeed*dt

where dt is the time-step that I was talking about earlier. The trick here is to set:

assert( FPS > 0 );
dt = 1/FPS
currentRotation += rotationSpeed*dt

which means that your rotation speed will be the same in “world time” on all machines.

OK, I will try to do what you’re saying, but…

The other PC has a much better graphics card than my PC, and it doesn’t rotate at all… it’s like it’s trying to rotate, but in spasms…

Basically, you have to make the rotation a function of (real) time. So, you run a timer in your program, and update the position of the object based on the timer value. This way, the position becomes independent of the FPS.

Example: let us say you want an object to perform a full rotation in 30 seconds. In pseudocode:

```
rotation = (timer.timeElapsed() / 30) * 360;
```

For your problem with the other PC… it may be that the other card is so fast that you don’t actually see any rotation. Say you are rotating 5 degrees per frame and it renders at 1000 FPS: your eye (which resolves roughly 30-50 FPS) won’t perceive a smooth rotation. Add the possibility of disabled vsync and tearing… and you have the “spasms” you describe.


Perhaps it’s actually doing an entire revolution of rotation so fast that you only catch it every once in a while?

How can I use a ‘timer’ class with OpenGL?

Check out glutTimerFunc. Some info about it:

http://www.opengl.org/resources/faq/technical/glut.htm
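A small sketch of how glutTimerFunc is typically used (the tick interval of 25 ms and the step of 2 degrees are arbitrary choices for illustration). One gotcha worth knowing: glutTimerFunc is a one-shot timer, so the callback has to re-register itself to get a steady tick:

```c
#include <GL/glut.h>

static float angle = 0.0f; /* current rotation, in degrees */

static void tick(int value)
{
    (void)value;                 /* unused user data */
    angle += 2.0f;               /* advance 2 degrees every tick */
    if (angle >= 360.0f)
        angle -= 360.0f;         /* wrap back into [0, 360) */
    glutPostRedisplay();         /* ask GLUT to redraw the scene */
    glutTimerFunc(25, tick, 0);  /* re-arm: the timer only fires once */
}

/* In main(), after creating the window and registering the display
 * callback, start the chain once:
 *
 *     glutTimerFunc(25, tick, 0);
 *     glutMainLoop();
 */
```

Note that a fixed-interval timer gives a fixed rotation rate regardless of how fast the card renders, which is exactly what the earlier posts are after.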