I have a problem with moving textures in OpenGL (using LWJGL).
In my app the character moves on the ground to the next tile.
The distance for X is computed like this:
float movex = 100f * time; // where time is the delta time per frame
When I move like this, the whole tile map starts to flicker (in particular textures with a black border).
This happens because the time is not constant from frame to frame (e.g. time = 0.01 in frame 1, time = 0.0003 in frame 2, …).
If I move like this:
float movex = 100f; //no time is taken
the world and the character move smoothly to the next tile.
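For reference, the per-frame time is measured roughly like this (a minimal sketch assuming System.nanoTime(); my actual timer code may differ):

public class FrameTimer {
    private long lastFrame = System.nanoTime();

    // Returns the seconds elapsed since the previous call (the "time" used above).
    public float getDelta() {
        long now = System.nanoTime();
        float delta = (now - lastFrame) / 1_000_000_000f;
        lastFrame = now;
        return delta;
    }
}

and in the render loop: float movex = 100f * timer.getDelta();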
My question is: how do I compute the time so that
- the drawn tiles do not flicker/shake, and
- the time to move from tile A to tile B is the same on every PC?
I now update the time per frame only once every second, but the flickering does not stop.
I think it has something to do with the float value I compute as the distance.
Could the fractional part of the float mess up the tile map?
Is there any option I can set to tell OpenGL "not to be so hard" on the float glVertex coordinates?
Flickering? That should (almost) never happen with double-buffering and vsync both enabled.
What is the granularity of your time measurement? Most millisecond timers are in fact updated less than a hundred times per second. You can verify this by printing the time in a tight loop (println(time)) to get an idea.
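Something like this quick test will show it (just a sketch that prints consecutive readings of both timers; long runs of identical values reveal the granularity):

public class TimerGranularityTest {
    public static void main(String[] args) {
        for (int i = 0; i < 500; i++) {
            // Repeated identical values show how coarsely each timer is updated.
            System.out.println(System.currentTimeMillis() + "\t" + System.nanoTime());
        }
    }
}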
I made some tests and found out that when I move (think only in the x direction) from 0.0 to 0.50, I get this picture:
If I move 0.01 further (= 0.51),
then I get this one:
As you can see, in the second picture there are parts of a black border on the fence.
If I move 0.01 px more to the right (0.52), it looks like the first picture again! It happens at every .5 px position…
Using GL_LINEAR filtering will help smooth the moving texture a bit.
But the only correct way for pixel art is to always move on integer coordinates.
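A rough sketch of both suggestions, assuming LWJGL's GL11 static bindings (the class and method names are just for illustration):

import static org.lwjgl.opengl.GL11.*;

public class TileRenderHints {
    // Linear filtering on the currently bound texture smooths sub-pixel movement a bit.
    public static void useLinearFiltering() {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }

    // Keep the fractional position for the game logic, but snap the draw position
    // to a whole pixel before passing it to glVertex.
    public static int snapToPixel(float position) {
        return Math.round(position);
    }
}

So you can still accumulate movex as a float, but draw the quads at snapToPixel(x) (e.g. with glVertex2i) so they always land on whole pixels.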