Texture Flicker (Orthographic 2D)

I’ve looked around for a while for a solution to this problem, to no avail. The problem is that I am drawing tiles of 64.0f by 64.0f from tile sheets of 128.0f by 128.0f. The size of the tile is divided by the size of the tile sheet to get a texture coordinate step (based on the tile it needs to draw).

I have checked these values, and they are 0.0f, 0.5f and 1.0f, which are correct coordinates for drawing the texture. When stationary, the tiles draw fine. When the camera is moving, however, I run into a problem: the textures flicker every now and again. It happens very fast, but I was able to get a screenshot of one of the flickers occurring.

I have drawn fat red arrows pointing to the problem. What confuses me is that this problem only occurs on a single row of tiles at a time when it happens, and it only happens while the camera is moving.

I have read in places that setting the projection’s right and bottom edges 0.5f smaller helps; that adjustment is in use in the screenshot, and it obviously doesn’t solve the problem.

Does anyone know what is happening here? I would post some code, but I honestly don’t know where the problem is occurring, so please request anything you need.

Are you double-buffering and v-syncing?

I am double-buffering. I have not explicitly enabled v-sync, though the program will not run over 60 fps, so I assumed it was already v-synced.

The line does tend to appear at the same place in the window as each tile passes that point. The grass tiles appear to be taking one pixel too many from the sprite sheet (65 pixels instead of 64, though the coordinate is always 0.5f).

Could it have something to do with the camera coordinates not being integral? The tiles do not move from their vertex coordinates, but the camera can end up at a coordinate such as 3.8f with the way I have set it up. I don’t see this being the problem, however; it’s just some additional information about what my application is doing.

It’s hard to tell from your screenshot if this is happening at the same line on the physical display, or at some arbitrary point on the rendered surface.

If it’s the former, it seems like a hardware problem to me, i.e. some kind of tearing.

If it’s the latter, then it may be maths-related as you say, or could it be some kind of z-fighting? When you draw these texture tiles, are they one quad or lots of quads, and how are you generating their geometry?

The line is happening at the same place. In the screenshot, it’s a few tiles below the stick figure, and it flickers at that location each time the bottom of a tile passes it (it does occur at other places too, but those also seem to be the same places each time, such as a few tiles above the stick figure).

The tiles are drawn using a vertex array, as triangle strips:

// Point at the per-tile coordinate and vertex arrays, then draw the quad
glTexCoordPointer(2, GL_FLOAT, 0, textureCoordData);
glVertexPointer(3, GL_FLOAT, 0, vertexData);
glColor4f(1.0f, 1.0f, 1.0f, getAlpha());
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

All the tiles are drawn with a Z value of 0.0f at this time, but they are always drawn in the same order each frame. The texture coordinate is obtained using a coordinate step, which denotes how big each tile is in texture-coordinate terms (i.e. if the texture is 128.0f and the tile is 64.0f, each tile covers a 0.5f texture-coordinate step).

xCoordStep = getWidth() / textures[getTexture()].getWidth();
yCoordStep = getHeight() / textures[getTexture()].getHeight();

I do think this method could be problematic in the future, but at the moment it always obtains the values of 0.0f, 0.5f and 1.0f.

The vertex data for the tiles never changes. The vertices of adjacent tiles touch each other (i.e. one tile ends at 64.0f and the next starts at 64.0f).

However, this problem only occurs when I’m moving the camera. As mentioned, the camera can be located at a non-integral position such as 3.8f. But from what I see, the tile is for some reason taking an extra pixel of the texture even though the coordinate is always 0.5f, and the line always occurs at the same place in the window, as stated in the first paragraph. Here is a screenshot of the tile sheet with an arrow pointing to the row of pixels it seems to be taking.

I’ve seen similar things in my landscape code and used a kind of overdraw to get around it. Basically, it was because some of the vertices were not matching up as I expected, so I was coming from that perspective with a couple of my suggestions.

I can’t be much more help really other than to suggest toggling depth buffer writing and also depth testing (if you have it on), but that’s just to eliminate weird z stuff as a possibility.

How is the “stick guy” drawn? It seems a bit significant that it’s happening above and below him…

I appreciate your help.

The stick figure is drawn in exactly the same way. Everything drawn on the screen derives from a sprite class, which draws using the vertex buffer as previously mentioned.

I have been playing about with the projection (since for some reason the tiles weren’t drawn at full size), and the problem is consistent now. It was the texture being off by one pixel, but only when moving the camera. Every tile now has one extra pixel.

When the camera is not moving however, the texture renders fine.

So I have pretty much concluded that the camera has something to do with it, but I have tried everything, and it seems that just moving the camera breaks the texture coordinates. Has this been a problem for you?

I am by no means an expert on the older fixed-function aspects of OpenGL, having spent most of my time with GL working with shaders, but just thinking out loud: perhaps this has something to do with why older texture implementations had the ability to have 1-pixel borders/skirts…

Maybe someone else with a bit more experience with that could shed some light?

Could you show us how you set up the tile textures? Do you use linear filtering? What wrapping mode do you use?

Maybe change the texture clamp mode to GL_CLAMP_TO_EDGE?
Make sure you don’t have antialiasing enabled.
Try zooming in; the artifacts may be easier to see.

What is your video card/OS/driver version?
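For reference, the wrap and filter setup I’m suggesting looks something like this while the tile texture is bound (a sketch; tileTexture is a placeholder name):

```cpp
// Keep samples from bleeding past the texture edge, and use
// nearest filtering so tile pixels are not blended with neighbours
glBindTexture(GL_TEXTURE_2D, tileTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```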

Many thanks for your help guys.

The function for my texture calculates the coordinates based on a frame given. Each tile has an ID, and it loops from 0 to that ID and then generates the coordinates. The ID setup for the tilesheet posted above looks like this.

0 1
2 3

Here is the code that generates and sets the texture coordinates.

void engSprite::setFrame(GLint val)
{
	itsFrame = val;

	// Get the relative texture coordinate per tile
	static GLfloat xCoordStep, yCoordStep;
	xCoordStep = getWidth() / textures[getTexture()].getWidth();
	yCoordStep = getHeight() / textures[getTexture()].getHeight();

	// Get the texture coordinates based on the frame
	static GLfloat textureX, textureY;
	textureX = 0.0f;
	textureY = 0.0f;
	for(GLint i = 0; i < getFrame(); i++)
	{
		textureX += xCoordStep;
		// 0.98f is used instead of 1.0f in case the division rounds in an unusual way
		if(textureX >= 0.98f)
		{
			// End of the row, move down to the next row
			textureX = 0.0f;
			textureY += yCoordStep;
		}
	}

	// The texture is flipped vertically since the loaded texture is upside down
	// Bottom Left
	textureCoordData[0][0] = textureX + 0.01f;
	textureCoordData[0][1] = textureY + yCoordStep - 0.01f;

	// Top Left
	textureCoordData[1][0] = textureX + 0.01f;
	textureCoordData[1][1] = textureY + 0.01f;

	// Bottom Right
	textureCoordData[2][0] = textureX + xCoordStep - 0.01f;
	textureCoordData[2][1] = textureY + yCoordStep - 0.01f;

	// Top Right
	textureCoordData[3][0] = textureX + xCoordStep - 0.01f;
	textureCoordData[3][1] = textureY + 0.01f;
}

First of all, notice the +0.01f and -0.01f. Prior to posting this, I added this offset around the edges of the texture coordinates, and the tiles now draw nicely. I wanted to avoid doing this, but I could live with calculating how big one pixel is as a texture coordinate and removing it from every side.

Now I have tried several things that didn’t work, as you all suggested (thanks again for the time to make the suggestions):

  • The clamping did nothing. The problem is occurring in the centre of the texture, so the clamp won’t change anything.

  • I flipped the texture as commented, but even when drawing it upside down (the way it is loaded), the problem still occurs.

  • I messed around with the 0.98f (made it 1.0f, 0.99f etc.) to no avail.

  • I also tried making the variables non-static, but since they are assigned at the start of each call, this can’t have been the problem; I tried anyway since statics have caused me problems in the past.

Now, as mentioned in previous posts, the texture coordinates work fine when the camera is still (I have been calling it a camera, but of course it just translates the world in the negative direction of the intended camera movement), but when the camera moves I get that 1-pixel offset on my texture coordinates (though they are all still 0.0f, 0.5f, 1.0f).

I really can’t understand how moving the camera can have this effect on textures. Maybe there is something missing in my texture coordinate generating function that you can see now.

I have an NVIDIA 8800 GT (driver updated last night, since that was brought up by scratt) and Windows Vista 32-bit. Anti-aliasing is not enabled, and I have tried explicitly disabling it to no avail. I have set both magnification and minification filtering to both GL_LINEAR and GL_NEAREST, but neither changed anything.

Thank you again guys, I appreciate your time.

Zoom in on the artifacts to see them better.