Texture Flicker in the Distance

I have created a small scene like the one in the attached sketch (the red thing is where the camera is; basically I’ve drawn the scene sideways to make it clearer). I have a textured quad for the ground and, in the back, another textured quad for the sky. I give the impression of the sky moving by changing its texture coordinates each frame. After that, I’ve added several quads (the purple ones) textured as trees. The textures are TGAs with alpha masks so that I can have transparency for my 2D trees.
I have two issues when rendering this scene:

  1. The first is that every time my sky texture moves, it flickers.
  2. The second is that whenever I move the scene left or right, the trees flicker badly, especially the ones in the distance. The farther away they are, the worse the flickering.

Here are the parameters I use when creating my textures:

After adding anisotropy I managed to fix the flickering on the sky, but it did nothing for the trees. Also, I queried the largest supported anisotropy and got 2.0, which doesn’t solve the problem, but when I hardcode 8.0 it works.
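For reference, here is a minimal sketch of the kind of texture setup being discussed. The original post’s actual parameters weren’t shown, so the filter modes and the 8.0f value below are assumptions, not the poster’s code:

```c
/* Hypothetical texture setup -- filter/wrap modes and the 8.0f anisotropy
 * value are assumptions; the original parameters were not shown. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

/* Query the driver's reported limit, then set the desired amount.
 * Values above the reported maximum are not guaranteed to work. */
GLfloat maxAniso = 1.0f;
glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);
```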

I’ve tried finding solutions for this on the net but I’m really stuck, and I would appreciate any suggestions.

You need mipmapping to resolve this.

Thanks for the quick answer. I’ve tried replacing
and my textures become white. I’ll look into this, but should that be enough, or do I need to generate the mipmaps too?

I think (I am a beginner too) that you have to call glGenerateMipmap to compute the mipmaps.

I deliberately left out information on how to generate the mipmap chain precisely because there are so many different ways of doing so, but in a nutshell - yes, you do need to specify the submip levels too. In order to help more, you would need to give us some info on which OpenGL version(s) you are targeting. glGenerateMipmap may or may not be available, you may or may not have GL_SGIS_generate_mipmap, glTexStorage may or may not be available, you may or may not be able to use dear old gluBuild2DMipmaps, if it’s a non-power-of-two texture things might get complicated, or you may even wish to generate the submips manually in code.
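As a rough map of the options just listed, a sketch (assuming `w`, `h`, and `pixels` hold already-loaded RGBA8 image data; use whichever path your target GL version supports, not all of them):

```c
/* Option 1: GL 3.0+ (or ARB_framebuffer_object) -- generate after upload. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glGenerateMipmap(GL_TEXTURE_2D);

/* Option 2: GL 1.4 - 2.x (or GL_SGIS_generate_mipmap) -- set the flag
 * BEFORE uploading, and the driver builds the chain automatically. */
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* Option 3: legacy GLU -- also rescales non-power-of-two sources. */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA8, w, h,
                  GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```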

I managed to pull it off with GL_GENERATE_MIPMAP in glTexParameter, because I’m testing on an old computer, so I stuck to the old OpenGL version in the Microsoft SDK. Movement is now much smoother, but I have a new issue with some of the farther textures: generating mipmaps seems to have ruined their alpha masks when they are drawn farther away.

Also, on my old machine, the application crashed when trying to use non-power-of-two textures. I didn’t have this issue on the newer one, though.

You really need to use pre-multiplied alpha when using alpha in conjunction with filtering (bilinear interpolation, mipmaps, or both). This means that the texture values must be (a*r, a*g, a*b, a) rather than (r, g, b, a), and blending should use glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA). Some texture loading libraries have the option to return pre-multiplied data; otherwise you need to do the conversion yourself prior to passing the data to glTexImage2D() or glTexSubImage2D().
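The conversion itself is a single pass over the pixels. A minimal sketch (the function name is mine, not from any library), operating on straight-alpha RGBA8 data before it is handed to glTexImage2D():

```c
#include <stddef.h>

/* Convert straight-alpha RGBA8 pixels to pre-multiplied alpha in place.
 * Each color channel is scaled by alpha/255; +127 rounds to nearest. */
void premultiply_rgba(unsigned char *px, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        unsigned char *p = px + i * 4;
        unsigned a = p[3];
        p[0] = (unsigned char)((p[0] * a + 127) / 255);
        p[1] = (unsigned char)((p[1] * a + 127) / 255);
        p[2] = (unsigned char)((p[2] * a + 127) / 255);
    }
}
```

After uploading the converted data, render with glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) as described above.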

Non-power-of-two textures require OpenGL 2.0 or later or the ARB_texture_non_power_of_two extension.

Note that gluBuild2DMipmaps() doesn’t require texture sizes to be a power of two; the source data will automatically be scaled up or down to the nearest power of two if it isn’t already. Similarly, the source data will automatically be scaled down if it exceeds the implementation’s maximum texture size. The main drawback is that you don’t know in advance what size the texture will be, which can be a problem if you need to replace portions of it with glTexSubImage2D().

Well, I’ve been using Photoshop to save my TGAs (and using glBlendFunc with GL_SRC_ALPHA), and after some searching on their forum I ran into this, posted by one of the staff members:

Photoshop can’t premultiply when saving as TGA, because we don’t support transparency in TGA, just an alpha channel.

So that means that I should think about switching over to a new format, or use this inelegant alternative: http://www.gamedev.net/topic/533676-mipmapping-and-transparency/

It’s not necessary for the image format to support pre-multiplied alpha directly. You can just convert the data after loading it.

Thanks, I’ve changed the function for reading the TGA so that when it converts from BGR to RGB it also multiplies each channel by the alpha value (divided by 255.0f, of course), and it worked like a charm. It now looks really good on the newer computer, but on the older one things that aren’t far from the camera look pixelated. I’ve tried using glHint( GL_GENERATE_MIPMAP_HINT, GL_NICEST);
but it doesn’t seem to have any effect on either computer.
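For anyone following along, the loader change described above might look something like this. The function name and layout are my own guesses at the poster’s code, combining the BGR(A)-to-RGB(A) swap with the divide-by-255.0f pre-multiply in one pass:

```c
#include <stddef.h>

/* Sketch of the described TGA loader change: swap BGRA to RGBA and
 * pre-multiply each color channel by alpha in the same loop.
 * (Function name is hypothetical, not the poster's actual code.) */
void bgra_to_premultiplied_rgba(unsigned char *px, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        unsigned char *p = px + i * 4;
        unsigned char b = p[0], g = p[1], r = p[2];
        float a = p[3] / 255.0f;       /* alpha divided by 255.0f */
        p[0] = (unsigned char)(r * a + 0.5f);
        p[1] = (unsigned char)(g * a + 0.5f);
        p[2] = (unsigned char)(b * a + 0.5f);
    }
}
```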

That would occur if the magnification filter is nearest rather than linear. Mipmaps are only used for minification, i.e. when each base-level texel maps to a screen area smaller than a pixel (i.e. when the texture scale factor is less than one).
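Concretely, the usual trilinear setup looks like this; note the mipmap variant goes on the minification filter only, since magnification has no smaller mip levels to sample from:

```c
/* Trilinear filtering: mipmapped minification, plain linear magnification.
 * (GL_*_MIPMAP_* constants are invalid for GL_TEXTURE_MAG_FILTER.) */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```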

Be sure to use glGetError() to check that none of the functions are failing (e.g. due to using newer features on systems which don’t support them).
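Since glGetError() only returns one flag at a time, a small drain loop is the usual pattern:

```c
#include <stdio.h>

/* Report every pending GL error; the GL can queue several, so loop
 * until GL_NO_ERROR.  Call this after any suspect block of GL calls. */
GLenum err;
while ((err = glGetError()) != GL_NO_ERROR) {
    fprintf(stderr, "GL error: 0x%04X\n", err);
}
```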

None of the filters use nearest; they’re all set to linear. I’ll check and see if any of the functions fail. Thanks everybody for the tips, this helped a lot. My second question, the one about the flickering and the anisotropic filter, got overlooked, so I don’t know if I should ask this here or start a new thread, but my only question is: why does querying GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT return a largest supported anisotropy of 2.0f, which still leaves flickering, while manually setting it to 4 (or 8, or 16) stops the flickering?