Hi, I have a small game that uses tiles. The program draws images fine in windowed mode, where the images do not need to be stretched. However, when I make my program go fullscreen and change the size of my viewport, the rendered tiles sometimes flicker at the edges between tiles. I have attached an image where there is a large horizontal line between some of the tiles.

I have tried a lot of things to fix this. I have updated my shaders to use view and projection matrices. I have added a 0.1 offset that my view matrix is translated by (this seemed to stop a lot of the flickering, but not all of it), and I have tried messing with my texture settings. Looking across the internet, one solution that is mentioned is to put a 1 pixel gap between images on my tile sheet. To me this seems like kind of a hassle, as I would need to re-edit a lot of texture sheets, and I would prefer a solution that does not involve hand-editing them. Is there anything else I can try, or do I just need to bite the bullet on this one and open Photoshop?
I think you first want to determine whether the flickering is caused by the tile geometry not being correctly aligned, which would suggest the math that positions the tiles is off for your fullscreen resolution, or whether the flickering comes from the texture sampling pulling in texels from adjacent tiles in your texture atlas.

The first thing I'd suggest checking is whether the UV coordinates of your tile vertices are placed at texel centers. For a texture of size W x H, the normalized UVs should use values (u + 0.5) / W for u in [0, W-1] and (v + 0.5) / H for v in [0, H-1].
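In code that would look something like this (untested sketch; `tileX`, `tileY`, `tileSize`, `atlasW` and `atlasH` are placeholder names for however your atlas is laid out):

```cpp
// UV rectangle for one tile, with the corner UVs placed on texel
// centers rather than texel edges.
struct UVRect { float u0, v0, u1, v1; };

UVRect tileUVs(int tileX, int tileY, int tileSize, int atlasW, int atlasH)
{
    int x0 = tileX * tileSize;      // first texel column of the tile
    int y0 = tileY * tileSize;      // first texel row of the tile
    int x1 = x0 + tileSize - 1;     // last texel column
    int y1 = y0 + tileSize - 1;     // last texel row
    return {
        (x0 + 0.5f) / atlasW,       // center of the first texel
        (y0 + 0.5f) / atlasH,
        (x1 + 0.5f) / atlasW,       // center of the last texel
        (y1 + 0.5f) / atlasH
    };
}
```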
For mipmaps you’d have to ensure that the mip chain ends at the level where your tiles are 1x1 texel in size.
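If you ever do use mipmaps, clamping the chain is the idea; something like this (untested; `atlasTexture` is a placeholder for your atlas handle, and the 6 assumes power-of-two 64x64 tiles, since log2(64) = 6 is the level where each tile has shrunk to 1x1 texel):

```cpp
// Stop the mip chain at the level where a 64x64 tile becomes 1x1,
// so coarser levels never average texels from different tiles.
glBindTexture(GL_TEXTURE_2D, atlasTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 6);
```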
Hi, thanks for the response! I quickly tested out the change where I add 0.5 to my UV values before normalizing. I will play around with that more when I get the chance, but at first glance it seems to make the lines appearing between the textures more apparent. I attached another screenshot below that shows the tile bleeding with the updated texture coordinates. I do believe the issue is that textures are being sampled from adjacent tiles in my texture sheet. My tile sheet has a lot of black textures, so I do not think the lines are gaps in the tile geometry.
Also, maybe my knowledge of mipmaps is somewhat lacking, as I do not believe I am using them. I thought mipmaps were just for scaling textures down. I will have to look into it.
Thanks for the help. I will play around with the texture coordinates more tonight when I get the chance.
Ok. If you want to be certain, an easy way to verify this is to change the background color to something bright (so it would be visible through gaps in the tile geometry), or to switch to wireframe drawing (glPolygonMode) with a bright solid color so that you can see the edges.
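Something like this, just for debugging (`drawTiles` is a stand-in for your normal draw call):

```cpp
// Debug view: render everything as wireframe so gaps between tile
// quads show up as bright lines against the background.
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
drawTiles();                                  // your usual tile rendering
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);    // restore filled rendering
```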
They are used when the texture is minified (applied to a smaller screen-space area than its regular size), but that can still happen for tiles (e.g. your tile is 256x256 texels and it is applied to a screen area smaller than that). But if you create your textures without mipmaps, then you don't have to worry about how to scale down your tiles while ensuring the downscaling does not mix texels from different tiles.
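For reference, creating the atlas without any mipmaps could look roughly like this (untested; `width`, `height` and `pixels` are placeholders for your image data). The key part is a MIN filter that does not end in `_MIPMAP_*`, so the sampler never reads mip levels:

```cpp
// Upload the atlas as a single level-0 image with no mip chain.
GLuint atlas;
glGenTextures(1, &atlas);
glBindTexture(GL_TEXTURE_2D, atlas);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // no mipmap sampling
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```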
Hi, I changed my background color to pink. It looks like in some places it is texture bleeding, and in some places there are gaps between the tiles, which is weird to me. Maybe I need to adjust my vertices somewhat. The first image I attached is without the +0.5 change to the UVs and the second image is with it. It does look like texture bleeding still occurs with the +0.5 modifier.
As for mipmapping, I don't believe it is relevant, as I am not shrinking my textures. My tiles are 100 by 100 pixels, and my starting window and viewport are 1000 by 750; there are no tile artifacts in this windowed mode. It is when I go fullscreen and change the viewport size that the tile tearing starts. I also make sure the height and width of the viewport are integers. I basically just want the same image I have when the program is windowed, stretched across the screen when the application is in fullscreen mode.
I think I might have fixed the issue. When I calculate the max x or y value of a UV before normalizing, I subtract 0.5 from it; when I calculate the min x or y value, I add 0.5 to it before normalizing. After that, the artifacting seemed to go away (at least for now). I am not sure why this would be the case. Is it because I am forcing the sampled region to be essentially a pixel smaller, so that if OpenGL grabs an extra pixel when rendering it would still be the correct pixel?
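In case it helps anyone else, here is roughly what I ended up with (simplified from my actual code; the names are made up for the example):

```cpp
// Inset each tile's UV rectangle by half a texel on every side, so a
// filtered fetch right at the edge can't reach into a neighboring tile.
struct UVRect { float u0, v0, u1, v1; };

UVRect insetTileUVs(int tileX, int tileY, int tileSize, int atlasW, int atlasH)
{
    float minX = tileX * tileSize + 0.5f;        // pull min edges inward
    float minY = tileY * tileSize + 0.5f;
    float maxX = (tileX + 1) * tileSize - 0.5f;  // pull max edges inward
    float maxY = (tileY + 1) * tileSize - 0.5f;
    return { minX / atlasW, minY / atlasH, maxX / atlasW, maxY / atlasH };
}
```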