I have a problem mapping lightmaps correctly. In my software renderer it works perfectly, but I'm having trouble finding the right mapping under OpenGL: there are visible borders around the lightmaps. Note that my lightmaps are currently all packed into a larger ‘super-lightmap’, but the problem was also there before I did this optimization.
I will explain with an example. Assume I have a rectangular polygon which is texture mapped from 0,0 to 1,1 (in uv space), with an associated lightmap of 16x16 lumels (just an example). This lightmap is part of a bigger super-lightmap (usually 256x256 texels). Every lumel in the lightmap defines the lighting conditions at a corner of a texel block (of usually 16x16 texels). So with a 16x16 lightmap we actually have 15x15 blocks of 16x16 texels in the texture, and every texel block should be shaded according to its four neighbouring lumels (lightmap values).
The correct mapping to do this would be (ignoring offsets and scaling for the super-lightmap for now):
lm_scale_u = (lm_width-1) / (high_u-low_u);
lm_scale_v = (lm_height-1) / (high_v-low_v);
In the example above this would be:
lm_scale_u = 15 / 1;
lm_scale_v = 15 / 1;
I can see that this is the correct formula because the lightmaps of neighbouring polygons are correctly aligned even if they have different sizes. HOWEVER! There are visible borders around the lightmaps and they are really ugly. I assume they are caused by OpenGL sampling texels from outside the lightmap boundaries. The texture parameters for the super-lightmap are set as follows:
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
Any idea how I can solve this problem?
Greetings and thanks in advance,
–
Jorrit Tyberghein: Project Manager of Crystal Space: Open Source 3D Engine http://crystal.linuxgames.com