GL_LUMINANCE_ALPHA texture distorts other textures

example: two textures

first one:
glTexImage2D(GL_TEXTURE_2D, 0, 2, w, h, 0, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, img);

the second one:
glTexImage2D(GL_TEXTURE_2D, 0, 3, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, img);
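
For context, each call goes into its own texture object. A minimal sketch of the setup (the object names and filter settings here are illustrative; the actual loading code differs):

glGenTextures(1, &luminanceTexture);
glBindTexture(GL_TEXTURE_2D, luminanceTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); /* assumed filter settings */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, 2, w, h, 0, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, img);

glGenTextures(1, &rgbTexture);
glBindTexture(GL_TEXTURE_2D, rgbTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, 3, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, img);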

Displaying only the second one works fine, but as soon as I display the first one (the luminance-alpha one), the second texture gets distorted. Not showing the first one anymore does not solve the issue; the texture itself seems to be corrupted somehow. The luminance texture itself does show correctly (the alpha as well)!

When I load the first image using GL_R instead of GL_LUMINANCE_ALPHA, the second texture stays intact (although in that case the luminance texture of course does not show correctly, but that's a given).

I also tried GL_LUMINANCE and it has the same issue, which suggests to me it's something with the luminance rather than the alpha channel.

It could still be some memory issue, but I do not see how displaying a texture could mess up memory.

The drawing loop looks like:

glClear(GL_COLOR_BUFFER_BIT);

glEnable(GL_BLEND);
glEnable(GL_TEXTURE_2D);

glColor3f(1.0, 1.0, 1.0);
glBindTexture(GL_TEXTURE_2D, rgbTexture);
/* quad */
glBindTexture(GL_TEXTURE_2D, luminanceTexture);
/* quad */
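
The /* quad */ parts are plain immediate-mode textured quads, roughly along these lines (the coordinates here are placeholders, not the real ones):

glBegin(GL_QUADS);
    glTexCoord2f(0.0, 0.0); glVertex2f(-1.0, -1.0);
    glTexCoord2f(1.0, 0.0); glVertex2f( 1.0, -1.0);
    glTexCoord2f(1.0, 1.0); glVertex2f( 1.0,  1.0);
    glTexCoord2f(0.0, 1.0); glVertex2f(-1.0,  1.0);
glEnd();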

EDIT: just binding the texture already screws it up for the RGB texture on the next pass; I don't even have to display a quad for the second texture.

After a snack lunch, some lightning bolts in the distance, heavy rain, hail and bright sunshine (all in the last 20 minutes), I found the (as was to be expected) stupid problem.

I needed to set the texture parameters GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_CLAMP for the non-luminance texture. Still some strange behaviour (seeing that I could perfectly recreate it with that format specifier), but I got it fixed. In code the fix boils down to the lines below, applied while the RGB texture is bound during setup:
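
glBindTexture(GL_TEXTURE_2D, rgbTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);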

grats