glEnable(GL_BLEND) problem with mipmaps

Hi,
I’m a new OpenGL programmer and I need some help with glEnable(GL_BLEND) and mipmaps.
My problem comes up when I try to blend 2 textures that have mipmaps, with blending enabled via glEnable(GL_BLEND): I can’t see the blend at every mipmap level, only at one level.
So when the image is small I can’t see anything; I only see it once it grows.
Any solution to this problem?
P.S. When I don’t use mipmaps with the same code, the problem doesn’t appear.
Thank you so much for any reply!

Sounds like your texture MIPmaps are probably not populated correctly, but we really need more information to be sure.

Try using this:

  glTexParameteri( gl_target, GL_TEXTURE_BASE_LEVEL, 0 );
  glTexParameteri( gl_target, GL_TEXTURE_MAX_LEVEL , 0 );

to clamp the MIPmap level the GPU can sample for that texture to level 0 only. See how that looks. If it looks good, bump both values up to 1 to look at the next smaller MIPmap level, then 2, and so on, for all MIPmap levels. See how things look. If one or more levels look wrong, then you’ve found a problem.

And what blending function are you using (glBlendFunc)? Note that the default is GL_ONE / GL_ZERO, which effectively does no blending. You’ve probably set this to something, since you say you’re seeing blending in some circumstances.

Also, this may or may not be your problem yet, but it’s gonna hit you at some point: pre-multiplied alpha, i.e. a GL_ONE, GL_ONE_MINUS_SRC_ALPHA blend function.

I build mipmaps with this:


glEnable(GL_TEXTURE_2D);
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);

This way everything works correctly, but if I change:


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);

in:


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 1);

the problem appears…

I’ve also introduced glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA) in:


glEnable(GL_BLEND);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
glEnable(GL_LINE_SMOOTH);
glPolygonMode(GL_FRONT, GL_FILL);


glBlendFunc(GL_ONE, GL_ZERO);
glBindTexture(GL_TEXTURE_2D, texture1);
glBegin(GL_QUADS);
    glTexCoord2f(0., 0.); glVertex3f(-1,  1, 0.);
    glTexCoord2f(1., 0.); glVertex3f( 1,  1, 0.);
    glTexCoord2f(1., 1.); glVertex3f( 1, -1, 0.);
    glTexCoord2f(0., 1.); glVertex3f(-1, -1, 0.);
glEnd();

glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

glBindTexture(GL_TEXTURE_2D, texture2);
glBegin(GL_QUADS);
    glTexCoord2f(0., 0.); glVertex3f(-1,  1, 0.);
    glTexCoord2f(1., 0.); glVertex3f( 1,  1, 0.);
    glTexCoord2f(1., 1.); glVertex3f( 1, -1, 0.);
    glTexCoord2f(0., 1.); glVertex3f(-1, -1, 0.);
glEnd();

glDisable(GL_TEXTURE_2D);
glDisable(GL_LINE_SMOOTH);
glDisable(GL_BLEND);



How can I populate the mipmaps in the right way?
Many thanks for the reply! :slight_smile:

P.S. I use these mipmaps in other parts of the program and they are OK; the problem only comes up when I try to draw something into the mipmapped texture with blending enabled…

You should break this down even more.

  • Read back the texels of level 0 and level 1. Are they what you expect?
  • What texture format are you using for this texture?

If the above checks out, then for testing:

  • Get rid of the 2nd-pass render
  • Clear the screen to a solid color
  • Use this for the texture:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
  • Render the first pass with blending and your glBlendFuncSeparate function, and see if you get what you expect
  • If so, change BOTH GL_TEXTURE_BASE_LEVEL and GL_TEXTURE_MAX_LEVEL to 1 and retry – make sure you get what you expect

Maybe the problem is that my GLX is too old: I use GLX 1.2, and the program I’m trying to modify calls for GLX 1.3. Indeed, when the program starts, it tells me that “glXCreatePbuffer” is not supported in GLX 1.2 :-/
If i use :


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 1);

the blended texture is not created; instead I see a white surface where the texture should be blended. Could indirect rendering cause this problem? Many, many thanks for your reply!

P.S. I tried the same thing on another PC with an NVIDIA GeForce 6200 and GLX 1.3; there the problem doesn’t appear and everything works fine…
The PC with the problem is an ASUS Eee PC running Linux (Karmic Koala) with an Intel GMA 945 video card :-/

P.S. The texture format is GL_BGRA, GL_UNSIGNED_BYTE.

Try GL_RGBA8 for the internal format.

I’m no guru on the whole BGRA thing (AFAIK, that’s yet another Direct3D “I gotta be different for the helluvit” thing), but I think BGRA is not an internal texture format; it’s merely an external format describing how your data is laid out in CPU memory.

What is your texture’s internal format?

Now I create the texture with:


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, cairo_image_surface_get_data(Surface));

but the problem occurs anyway. Only when I use:


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);

does everything work… With:


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 1);

the textures are not correctly populated; they aren’t created, and in this case I see white images… When I don’t use GL_TEXTURE_BASE_LEVEL and GL_TEXTURE_MAX_LEVEL but only:


glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);	

the textures vanish when they become small, and in some cases I see other images in the texture’s mipmaps… it seems like an internal pointer error…

I’ve found the problem: the target texture into which I copy the blended buffer (I don’t know why) with:


glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, x, y, iWidth, iHeight, 0);

must be without mipmaps on the Intel graphics card; with that, the blend has no problems and shows correctly :slight_smile: … but many, many thanks for your help and see you soon :slight_smile:

Are iWidth and iHeight powers of two?

I didn’t check :slight_smile: I’ll try it and post the result afterwards, thanks :slight_smile:

It was just a question, as most modern hardware handles non-power-of-two textures pretty well, but that is not the case with any Intel “3D” card…

Good question, ZbuffeR :slight_smile:
This was the problem… with no mipmaps I didn’t see it, but with mipmaps my NVIDIA card handles the non-power-of-two case well while my Intel graphics card does not :frowning: … do you know of a tutorial that explains how I can solve this problem?
I can only change the iWidth and iHeight parameters a little… Many, many thanks :slight_smile:

If you don’t need mipmaps (i.e. the texture is never seen reduced), you can have a look at texture rectangles, which are older but easier to support on some hardware.

Another option: it is possible to sort of support NPOT textures with mipmaps on POT-only hardware, provided you have enough texture memory available:

  • create an empty texture whose width and height are “ceiled” to the next power of two (e.g. for a 15×17 texture, you need to create a 16×32 texture)
  • fill in the part you actually have
  • then, instead of using texcoords in the [0;1] range, use [0;15.0/16.0] for width and [0;17.0/32.0] for height