interpolating between textures

I want to fake a blur effect in my game. To do this, I want my engine to interpolate between a blurred and a non-blurred version of my texture. I currently do this using the texture_env_combine extension, like this:

GLfloat color[4]={1,1,1,gBlurMapFactor};
glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV,GL_COMBINE_RGB_ARB,GL_INTERPOLATE_ARB);
glTexEnvi(GL_TEXTURE_ENV,GL_COMBINE_ALPHA_ARB,GL_INTERPOLATE_ARB);
glTexEnvi(GL_TEXTURE_ENV,GL_SOURCE2_RGB_ARB,GL_CONSTANT_ARB);
glTexEnvi(GL_TEXTURE_ENV,GL_SOURCE2_ALPHA_ARB,GL_CONSTANT_ARB);
glTexEnvi(GL_TEXTURE_ENV,GL_OPERAND2_RGB_ARB,GL_SRC_ALPHA);
glTexEnvi(GL_TEXTURE_ENV,GL_OPERAND2_ALPHA_ARB,GL_SRC_ALPHA);
glTexEnvfv(GL_TEXTURE_ENV,GL_TEXTURE_ENV_COLOR,color);
The problem is that lighting will only affect the first texture, but not the second one, so when I interpolate all the way to the blurred texture it appears much too bright. Does anyone have a good idea of how to interpolate between two textures without giving up lighting? A two-pass solution would also be OK if this is not possible using multitexturing.

Any help appreciated,
jonas

Originally posted by jechter:
I want to fake a blur effect in my game. To do this, I want my engine to interpolate between a blurred and a non-blurred version of my texture. I currently do this using the texture_env_combine extension, like this:

Have you tried using 3D textures and doing the interpolation between the two images?

It does impose the restriction that the images must be of the same resolution, but it does allow for other effects such as varying the R coordinate across your geometry.
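A rough sketch of how the two images could be packed into a single 3D texture (assuming OpenGL 1.2 core 3D textures are available; w, h, sharpPixels and blurPixels are placeholder names for the image dimensions and RGBA data):

/* pack the sharp and blurred images as two slices of one 3D texture */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_3D, tex);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, w, h, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexSubImage3D(GL_TEXTURE_3D, 0, 0, 0, 0, w, h, 1, GL_RGBA, GL_UNSIGNED_BYTE, sharpPixels);
glTexSubImage3D(GL_TEXTURE_3D, 0, 0, 0, 1, w, h, 1, GL_RGBA, GL_UNSIGNED_BYTE, blurPixels);

/* at draw time, enable GL_TEXTURE_3D instead of GL_TEXTURE_2D and pass the
   blur factor as the R coordinate, e.g. glTexCoord3f(s, t, blurFactor);
   with only two slices the actual cross-fade happens between r = 0.25 and r = 0.75 */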

Robert.

Originally posted by Robert Osfield:
Have you tried using 3D textures and doing the interpolation between the two images?

It does impose the restriction that the images must be of the same resolution, but it does allow for other effects such as varying the R coordinate across your geometry.

Thanks for the tip… using 3D textures to interpolate between two textures works quite nicely; the only problem is that mip-mapping doesn’t seem to work (and if it did, it would probably ruin my interpolation).
Also, can I expect 3D texturing to ‘just work’ and be reasonably fast on most hardware?

jonas

Originally posted by jechter:
Thanks for the tip… using 3D textures to interpolate between two textures works quite nicely; the only problem is that mip-mapping doesn’t seem to work (and if it did, it would probably ruin my interpolation).
Also, can I expect 3D texturing to ‘just work’ and be reasonably fast on most hardware?

jonas

You’re right; mipmapping doesn’t work well on textures that aren’t well balanced in the three dimensions. Turning mip-mapping off will obviously address this issue completely, but could lead to aliasing problems.
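For instance, a minimal sketch: leaving the 3D texture un-mipmapped just means choosing a non-mipmapped minification filter (and not building any mipmap levels).

glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);   /* no mipmap levels used */
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);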

Most modern graphics cards appear to support 3D textures, and have done so for a number of years now.

Robert.

Most modern graphics cards appear to support 3D textures, and have done so for a number of years now.
The GeForce2 supports 3D texturing only in software, and software texturing is very slow.

Here’s a two-pass technique that should work. For the first pass, draw the geometry with the first texture. Then enable blending and set the blend function with glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA ). On the second pass, bind the second texture, with the texture function set to GL_MODULATE, and set the color with glColor4f(1,1,1,a) with a in the range [0, 1]. When a is 0, only the first texture is visible, and when a is 1, only the second is visible. It seems like this would be reasonably fast and would be supported by most hardware.
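A sketch of how those two passes might look (drawGeometry(), texSharp and texBlur are placeholder names for your own draw call and texture objects):

/* pass 1: the first (sharp) texture, no blending */
glDisable(GL_BLEND);
glBindTexture(GL_TEXTURE_2D, texSharp);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
drawGeometry();

/* pass 2: the second (blurred) texture, faded over the first by the vertex alpha */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, texBlur);
glColor4f(1.0f, 1.0f, 1.0f, a);   /* a = 0 -> first texture only, a = 1 -> second only */
drawGeometry();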

Originally posted by Aaron:
Here’s a two-pass technique that should work. For the first pass, draw the geometry with the first texture. Then enable blending and set the blend function with glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA ). On the second pass, bind the second texture, with the texture function set to GL_MODULATE, and set the color with glColor4f(1,1,1,a) with a in the range [0, 1]. When a is 0, only the first texture is visible, and when a is 1, only the second is visible. It seems like this would be reasonably fast and would be supported by most hardware.

This won’t work for my game. The problem is that the textures I want to blend between have an alpha channel and already use blending. Also, I’ve found that glColor has no effect when lighting is enabled (correct me if I’m wrong).
A two-pass solution I’ve been thinking of is to create a completely transparent alpha texture, interpolate between that and my first texture (using the multitexturing method described above), and then in the second pass interpolate between the alpha texture and the second texture. But if I don’t run into big performance problems with 3D texturing, I might stick with that (and possibly implement mip-mapping by hand).

Originally posted by jechter:
Also, I’ve found that glColor has no effect when lighting is enabled (correct me if I’m wrong).

glColor will affect the colour when lighting is enabled, but only when colour material is enabled via glEnable(GL_COLOR_MATERIAL). It works by substituting the glColor value into the current material; if it’s not enabled, the glColor value is ignored.
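A short illustration (assuming the common GL_AMBIENT_AND_DIFFUSE colour-material mode):

glEnable(GL_LIGHTING);
glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);   /* glColor feeds the ambient and diffuse material terms */
glEnable(GL_COLOR_MATERIAL);
glColor4f(1.0f, 1.0f, 1.0f, a);   /* the alpha now takes effect even with lighting on */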

Robert.

This won’t work for my game. The problem is that the textures I want to blend between have an alpha channel and already use blending. Also, I’ve found that glColor has no effect when lighting is enabled (correct me if I’m wrong).
Oops, I guess I assumed that because I always enable COLOR_MATERIAL, everyone else does too. The alpha channel in the textures does complicate things. However, I think you’ll find that 3D texturing in software is too slow for real-time rendering. For example, the 3D texturing demo that comes with OpenSceneGraph runs at less than 3 fps on my computer (GeForce2 IGP, Athlon XP 1800+), and it just draws a single textured quad.

BTW, Robert, any plans to add a RenderToCubemapStage class for the next release of OSG? (I’d post to the mailing list, but I’m subscribed to enough lists already.)


jechter, your source code in the first post isn’t complete.

Is this what you are trying to do?

(tex0 * primarycolor) * color.alpha + (tex1 * primarycolor) * (1 - color.alpha)

That’s with 2D textures.
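If that is the goal, here is a sketch of one two-unit setup that keeps lighting on both textures: unit 0 interpolates the two textures and unit 1 modulates the result by the primary colour. It assumes ARB_multitexture, ARB_texture_env_combine and ARB_texture_env_crossbar (the crossbar is what lets unit 0 read unit 1’s texture); texSharp, texBlur and a are placeholder names:

GLfloat blend[4] = { 0.0f, 0.0f, 0.0f, a };   /* only the alpha is used as the blend factor */

/* unit 0: tex0 * a + tex1 * (1 - a) */
glActiveTextureARB(GL_TEXTURE0_ARB);
glBindTexture(GL_TEXTURE_2D, texSharp);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_INTERPOLATE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA_ARB, GL_INTERPOLATE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE0_ARB);      /* crossbar source */
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE1_ARB);      /* crossbar source */
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE2_RGB_ARB, GL_CONSTANT_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB_ARB, GL_SRC_ALPHA);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_ALPHA_ARB, GL_TEXTURE0_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_ALPHA_ARB, GL_TEXTURE1_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE2_ALPHA_ARB, GL_CONSTANT_ARB);
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, blend);

/* unit 1: multiply the blended result by the lit primary colour,
   so lighting affects both textures equally */
glActiveTextureARB(GL_TEXTURE1_ARB);
glBindTexture(GL_TEXTURE_2D, texBlur);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA_ARB, GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_PRIMARY_COLOR_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_ALPHA_ARB, GL_PREVIOUS_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_ALPHA_ARB, GL_PRIMARY_COLOR_ARB);

/* both units need texture coordinates, e.g. via glMultiTexCoord2fARB */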

Originally posted by Aaron:
However, I think you’ll find that 3D texturing in software is too slow for real-time rendering. For example, the 3D texturing demo that comes with OpenSceneGraph runs at less than 3 fps on my computer (GeForce2 IGP, Athlon XP 1800+), and it just draws a single textured quad.

Time to upgrade your graphics hardware :)

My FX5600 runs osgtexture3D at 210 Hz. The GeForce3 generation onwards all support 3D textures in hardware.

Originally posted by Aaron:
BTW, Robert, any plans to add a RenderToCubemapStage class for the next release of OSG? (I’d post to the mailing list, but I’m subscribed to enough lists already.)

If you’re tracking the latest in CVS, you’ll find an example, osgprerendercubemap, which does just what you want. 0.9.4 (and versions before it) is also capable of pre-rendering cubemaps; we just didn’t have an example of it.

Robert.