texture_lod_bias problem

Hi all,

After finishing my DDS loader (special thanks to jra101), I’m trying to achieve a blur effect so I can fade the detail of the texture in or out. My first approach was to lock the texture to a single mipmap level using texture_lod, setting both BASE_LEVEL and MAX_LEVEL to the same value… of course that doesn’t give me a smooth transition, so I’m now playing with the texture_lod_bias extension, which (according to the spec) lets me modify the lambda used when generating fragments.
The problem is that I don’t know how this works or what bias values give me the desired results. When I query MAX_TEXTURE_LOD_BIAS_EXT it returns 1, which looks strange, doesn’t it?
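
For reference, this is roughly how I understand the extension is supposed to be driven (just a sketch, not a copy of my real code; the #defines are the enum values from the EXT_texture_lod_bias spec, in case the headers don’t provide them):

#ifndef GL_TEXTURE_FILTER_CONTROL_EXT
#define GL_TEXTURE_FILTER_CONTROL_EXT 0x8500
#define GL_TEXTURE_LOD_BIAS_EXT       0x8501
#define GL_MAX_TEXTURE_LOD_BIAS_EXT   0x84FD
#endif

GLfloat maxBias = 0.0f;
glGetFloatv(GL_MAX_TEXTURE_LOD_BIAS_EXT, &maxBias);  /* returns 1 on my driver */

/* The bias is texture environment state, not glTexParameter state */
glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, 1.0f);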

Whatever value I set between -1 and 1, nothing seems to happen, even with very big numbers.

Please, I need some help on this. Am I missing something in the spec?

Thanks in advance. If it helps, I’m using a Radeon 7500, but I hope this silly effect can be done on most cards.


I think the LOD bias needs to be positive. What’s the use of a negative bias?

If you set the LOD bias to 1, there will be a visible difference. If you don’t notice anything, the problem is elsewhere.

This is one of the easiest functions to use, actually.
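
Something like this (a rough sketch, using the texenv path described in EXT_texture_lod_bias) should make the texture visibly blurrier if mipmapping is otherwise working:

/* +1 pushes the LOD one mip level smaller, i.e. visibly blurrier */
glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, 1.0f);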

Hi again,

Maybe I’m missing something reading the spec (English isn’t my native language), but I think it says I can use any lodBias value between -MAX_TEXTURE_LOD_BIAS_EXT and MAX_TEXTURE_LOD_BIAS_EXT.

Copy and paste from the spec:

"The choice is governed by a scale factor p(x,y), the level of detail
parameter lambda(x,y), defined as

             lambda'(x,y) = log2[p(x,y)] + lodBias

 where lodBias is the texture unit's (signed) texture lod bias parameter
 (as described in Section 3.8.9) clamped between the positive and negative
 values of the implementation defined constant MAX_TEXTURE_LOD_BIAS_EXT."

So, because MAX_TEXTURE_LOD_BIAS_EXT in my driver is 1, I’m testing values between -1.0f and 1.0f, but nothing seems to happen.
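
If I read the formula right, a worked example would be: when the texture is minified so that p(x,y) = 4, then lambda = log2(4) = 2 and level 2 is sampled; a lodBias of +1 gives lambda' = 3 (one level blurrier), and -1 gives lambda' = 1 (one level sharper). So even a range of -1 to +1 should shift things by a whole mip level.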

I think the problem is elsewhere. I’m using linear magnification and linear mipmapped minification filtering, like this:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR );

and GL_REPEAT for wrapping…
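
What I ultimately want is something like this (a sketch, assuming a hypothetical fade parameter t that goes from 0 to 1, and assuming my DDS loader really uploads every mip level):

/* t = 0 -> full detail, t = 1 -> fully faded out.
   The driver clamps the result to +/- MAX_TEXTURE_LOD_BIAS_EXT. */
void setDetailFade(float t, float maxBias)
{
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, t * maxBias);
}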

Well, if someone knows anything about this (how to use the extension, or where to look for filtering- or mipmap-related errors), I’ll be very happy :)

Thanks to all.