# Fixed-function Light Attenuation

With vertex programs I can specify a truly linear attenuation for point/spot lights.
I can also use attenuation texture maps for per-texel attenuation.
I can do per-pixel attenuation in some kind of fragment program.

But I need to have attenuation working properly in the fixed-function path.
So that when an object is outside the sphere of influence of a light, the light's contribution drops to nothing - without the object visibly 'popping out'.

I can’t get anything useful out of the standard GL attenuation equation:
I want my lights to have full intensity at a distance of 1, and zero intensity (or near enough to drop below a byte's precision) at a distance of 'sphere radius' or 'spotlight frustum far plane', with a linear ramp in between - e.g. a vertex at half the radius would map to roughly half intensity.
Has anyone any tips?
Or do I have to accept the current system, and make my light spheres HUGE in order to compensate for the strange attenuation mapping?
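For concreteness, the falloff being asked for might be sketched like this - the function name and the exact mapping (full at d = 1, zero at d = radius, linear in between) are my assumptions based on the description above, not anything OpenGL provides:

```c
#include <math.h>

/* Hypothetical linear falloff: full intensity at d = 1, zero at
   d = radius, linear in between. This is the behaviour the poster
   wants, NOT the standard GL attenuation. */
static float linear_atten(float d, float radius)
{
    float t = (radius - d) / (radius - 1.0f); /* 1 at d=1, 0 at d=radius */
    if (t < 0.0f) t = 0.0f;                   /* clamp outside the sphere */
    if (t > 1.0f) t = 1.0f;                   /* clamp inside d = 1 */
    return t;
}
```

Note that this reaches exactly zero at the sphere radius, so geometry at the edge of the light volume receives no contribution and cannot 'pop'.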

Using the standard attenuation function provided by OpenGL, you cannot fade a light completely to zero, nor can you fade it linearly.

The attenuation function OpenGL uses is a fairly accurate model of how light actually attenuates (considering that it's quite simple as well). A physical light will, theoretically, never reach zero even at huge distances, and it generally fades as the inverse square of the distance.
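The fixed-function factor is 1 / (kc + kl·d + kq·d²), with the three coefficients set per light via glLightf (GL_CONSTANT_ATTENUATION and friends). A tiny sketch shows why it can never hit zero - the coefficient values below are just illustrative:

```c
#include <math.h>

/* The standard fixed-function attenuation factor from the GL spec:
   atten = 1 / (kc + kl*d + kq*d^2). Strictly positive for any
   finite distance, hence no hard cutoff. */
static float gl_atten(float d, float kc, float kl, float kq)
{
    return 1.0f / (kc + kl * d + kq * d * d);
}
```

Even at a distance of 100 units with pure quadratic attenuation the result is small but nonzero, which is exactly the 'never fades completely' problem.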

But I’m not that interested in the attenuation being realistic, more interested in it being controllable.
So that's it? There's no common extension to change this equation? (In the same way the blend_equation extension was introduced to let you change the blend equation.)

Other than using vertex programs, I’m afraid there are no extensions to control the attenuation factor.

I agree with Bob, you should use vertex programs. I've never heard of any extension that gives you that kind of control over lighting in OpenGL.

There is an extension, it is called “do_it_yourself_ARB”

You can do your own kind of lighting and calculate the lighting values for your vertices yourself. You then supply them to the fixed-function pipeline as vertex colors (don't forget to glDisable(GL_LIGHTING)).
Of course, the CPU now does the lighting calculations. But the CPU would also do them on cards that don't support vertex programs.
As an advantage, you can optimize for static scenes: when the light and the object don't move, you don't need to recalculate the lighting.

Lars

Very funny.
I don't want to do my own lighting calcs, because the vertex data will probably be in uncached memory (whether in a display list, VAR, or VBO). Plus it would effectively have to be redone every frame because of view-dependent specular. So that doesn't make sense.
I don't want to use emulated vertex programs either, for the same reason (the CPU emulation will read/write uncached memory).
Basically, I want to retain the performance I get from the fixed pipeline on <=gf1 class hardware, while being able to modify the attenuation calculation slightly.
But it seems it is not possible, so thank you and good night chaps!

Originally posted by KuriousOrange:
Basically, I want to retain the performance I get from the fixed pipeline on <=gf1 class hardware, while being able to modify the attenuation calculation slightly.

Actually, on <gf1 class HW, the lighting calculations ARE done on the CPU (even for the fixed-function path). The only cards that would suffer much from using vertex programs (versus fixed function) are the GF1 and GF2. On <=TNT2 the CPU does the lighting regardless of whether it's fixed function or vertex programs, and on >=GF3 you have hardware vertex programs anyway. Hmm... GF4 MX?

Have you really thought this one through? If your object is lit by more than a single light source, the "popping" effect would be close to impossible to notice - depending on the environment, of course. Perhaps you can get away with fixed-function GL vertex lighting with a bit of tweaking - at least as a performance enhancer on certain HW?