I need to draw some stars. Depending on the position and some astronomical parameters, an equation gives me the size and color of each point.
To avoid problems with points that are too small to anti-alias cleanly, I have a parameter giving the smallest size I can use for my stars. If a star needs to be smaller than that, the equation keeps the size and changes the color instead, to reduce the star's luminosity.
Now I have tried glPointParameterARB. It looks promising, because I no longer have to compute the attenuation on the CPU.
My problem is that the stars never become smaller than 1 pixel, even when they are very far away (they should fade out). They never disappear.
Could someone help me?
I have never done that myself. I guess you have already checked the extension spec: http://www.opengl.org/registry/specs/ARB/point_parameters.txt
Sorry, these are just guesses, but do you use the alpha test or blending, and are they enabled?
Another solution would be to darken the colors yourself, or not to draw the point at all.
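As a rough sketch of that CPU-side fallback (the 1/256 cutoff is just an arbitrary example, not a recommended constant):

```c
/* CPU-side fallback: if the derived size falls below the smallest
   size that still anti-aliases well, keep the minimum size and
   scale the star's brightness down instead; skip the star entirely
   once it would be invisible. Returns 0 when the star is skipped. */
int star_size_and_brightness(float derived, float min_draw_size,
                             float *size, float *brightness)
{
    if (derived >= min_draw_size) {
        *size = derived;
        *brightness = 1.0f;
        return 1;
    }
    /* Covered area scales with size^2, so fade quadratically. */
    *size = min_draw_size;
    *brightness = (derived / min_draw_size) * (derived / min_draw_size);
    if (*brightness < 1.0f / 256.0f)  /* effectively black: skip */
        return 0;
    return 1;
}
```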
I hope that helps a bit.
I agree with your "other solution". But if I do that, I have to compute the size of each point myself, and I lose the advantage of the point_parameters extension.
Yes, blending is enabled.
But I had missed something about the FADE_THRESHOLD parameter, so I will try that out.
I have posted a more complete and detailed question about this extension in the "advanced" forum.
It seems that FADE_THRESHOLD is exactly the solution to my problem.
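For future readers, here is my understanding of what the fade threshold does, sketched from the spec (this is an illustration of the rule, not code from my program):

```c
/* Fade-threshold rule from the point_parameters spec: when the
   derived size drops below the threshold, the point is rasterized
   at the threshold size, but its alpha is multiplied by a quadratic
   fade factor -- so distant stars darken toward invisibility instead
   of staying pinned at a full-bright minimum-size dot.
   Returns the alpha multiplier; writes the final size to *final_size. */
float point_fade(float derived_size, float threshold, float *final_size)
{
    if (derived_size >= threshold) {
        *final_size = derived_size;
        return 1.0f;                  /* no fade */
    }
    *final_size = threshold;
    float r = derived_size / threshold;
    return r * r;                     /* alpha multiplier in (0, 1) */
}
```

With blending enabled, that alpha multiplier is what finally makes the far stars disappear.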