I was thinking…
If you take a triangle and scale it non-uniformly along a single axis, its normal would change. Right?
If you have a display list (terrain, for example) with normals and call glScale before you call/draw the list, does OpenGL compensate?
Or say there is a very sharp needle cone and a light pointing straight down. The sides of the cone would be nearly parallel to the light and therefore barely lit. But if you scaled the cone down along the Y axis until it became a flat disc, the sides would face the light directly and should be fully lit.
I would suspect that it does, because glScale behaves similarly to glRotate, but for some reason my terrain’s lighting is very strange/wrong.
I don’t think normals are rescaled when you scale the object. Just think: how could OGL do it?
I see, so scaling display lists is not a good idea?
After a little more experimenting, I gather that when I scale an object using glScale it progressively becomes darker and darker overall, as if the normals were losing their unit length. This happens even when the scale factors are equal in all three dimensions.
Normals are transformed by the inverse transpose of the modelview matrix.
Draw a circle with normals on paper and then squash it into an ellipse and you’ll see that normals need to be transformed differently than vertices.
To compensate for the normal scaling you just need to glEnable(GL_NORMALIZE), and the normals are renormalized before lighting by the OpenGL implementation. On hardware accelerators this normalization comes almost for free today. Don’t worry.
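Usage is just one call before drawing. A sketch (fixed-function GL fragment, assuming `terrainList` names your existing display list):

```c
glEnable(GL_NORMALIZE);          /* renormalize normals after the modelview transform */

glPushMatrix();
glScalef(1.0f, 0.25f, 1.0f);     /* non-uniform squash, like the cone example */
glCallList(terrainList);         /* lighting stays correct */
glPopMatrix();
```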
In the fixed-function pipeline the normals are multiplied by the inverse transpose of the upper 3x3 corner of the modelview matrix, so their length changes when the mesh is scaled.
To compensate for this you need to enable normal renormalization using glEnable( GL_NORMALIZE ). If the scale is uniform in all directions, you can instead enable normal rescaling with glEnable( GL_RESCALE_NORMAL ), which might be faster on older hardware.
Ok I see, and I got it working easily with GL_NORMALIZE thanks.