I wanted to verify whether my shader code correctly computes the effect of ambient light on surface color.
My ambient light direction is stationary (it does not change between frames), and so is my geometry (a building, for example). I pass these three values from the vertex shader to the fragment shader (among other things):
passFragPos = vec3(modelViewMatrix * inPosition);
passNormal = vec3(modelViewMatrix * vec4(inNormal, 1.0));
passLightPos = vec3(modelViewMatrix * lightPosition);
"inPosition" is the xyz position of each vertex, "inNormal" is the face normal at each vertex, and "lightPosition" is the direction of the ambient light.
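For comparison, a common convention on the vertex-shader side is to transform positions with w = 1.0 and normals with the normal matrix (the transpose of the inverse of the upper 3x3 of the model-view matrix), so that translation and non-uniform scaling do not distort the normals. The sketch below assumes GLSL ES 1.00; `normalMatrix` and `projectionMatrix` are assumed uniforms that do not appear in the original code:

```glsl
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;  // assumed
uniform mat3 normalMatrix;      // assumed: transpose(inverse(mat3(modelViewMatrix)))
uniform vec4 lightPosition;

attribute vec3 inPosition;
attribute vec3 inNormal;

varying vec3 passFragPos;
varying vec3 passNormal;
varying vec3 passLightPos;

void main() {
    // Positions are points, so they use w = 1.0 (translation applies).
    passFragPos = vec3(modelViewMatrix * vec4(inPosition, 1.0));
    // Normals are directions: the normal matrix ignores translation and
    // corrects for non-uniform scaling, unlike multiplying with w = 1.0.
    passNormal = normalMatrix * inNormal;
    passLightPos = vec3(modelViewMatrix * lightPosition);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(inPosition, 1.0);
}
```

If the model-view matrix contains only rotation and uniform scale, `mat3(modelViewMatrix)` works as the normal matrix too; the key point is not to apply translation to the normal.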
This is my fragment shader:
vec3 objectColor = texture2D(texture1, passTextureCoord).xyz; // diffuse color from the base texture
vec3 specular = texture2D(specular1, passTextureCoord).xyz;   // specular light map
vec3 n = normalize(passNormal);                               // normalize vertex normal (after model/view transform)
vec3 lightDir = normalize(passLightPos - passFragPos);
float d = max(0.0, dot(n, lightDir)); // cosine of the angle between the normal and the light vector
vec3 color = (1.0 - d) * (ambient * objectColor + specular);  // I changed d to 1.0 - d because the rendered color was getting very dark
gl_FragColor = vec4(color, 1.0);
I’m not sure whether all of those computations are necessary; maybe they are. What I have now seems to work: when I rotate the camera around the stationary model (using lookAt), surfaces get lighter or darker depending on whether they face the light or point away from it (the ambient light is also stationary, always coming from the same direction).
The (1.0 - d) part is my own doing; without it, everything goes dark when a surface is not facing the light.
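For reference, the usual Lambert-style formulation keeps a small constant ambient term and scales the diffuse term by d, so back-facing surfaces fall back to the ambient level instead of going fully black. This is only a sketch using the same names as above; the specular map is left out of the sum because a real specular term would also depend on the view direction (e.g. Blinn-Phong), which this shader does not compute:

```glsl
vec3 objectColor = texture2D(texture1, passTextureCoord).xyz;
vec3 specMap = texture2D(specular1, passTextureCoord).xyz; // reserved for a view-dependent specular term
vec3 n = normalize(passNormal);
vec3 lightDir = normalize(passLightPos - passFragPos);
float d = max(0.0, dot(n, lightDir)); // Lambert cosine term, 0 for back-facing surfaces
// Ambient is constant everywhere; diffuse scales with d. Surfaces facing
// away from the light keep the ambient level instead of going black,
// without needing the (1.0 - d) inversion.
vec3 color = ambient * objectColor + d * objectColor;
gl_FragColor = vec4(color, 1.0);
```

Note that a light with a direction that shades surfaces differently is conventionally called a directional (or diffuse) light; "ambient" usually refers to the constant, direction-independent term.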
I’d appreciate it if someone who knows this material could share their opinion on whether there is a way to improve or simplify this code.