I’m rendering spheres as points (computing the normals in the fragment shader).
I have a scene with a sphere and a directional light, and I can rotate and move the camera around the scene.
The problem is that when I rotate the camera, the sphere’s shading changes: its diffuse intensity varies because the angle between the lightDirection vector and the sphere’s normal changes.
I can’t simply multiply the normals by the camera rotation matrix, as I do with the light direction vector, because that would make the sphere’s shading constant regardless of camera position.
I want the shading to change according to the camera’s position, but not to change when the camera is merely rotated in place.
So my question is: how should I transform the normals to achieve that?
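For context, here is roughly what my fragment shader does; this is a minimal sketch, not my exact code, and names like lightDirCam are only illustrative:

```glsl
#version 330 core

uniform vec3 lightDirCam;   // light direction, already in camera space

out vec4 fragColor;

void main()
{
    // gl_PointCoord is in [0,1]^2; remap to [-1,1]^2 across the point sprite
    vec2 p = gl_PointCoord * 2.0 - 1.0;
    float r2 = dot(p, p);
    if (r2 > 1.0)
        discard;                        // outside the sphere's silhouette

    // reconstruct the camera-space normal; the central pixel gets (0, 0, 1)
    // (the y flip depends on the point-sprite coordinate origin)
    vec3 normal = vec3(p.x, -p.y, sqrt(1.0 - r2));

    // diffuse term against the camera-space light direction
    float diffuse = max(dot(normal, normalize(-lightDirCam)), 0.0);
    fragColor = vec4(vec3(diffuse), 1.0);
}
```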
OK, thanks. Performance is slightly better now, but the operations you suggested don’t fix my problem.
Here are two screenshots from my application. The light direction vector is (-1, -1, -1); in both, the camera is in the same position but its rotation differs:
I don’t want the shading to change when the camera rotates.
I know, but I need the light direction vector to be in camera space (I multiply it only by the rotation part of the camera matrix).
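Concretely, the transform I mean is just this (viewMatrix and lightDirWorld are illustrative names):

```glsl
uniform mat4 viewMatrix;      // my camera matrix
uniform vec3 lightDirWorld;   // e.g. (-1, -1, -1)

// only the upper-left 3x3 (rotation) part is applied, no translation,
// since a direction has w = 0
vec3 lightDirToCameraSpace()
{
    return normalize(mat3(viewMatrix) * lightDirWorld);
}
```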
Let me explain my problem in more detail:
I represent the camera as a frame (forward vector, up vector, and position).
I generate the normals so that the central pixel of each point (sphere) has normal (0.0, 0.0, 1.0) in camera space.
The normals I generate in the fragment shader are therefore already in camera space, so I only need to transform the light direction into camera space.
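For reference, the rotation part of my camera matrix comes from that frame in essentially the standard look-at way; this is a sketch, written in GLSL only for illustration:

```glsl
// sketch of the standard look-at construction: the rows are the camera
// axes, so multiplying a world-space vector by this matrix expresses it
// in camera space
mat3 cameraRotation(vec3 forward, vec3 up)
{
    vec3 f = normalize(forward);
    vec3 r = normalize(cross(f, up));   // camera right
    vec3 u = cross(r, f);               // re-orthogonalized up
    return transpose(mat3(r, u, -f));   // columns r, u, -f become rows
}
```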
Everything is fine as long as the forward vector points directly at the center of the sphere. But if I rotate the camera, the light direction’s coordinates in camera space change while the normals’ coordinates don’t (the central pixel still has (0, 0, 1)), so the angle between the normal and the light changes.
Here is a picture that illustrates this:
Simply not transforming the light direction into camera space is not a solution for me because, as I said, when I change the camera’s position and look at the sphere, I do want the shading to change.
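To spell the mismatch out in code (illustrative only, with hypothetical rotA/rotB standing for two orientations of the same camera):

```glsl
// illustrative only: the central pixel's normal is fixed in camera space,
// so its diffuse term follows the camera's orientation, not the scene
uniform mat3 rotA;   // rotation part of the camera matrix, orientation A
uniform mat3 rotB;   // same camera position, different orientation B

float centralDiffuse(mat3 camRot)
{
    vec3 lightDirWorld = normalize(vec3(-1.0, -1.0, -1.0));
    vec3 normalCam     = vec3(0.0, 0.0, 1.0);    // central pixel's normal
    vec3 lightDirCam   = camRot * lightDirWorld; // what I currently do
    return max(dot(normalCam, -lightDirCam), 0.0);
}
// centralDiffuse(rotA) != centralDiffuse(rotB),
// even though only the camera's orientation differs
```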