I’m using OpenGL 3.2.
I was trying to implement point light shadows, which I managed to do, but I have a self-shadowing problem.
From what I understand, I render the scene six times from the light’s point of view, once per cubemap face (+x, -x, +y, -y, +z, -z):
-I do this by attaching a cubemap to the framebuffer.
-After setting up the cubemap texture, with each face created as a depth-format texture, I attach it to the framebuffer as the depth attachment.
-For the framebuffer's depth attachment I only create a renderbuffer (which, if I understand correctly, can only hold values from 0 to 1).
-I create the view matrices using my own lookAt function, which works just like gluLookAt.
-I render all objects once per face, with the individual view matrix for each of +x/-x, +y/-y, +z/-z.
I do the depth shaders like so (the first part is the vertex shader; gl_FragDepth is written in the fragment shader):
// vertex shader (row-vector convention, matching my other matrices)
VertexPosition = inVertices * worldMatrix;
gl_Position = VertexPosition * viewMatrix * projectionMatrix;
// fragment shader: store the light-to-fragment distance, scaled into
// [0, 1] by zFar, plus a small offset against self-shadowing
vec3 lightDir = lightPosition - VertexPosition.xyz;
gl_FragDepth = length(lightDir) / zFar + offset;
As far as I could see, gl_FragDepth is not deprecated in GLSL 1.50 (unlike gl_FragColor), so I used it. I also read that a perspective depth buffer isn’t really linear in eye-space distance, so I would assume writing the normalized distance myself makes it linear.
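On the linearity point: a standard perspective projection produces a window-space depth that is hyperbolic in eye-space distance, whereas writing `length(lightDir) / zFar` to gl_FragDepth stores a genuinely linear value. A small C sketch of the two mappings (the zNear/zFar numbers in the usage below are just illustrative):

```c
#include <assert.h>

/* Window-space depth a standard perspective projection yields for an
 * eye-space distance z: hyperbolic in z, 0 at zNear, 1 at zFar. */
float perspective_depth(float z, float zNear, float zFar) {
    return (1.0f/zNear - 1.0f/z) / (1.0f/zNear - 1.0f/zFar);
}

/* What the depth pass stores instead: plain radial distance over zFar. */
float linear_depth(float dist, float zFar) {
    return dist / zFar;
}
```

With zNear = 1 and zFar = 100, a point halfway out (distance ~50) already maps to about 0.99 in perspective depth but only 0.5 in the linear form, which is why the two cannot be compared against each other directly.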
Then, finally, I render the scene normally.
In the fragment shader:
vec3 lightDirection = lightPosition.xyz - VertexPosition.xyz;
float distance = length(lightDirection);
float depth = texture(shadowMap, -lightDirection).r;
if ((distance / zFar) > depth)
If I don’t divide by zFar the comparison won’t be correct (the sampled values cap at 1.0). I see Direct3D tutorials not doing this division, so I assume either something does it for them or a Direct3D depth buffer isn’t limited to [0, 1].
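For what it’s worth, the shadow test can be sketched in isolation: precision loss in the stored depth means a surface can fail the comparison against itself unless some bias is applied on one side, which is the same job the `offset` in the depth pass is doing. A hypothetical C version (the concrete numbers in the usage below are illustrative, not measured):

```c
#include <assert.h>
#include <stdbool.h>

/* The fragment shader's shadow test: a fragment is shadowed when its
 * normalized distance to the light exceeds the depth stored for that
 * direction, minus a small bias to absorb quantization error. */
bool in_shadow(float fragDist, float storedDepth, float zFar, float bias) {
    return (fragDist / zFar) - bias > storedDepth;
}
```

With zFar = 25, a fragment 10 units out whose own stored depth was quantized down to 0.399 shadows itself when the bias is zero, but passes with a small bias, while a genuinely occluded fragment stays shadowed either way.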
If I do all this, everything seems to work, except that when I move my light upward the cube objects in my scene get thrown into self-shadow, which doesn’t make sense to me.
I get something like this:
I could really use some help. I don’t have any problems implementing shadows for the other light types, only this one.