Cubemap shadows for point lights

I’m using OpenGL 3.2.

I’ve implemented point-light shadows, but I have a self-shadowing problem.

From what I understand, I need to render six passes of the scene from the light’s point of view (+x, -x, +y, -y, +z, -z):

-I do this by attaching a cubemap to the framebuffer.

-Then I set up the cubemap texture, giving each face a depth format, and attach the current face as the framebuffer’s depth attachment.

-I only create a renderbuffer for the framebuffer’s depth attachment (which, if I understand correctly, can only hold values from 0 to 1).

-I create the views using my own lookAt matrix that works just like gluLookAt.

-I render all objects once per face, with the individual view matrix for each of the six directions of the cubemap.
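For reference, the per-face look-at directions and up vectors conventionally used when rendering into a cubemap (in `GL_TEXTURE_CUBE_MAP_POSITIVE_X + i` order) look like this. A minimal host-side sketch, with illustrative names:

```c
#include <assert.h>

/* Conventional look-at directions and up vectors for the six cubemap
   faces, ordered +X, -X, +Y, -Y, +Z, -Z to match
   GL_TEXTURE_CUBE_MAP_POSITIVE_X + i. The light position is the eye;
   the target passed to a gluLookAt-style function is eye + dir. */
typedef struct { float dir[3]; float up[3]; } FaceView;

static const FaceView faceViews[6] = {
    {{ 1.f, 0.f, 0.f}, {0.f,-1.f, 0.f}},  /* +X */
    {{-1.f, 0.f, 0.f}, {0.f,-1.f, 0.f}},  /* -X */
    {{ 0.f, 1.f, 0.f}, {0.f, 0.f, 1.f}},  /* +Y */
    {{ 0.f,-1.f, 0.f}, {0.f, 0.f,-1.f}},  /* -Y */
    {{ 0.f, 0.f, 1.f}, {0.f,-1.f, 0.f}},  /* +Z */
    {{ 0.f, 0.f,-1.f}, {0.f,-1.f, 0.f}},  /* -Z */
};
```

The flipped up vectors come from the cubemap face coordinate convention; if your faces come out rotated or mirrored, these are the first thing to check.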

My depth shader looks like this:

Vertex Shader

VertexPosition = inVertices * worldMatrix;
lightDir = lightPosition - VertexPosition.xyz; // lightDir is an out varying, read in the fragment shader
lightDir /= zFar;
gl_Position = VertexPosition * viewMatrix * projectionMatrix;

Fragment Shader

gl_FragDepth = length(lightDir) + offset;

I didn’t see anywhere that gl_FragDepth was deprecated, so I used it. I read that the depth buffer isn’t really linear but roughly logarithmic, so I assume writing the distance this way makes it linear.

Then Finally I render the scene regularly.

In the Fragment Shader

uniform samplerCube shadowMap; // plain samplerCube, since the comparison below is done manually

vec3 lightDirection = VertexPosition.xyz - lightPosition; // assuming VertexPosition is the interpolated world-space position
float distance = length(lightDirection);
float depth = texture(shadowMap, lightDirection).r;

if ((distance / zFar) > depth)
    // in shadow

If I don’t divide by zFar the values aren’t correct (they clamp at 1.0 in the texture). I see Direct3D tutorials not doing this, so I assume either it’s handled for them somehow or Direct3D depth-buffer values aren’t limited to [0, 1].

If I do this, then everything seems to work, except when I move my light up: the cube objects in my scene get thrown into self-shadow, which doesn’t make sense to me.
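The `offset` added in the depth pass is what’s supposed to prevent exactly this kind of self-shadowing (“shadow acne”): without it, a surface samples back roughly its own stored depth, and floating-point noise tips the comparison into shadow. A minimal sketch of its effect, with illustrative values:

```c
#include <assert.h>

/* Without a bias, a surface sampling its own stored depth compares
   two nearly equal values, and floating-point noise pushes it into
   shadow. A small bias keeps the surface reliably lit. */
static int in_shadow_biased(float receiverDepth, float sampledDepth, float bias)
{
    return receiverDepth > sampledDepth + bias;
}
```

If the acne only appears at certain light angles, a constant bias may be too small for steep faces; a slope-scaled bias is the usual next step.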

I get something like this:

I could really use some help. I don’t have problems implementing shadows for the other light types, just this one.

Not sure I can really help with this, but I notice your matrix multiplication order is backwards from the usual convention. Are you using row-major instead of column-major matrices?

I’d have thought you should be calculating:

gl_Position = Projection * cameraView * modelMatrix * inVertices;

My matrices are column-major so that I don’t have to transpose them when feeding them to the shader, but I haven’t seen anything odd about my order, since the scene renders correctly (except for objects self-shadowing themselves, specifically the cubes, for some reason).

I’m pretty sure my order is okay.
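For what it’s worth, the two conventions are related by transposition: a row vector times M equals M transposed times a column vector, so both multiplication orders can produce the same image as long as the storage convention matches. A tiny 2x2 sketch:

```c
#include <assert.h>

/* v * M (row-vector convention) equals M^T * v (column-vector
   convention): the two orders are equivalent up to transposing
   every matrix. Demonstrated with a 2x2 matrix. */
static void row_times_mat(const float v[2], const float m[2][2], float out[2])
{
    out[0] = v[0] * m[0][0] + v[1] * m[1][0];
    out[1] = v[0] * m[0][1] + v[1] * m[1][1];
}

static void mat_times_col(const float m[2][2], const float v[2], float out[2])
{
    out[0] = m[0][0] * v[0] + m[0][1] * v[1];
    out[1] = m[1][0] * v[0] + m[1][1] * v[1];
}

static void transpose2(const float m[2][2], float t[2][2])
{
    t[0][0] = m[0][0]; t[0][1] = m[1][0];
    t[1][0] = m[0][1]; t[1][1] = m[1][1];
}
```

So a consistent `v * World * View * Proj` pipeline is fine on its own; it only breaks if one matrix (often the projection) is built for the other convention.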