VSM Shadows Falloff inverted

Hi all. I’ve been playing with PCF and VSM for a while now and I really can’t get the effect I’m after.

I tried a little PCSS and that really killed the framerate. I went back to VSM because that is generally acknowledged to be the superior method, but there is one major snag: the falloff term seems to be incorrect (or I’m using p_max incorrectly).

float chebyshevUpperBound()
{
	// We retrieve the two moments previously stored (depth and depth*depth)
	vec4 moments = btex2D(ShadowCoordPostW.xy, blurAmount, 8.0);
	//vec2 moments = texture2D(ShadowMap, ShadowCoordPostW.xy).rg;

	// Surface is fully lit, as the current fragment is before the light occluder.
	// Hardly ever occurs because the distances will always be greater or very close to equal.
	if (ShadowCoordPostW.z <= moments.x)
		return 1.0;

	// The fragment is either in shadow or penumbra. We now use Chebyshev's upper bound
	// to check how likely this pixel is to be lit (p_max).
	float variance = moments.y - (moments.x * moments.x);
	variance = max(variance, minVariance);

	float d = ShadowCoordPostW.z - moments.x;
	float p_max = variance / (variance + d * d);
	return p_max;
}

Note the term: float p_max = variance / (variance + d * d); This suggests that p_max tends towards 1.0 as the distance gets smaller. That is completely insane IF p_max is the chance of occlusion. That would mean the shadow gets stronger the greater the distance between the blocker and the receiver.
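For what it’s worth, here is a quick numeric check of that term (in Python rather than GLSL, with a fixed variance I made up purely for illustration). It confirms the behaviour you describe: p_max is exactly 1.0 at d = 0 and falls off as d grows.

```python
# Numeric check of the Chebyshev term from the shader:
#   p_max = variance / (variance + d * d)
# The variance value here is an illustrative constant, not taken from any shader.

def p_max(d, variance=0.0002):
    """Chebyshev upper bound as written in the VSM shader."""
    return variance / (variance + d * d)

# p_max is 1.0 when the receiver sits exactly at the stored mean depth,
# and shrinks as the receiver moves further behind it:
for d in (0.0, 0.01, 0.05, 0.1):
    print(f"d = {d:.2f} -> p_max = {p_max(d):.4f}")
```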

BUT, I’ve checked GPU Gems3, the original vsm paper, the Ogre forums and several other places and they ALL have the same GLSL code.

Take a look at this:

You can see that the falloff is inverted. That’s not right.

Now I’ve tried the 1.0 - p_max trick and that doesn’t work either! :S I suspect it is something to do with the way I apply p_max:

float lightLevel() {
	float d = ShadowCoordPostW.z - lightPosition.z;
	float attenuation = 1.0 / (d * d * lightAttenuation);
	return attenuation * max(dot(vertexNormalWorld, -lightDir.xyz), 0.0);
}

void main()
{
	ShadowCoordPostW = 0.5 * (ShadowCoord.xyz / ShadowCoord.w + 1.0);

	float shadow = ReduceLightBleeding(chebyshevUpperBound(), 0.1);
	float litFactor = (1.0 - ambientLevel) * shadow * lightLevel();
	gl_FragColor = gl_Color * (litFactor + ambientLevel);
}
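In case it helps anyone reading along: the ReduceLightBleeding call above is usually the linstep remapping described in GPU Gems 3 (chapter 8). Here is that math written out in Python (a sketch of the idea, not your actual helper, which I haven’t seen):

```python
# Light-bleed reduction as described in GPU Gems 3, ch. 8: remap p_max so
# that any value below `amount` is treated as fully shadowed, and the
# remaining range is rescaled back to [0, 1].

def linstep(lo, hi, v):
    """Linear step: 0 below lo, 1 above hi, linear in between."""
    return min(max((v - lo) / (hi - lo), 0.0), 1.0)

def reduce_light_bleeding(p_max, amount):
    """Clamp small p_max values (the light-bleed artifacts) to zero."""
    return linstep(amount, 1.0, p_max)

print(reduce_light_bleeding(0.05, 0.1))  # below the cutoff: fully shadowed
print(reduce_light_bleeding(1.0, 0.1))   # fully lit stays fully lit
```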

Has anyone played with VSM much and had a similar issue? Cheers :slight_smile:

Interesting, as I too have played with VSM, no doubt using the same GLSL source as you. I concluded I had to use

return max (1.0 - p_max, 0.0);

as the result of the chebyshevUpperBound function, because returning p_max on its own was not working very well.

I’m also curious to know what values you are using for minVariance

variance = max(variance, minVariance);

to reduce light bleeding with VSM.

Yeah, why VSM does this I have no idea, and it’s really annoying. How can shadow attenuation follow that rule? Clearly something is wrong with the way I’m implementing things.

Values of 0.00002 and 0.02, depending on the effect I want, seem to work best for minVariance. When it’s low, you see too much light bleeding, which really sucks.