I have a question about implementing variance shadow mapping.

I followed the NVIDIA GDC slides and implemented the variance shadow mapping function as follows.

```
// Setup
1. GL_RGBA32F_ARB texture used to store depth and depth^2
2. shadow map resolution: 512 * 512
3. near and far planes for both the scene and depth passes: 1 and 500
4. separable Gaussian blur used to blur the depth & depth^2 texture
```
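For reference, the 1-D kernel of a separable Gaussian blur can be sketched like this (a minimal Python sketch of the math only; the radius and sigma values are illustrative, not taken from my actual setup):

```python
import math

def gaussian_weights(radius, sigma):
    """Half of a normalized 1-D Gaussian kernel; index 0 is the center tap.

    A separable blur applies this kernel once horizontally and once
    vertically, which is much cheaper than a full 2-D convolution.
    """
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(radius + 1)]
    # Off-center taps are sampled on both sides, so they count twice.
    total = weights[0] + 2.0 * sum(weights[1:])
    return [w / total for w in weights]

w = gaussian_weights(3, 1.5)
# The kernel sums to 1 and falls off away from the center tap.
```

The same weights would be uploaded as uniforms to the horizontal and vertical blur passes.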

```
float getLightContribution(vec4 shadowCoordinateWdivide)
{
    float dist_to_light = shadowCoordinateWdivide.z;
    vec2 moments = texture2D(shadowTexture, shadowCoordinateWdivide.xy).rb;

    // Fully lit if the receiver is in front of the mean occluder depth.
    float litFactor = 0.0;
    if (dist_to_light <= moments.x) {
        litFactor = 1.0;
    }

    // Chebyshev upper bound from the two moments.
    //float variance = 0.00000001;
    float variance = moments.y - (moments.x * moments.x);
    float m_d = moments.x - dist_to_light;
    float p_max = variance / (variance + m_d * m_d);

    return max(p_max, litFactor);
}
```
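The body of the function is just Chebyshev's inequality applied to the two filtered moments; in scalar form the math it computes is (a Python sketch of the same arithmetic, not the shader itself):

```python
def chebyshev_upper_bound(m1, m2, dist):
    """Upper bound on the fraction of light reaching a receiver at depth
    `dist`, given the first two moments of the filtered shadow map:
    m1 = E[depth], m2 = E[depth^2]."""
    if dist <= m1:
        return 1.0  # receiver is in front of the average occluder
    variance = m2 - m1 * m1
    d = dist - m1
    return variance / (variance + d * d)
```

So a receiver exactly at the mean occluder depth gets 1.0, and the bound falls off with the squared distance behind it, scaled by the local depth variance.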

With this I get no shadow at all. But if I replace the variance calculation with a small fixed value, like `float variance = 0.00000001;`, the shadow appears and looks somewhat correct.

I doubt this is actually correct, though, since it skips the variance calculation entirely.
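For what it's worth, plugging a tiny fixed variance into the formula shows what it degenerates to: p_max is nearly 0 for any receiver measurably behind the mean occluder depth, so `max(p_max, litFactor)` reduces to a plain binary depth comparison, which would explain why the result merely "looks somewhat correct" (a Python sketch of the arithmetic):

```python
variance = 0.00000001           # the hard-coded value from the workaround
for d in (0.001, 0.01, 0.1):    # receiver depth minus mean occluder depth
    p_max = variance / (variance + d * d)
    print(d, p_max)
# p_max stays below 0.01 even for d = 0.001, so the soft Chebyshev term
# contributes almost nothing and the hard litFactor test dominates.
```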

Can somebody explain this to me? There is no mention of this problem in the GDC slides.

My video card is an ATI 4670 with the 9.12 driver.

This is the image when the shadow works (using variance = 0.00000001):

This is the image when the shadow didn't work (variance calculated as it is supposed to be):