I hit a very strange Z-buffer interpolation issue while working on 2D shadow mapping. I’ll try to explain it as best as I can using a simple example that still reproduces the issue.
I render just one 2D line into a shadow map texture that is 1px high. In the vertex shader, I calculate the X coordinate of each vertex simply as the angle at which the light ray hits that vertex. The Y coordinate is always zero, since the texture is effectively one-dimensional. The Z coordinate is the distance from the vertex to the light source, normalized to [0, 1].
Here’s an image that explains this process:
[ATTACH=CONFIG]1733[/ATTACH]
This “shadow map” texture is then used to draw the actual shadows.
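For reference, here is the same mapping written out on the CPU, as a plain Python sketch (the light position and vertex below are hypothetical example values; RAY_LEN matches the shader constant):

```python
import math

RAY_LEN = 1000.0  # maximum light ray length, same constant as in the shader

def shadow_map_coords(vertex, light):
    """Map a 2D vertex to (x, z) in the 1px-high shadow map.

    x: angle of the light ray through the vertex, remapped [-pi, pi] -> [-1, 1]
    z: distance from the vertex to the light, normalized to [0, 1]
    """
    rx, ry = vertex[0] - light[0], vertex[1] - light[1]
    x = math.atan2(ry, rx) / math.pi
    z = math.hypot(rx, ry) / RAY_LEN
    return x, z

# Example: a vertex up and to the right of the light at a 45-degree angle
x, z = shadow_map_coords((110.0, 110.0), (100.0, 100.0))
print(x, z)  # x = 0.25, z = sqrt(200) / 1000
```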
Everything works as expected until a light ray aligns perfectly with the line. See the image below:
[ATTACH=CONFIG]1734[/ATTACH]
When this happens, both vertices are mapped to the same fragment, which by itself would be fine. But that fragment’s Z coordinate is wrong: somehow it is less than the minimum of the two vertex-to-light distances.
If you look at the shadow, this shows up as a shadow spike that extends past the line segment towards the light source. Please see this animation as an example:
https://imgur.com/a/qMRz1
or this zoomed screenshot:
[ATTACH=CONFIG]1735[/ATTACH]
Vertex shader:
#define PI 3.14159265
#define RAY_LEN 1000.0

uniform vec2 u_LightPos;

float CalcAngle(vec2 v, vec2 light)
{
    vec2 r = v - light;
    return atan(r.y, r.x);
}

void main()
{
    float angle = CalcAngle(gl_Vertex.xy, u_LightPos);
    float x = angle / PI;                                  // [-PI, PI] -> [-1, 1]
    float z = length(gl_Vertex.xy - u_LightPos) / RAY_LEN; // [0, 1]
    gl_Position = vec4(x, 0.0, z * 2.0 - 1.0, 1.0);
}
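As a sanity check on the z * 2.0 - 1.0 remap: with w = 1 there is no perspective divide, so NDC z equals clip z, and the default glDepthRange(0, 1) maps it back with z * 0.5 + 0.5. The normalized distance should therefore arrive in gl_FragCoord.z unchanged. A Python sketch of that round trip (just the fixed-function arithmetic, not the actual GL pipeline):

```python
def depth_after_pipeline(dist_norm):
    """Follow a normalized distance through the fixed-function depth steps."""
    z_clip = dist_norm * 2.0 - 1.0   # what the vertex shader writes
    z_ndc = z_clip / 1.0             # perspective divide, w = 1
    frag_z = z_ndc * 0.5 + 0.5       # viewport transform, default depth range [0, 1]
    return frag_z

for d in (0.0, 0.25, 0.5, 1.0):
    assert abs(depth_after_pipeline(d) - d) < 1e-12
print("round trip preserves the normalized distance")
```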
Fragment shader:
#version 330

uniform vec2 u_LightPos;

out vec4 frag_color;

void main()
{
    float z = gl_FragCoord.z;
    frag_color = vec4(z, z, z, 1.0);
}
I’ve been trying to find the cause of this issue for several days now. I’ve tested it on multiple devices and platforms (Linux, Android), with multiple frameworks (SFML, Cocos2D), different GL versions, etc. I’m fairly certain this is not a hardware or driver bug, since it always behaves consistently.
My guess is that OpenGL somehow fails to interpolate the Z value correctly when both line vertices are mapped very close together. But that seems mathematically impossible: there is no way to get a value x that is less than min(a, b) while interpolating between a and b.
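That claim is easy to verify numerically: any convex combination of a and b stays inside [min(a, b), max(a, b)], and getting below the minimum would require a weight outside [0, 1]. A quick brute-force check (endpoint depths are arbitrary example values; the tolerance only absorbs floating-point rounding):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b with weight t in [0, 1]."""
    return a * (1.0 - t) + b * t

a, b = 0.37, 0.62  # two arbitrary endpoint depths
for i in range(1001):
    t = i / 1000.0
    v = lerp(a, b, t)
    assert min(a, b) - 1e-12 <= v <= max(a, b) + 1e-12
print("interpolation never leaves [min(a, b), max(a, b)]")
```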
I’d really appreciate any suggestions or ideas on how to find out why this happens.