So, I’ve been writing a geometry shader that expands a point into a quad for drawing billboards. It wasn’t working correctly, so I simplified it to ignore the vertex shader output and just generate a quad in the middle of the screen:
#version 330 core

layout ( points ) in;
layout ( triangle_strip, max_vertices = 4 ) out;

out vec2 gsTexCoord;
out vec4 gsColour;

void main()
{
    gl_Position = vec4(-0.5, -0.5, 0, 1);
    gsTexCoord  = vec2(0, 0);
    gsColour    = vec4(0.1);
    EmitVertex();

    gl_Position = vec4(0.5, -0.5, 0, 1);
    gsTexCoord  = vec2(0, 0.5);
    gsColour    = vec4(0.2);
    EmitVertex();

    gl_Position = vec4(-0.5, 0.5, 0, 1);
    gsTexCoord  = vec2(0.5, 0);
    gsColour    = vec4(0.3);
    EmitVertex();

    gl_Position = vec4(0.5, 0.5, 0, 1);
    gsTexCoord  = vec2(1, 1);
    gsColour    = vec4(0.4);
    EmitVertex();

    EndPrimitive();
}
With a fragment shader that just outputs a constant vec4(1), I get the expected output:
[ATTACH=CONFIG]173[/ATTACH]
However, if I change the fragment shader to output gsColour instead, the positions of the quad vertices change:
#version 330 core

in vec2 gsTexCoord;
in vec4 gsColour;

out vec4 Colour;

void main()
{
    Colour = gsColour;
}
[ATTACH=CONFIG]174[/ATTACH]
In fact, if you compare the screenshot against the geometry shader code, you may notice that the on-screen vertex positions match the values I’m writing to gsTexCoord! It seems as though reading gsColour in the fragment shader is somehow causing the x and y components of gl_Position to be overwritten by the values stored in gsTexCoord.
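One thing I haven’t tried yet (so this is just a guess at a diagnostic, not something I’ve tested) is pinning the interface down with explicit location qualifiers, so the linker can’t possibly be matching the two interpolators to different slots in each stage. As far as I know this needs #version 410, or GL_ARB_separate_shader_objects on older contexts. Something like:

// Geometry shader outputs (untested diagnostic; requires #version 410
// or GL_ARB_separate_shader_objects):
layout(location = 0) out vec2 gsTexCoord;
layout(location = 1) out vec4 gsColour;

// And the matching fragment shader inputs:
layout(location = 0) in vec2 gsTexCoord;
layout(location = 1) in vec4 gsColour;

If the quad renders correctly with the locations forced like this, that would at least point at some interface-matching problem rather than a bug in my emit logic.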
Can someone please explain what incredibly stupid thing I’m doing wrong here?