Writing color and depth in GLSL 1.40

Hi

I’m trying to write to both a color buffer and a depth buffer from a fragment shader, where both are bound to an FBO.
Color is bound as a color attachment, and depth as a depth attachment.

In older GLSL versions you would use:

gl_FragData[0] = colordata;
gl_FragDepth = depthdata;

But in GLSL 1.40 gl_FragData is deprecated, and you need a user-defined out variable for color buffers. However, when I use this I get the linker error: “built-in and user-defined fragment shader outputs may not be used at the same time”.

Looking through the GLSL 1.40 documentation, it doesn’t seem like gl_FragDepth has been deprecated, so in order to write depth I have (?) to use a built-in shader output, which apparently is not going to work when a user-defined output is also used.

Does anyone know how to do this correctly in GLSL 1.40, or is the NVIDIA linker at fault?

Thanks

It’d help if we could see your fragment shader code.

The code isn’t that interesting; I’m just trying to write both a color value and a depth value in a single pass, which in GLSL < 1.30 would be something like:



#version 120 // any version < 1.30

void main()
{
	float depth   = 0.5;                      // placeholder depth value
	vec4 someData = vec4(1.0, 0.0, 0.0, 1.0); // placeholder color value

	gl_FragData[0] = someData; // built-in color output
	gl_FragDepth   = depth;    // built-in depth output
}


and in GLSL >= 1.30



#version 140

out vec4 fragData; // user-defined output replaces gl_FragData[0]

void main()
{
	float depth   = 0.5;                      // placeholder depth value
	vec4 someData = vec4(1.0, 0.0, 0.0, 1.0); // placeholder color value

	fragData     = someData; // user-defined color output
	gl_FragDepth = depth;    // gl_FragDepth is still a built-in in 1.40
}


However, the second example fails to link because of that error. So I was wondering whether maybe you should use a user-defined out variable for depth as well, or whether there is a problem with the linker.

I just noticed OpenGL 3.2 is already out (I was using 3.1), so I installed that to see if it would fix things, but now I’m getting a ton more deprecation errors in my shaders, which worked perfectly fine in 3.1, argh :(

I’m at the point where I think maybe I should fully upgrade all my stuff to the new standards (which is going to be some work) instead of the mixed compatibility mode I am using now, but I’m not really sure of the benefits. Seems to me you’re going to have to emulate a lot of stuff in your own code that OpenGL used to do for you in the past :P
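As a rough sketch of what I mean, here is a forward-compatible vertex shader where the matrix and vertex attribute that the fixed pipeline used to provide now have to be supplied by the application (the uniform and attribute names are just placeholders):


#version 140

// gl_ModelViewProjectionMatrix and gl_Vertex are unavailable in a
// forward-compatible context, so the application supplies them itself.
uniform mat4 modelViewProjection; // placeholder uniform name
in vec4 position;                 // placeholder attribute name

void main()
{
	gl_Position = modelViewProjection * position;
}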

OpenGL 3.2 + GLSL 1.5 core is peaches and cream.

Works for me on:
GL3.1 + glsl #version 140
GL3.2 + glsl #version 150
GL3.2 + glsl #version 150 core

190.57 drivers, haven’t tried .58 yet
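For reference, a minimal #version 150 core fragment shader along those lines would be something like this (the output name and written values are just placeholders):


#version 150 core

out vec4 fragColor; // user-defined color output; defaults to location 0 when it is the only output

void main()
{
	fragColor    = vec4(1.0, 0.0, 0.0, 1.0); // placeholder color
	gl_FragDepth = 0.5;                      // built-in depth output, allowed alongside fragColor
}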

After upgrading to OpenGL 3.2 (while also converting everything to forward-compatible mode), the shader now runs correctly. So it seems it was a linker bug in the NVIDIA OpenGL 3.1 drivers.
