Fragment shader not working in OpenGL 2/GLSL 1.20

I have a fragment shader with the following code:

#version 120
void main(void) {
    if ( gl_Color.a > 0 ) {
        gl_FragColor.rgb = gl_Color.rgb;
        gl_FragColor.a = 0.2;
    }
}

I run this on my machine, which has an NVIDIA card with GLSL 1.3, and it works fine. I have a user who tried to run it on their machine, which has an ATI graphics card with OpenGL 2.1 and GLSL 1.2, and there it does not set the fragment alpha value.

Is there something that wouldn’t work here on OpenGL 2.1/GLSL 1.2? I’m stumped…


The obvious thing is that you aren’t always writing to gl_FragColor; it is only set when the if branch is taken. You should probably initialize it to vec4(0.0) first.
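A minimal sketch of that suggestion: initialize the output before the branch, so it is defined on every code path (writing vec4(0.0) as the fallback is just one choice):

#version 120
void main(void) {
    // Define the output up front so no path leaves it uninitialized
    gl_FragColor = vec4(0.0);
    if (gl_Color.a > 0.0) {
        gl_FragColor.rgb = gl_Color.rgb;
        gl_FragColor.a = 0.2;
    }
}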

Besides that, it may not be a shader problem at all; it could be a buffer problem.

V-man, thanks. What do you mean it could be a buffer problem?

Perhaps you don’t have an alpha buffer.
You can check with glGetIntegerv(GL_ALPHA_BITS, &bits) if you are using a backward-compatible context.

I was having the same issue with a fragment shader, except mine was changing the color rather than the alpha. The same machine had the same problem: the shader wouldn’t work.

So what happens if you take the IF statement out:

#version 120
void main(void) {
    //if ( gl_Color.a > 0 ) {
        gl_FragColor.rgb = gl_Color.rgb;
        gl_FragColor.a = 0.2;
    //}
}

Presumably the problem with this shader is that it does not always set gl_FragColor, because the write depends on the incoming per-vertex color’s alpha being > 0.
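One way to guarantee every path writes the output is an explicit else branch (a sketch; passing the vertex color through unchanged in the else case is an assumption about the intended effect):

#version 120
void main(void) {
    if (gl_Color.a > 0.0) {
        gl_FragColor = vec4(gl_Color.rgb, 0.2);
    } else {
        gl_FragColor = gl_Color;  // assumed fallback: pass through unchanged
    }
}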

I’ll have our users test that out if possible. However, the shader does work on most graphics cards; my PC runs GLSL 1.3 without issue. The machine we have problems with has GLSL 1.2, so I wouldn’t have thought it would work on one but not the other. But we’ll give it a shot.

if ( gl_Color.a > 0 )

One more thought: GLSL 1.20 does not allow implicit conversion from int to float, so some compilers reject a value written as an integer where a float is expected. The if statement above should be

if ( gl_Color.a > 0.0 )
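Putting this together with the earlier advice to always write gl_FragColor, the shader might look like this (a sketch; the 0.2 alpha and the unconditional pass-through are assumptions about the intended behavior):

#version 120
void main(void) {
    gl_FragColor = gl_Color;   // defined on every path
    if (gl_Color.a > 0.0) {    // float literal: no implicit int-to-float in GLSL 1.20
        gl_FragColor.a = 0.2;
    }
}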

Thanks, I’ll give that a try as well.

Will the version number cause problems too, since the 120 in #version 120 is also an integer?

That’s how the version is written.
I don’t know why you have to leave the “.” out, but that is what the ARB decided.

Hmm…still no luck. I’m completely stumped. I have it working on numerous machines except on this Dell Optiplex 980 with an ATI Radeon 3450 (latest drivers from Dell, not from AMD though).

It turns out there are other issues with some rendering to a frame buffer I am doing. I am making an additional post regarding that.

So you can make it work on every platform except the Dell Optiplex 980. Why don’t you install the official Catalyst 11.5 driver and give it a try?
