I run this on my machine, which has an NVIDIA card with GLSL 1.3, and it works fine. I have a user who tried to run it on their machine, which has an ATI graphics card with OpenGL 2.1 and GLSL 1.2, and there it does not set the fragment alpha value.
Is there something that wouldn’t work here on OpenGL 2.1/GLSL 1.2? I’m stumped…
I was also having the same issue, though with a fragment shader that changed the color rather than the alpha. That machine had the same problem: the shader simply wouldn’t work.
Presumably the problem with this shader is that it does not always set gl_FragColor, because the write depends on the incoming per-vertex color attribute being > 0. When that condition fails, gl_FragColor is left unwritten, and drivers differ in what they do with an undefined output.
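As a sketch (the actual shader isn’t quoted in this thread, so the condition and values below are made up for illustration), a GLSL 1.20 fragment shader can avoid this by giving gl_FragColor a defined value on every path:

#version 120

void main()
{
    // Write gl_FragColor unconditionally first, so no code path leaves it undefined.
    gl_FragColor = gl_Color;

    // Hypothetical conditional effect, standing in for the original per-vertex test.
    if (gl_Color.r > 0.0)
        gl_FragColor.a = 0.5;
}

A fragment that exits without ever writing gl_FragColor produces undefined results, which matches the “works on one card but not another” symptom described above.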
I’ll have our users test that out if possible. The shader does work on most graphics cards, though: my PC runs GLSL 1.3 and it’s fine, and the problem machine has GLSL 1.2, so I wouldn’t have thought it would work on one but not the other. But we’ll give it a shot.
One more thought: some GLSL 1.20 compilers are strict about integer literals and won’t implicitly convert them to floats where a float is expected. So the comparison in the if statement should be written with float literals (0.0 rather than 0), as shown below.
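For example, with a hypothetical test against zero (the actual if statement isn’t quoted in the thread):

if (gl_Color.a > 0)      // integer literal; some strict GLSL 1.20 compilers reject the implicit conversion
if (gl_Color.a > 0.0)    // float literal; accepted everywhere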
Hmm… still no luck. I’m completely stumped. I have it working on numerous machines, just not on this Dell Optiplex 980 with an ATI Radeon 3450 (latest drivers from Dell, though not from AMD).