Odd Results with the MESA drivers and gl_Vertex

I’m not sure whether to post this here or in the linux forum. But here it goes…

I have a varying variable that saves the raw z position in the vertex shader as such:

ZPosition = gl_Vertex.z;

Normally, this works great with Nvidia’s or ATI/AMD’s drivers. But when I ran it under the Mesa drivers, the x position was saved instead. In fact, the x position was saved with any of these lines:

ZPosition = gl_Vertex.x;
ZPosition = gl_Vertex.y;
ZPosition = gl_Vertex.z;

But it produced sensible results when gl_Vertex was transformed via:

gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
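For context, a minimal vertex shader combining the two lines above might look like this (a sketch; ZPosition is the varying from the post, everything else is assumed):

```glsl
varying float ZPosition;

void main()
{
    // Pass the untransformed object-space z coordinate to the fragment shader.
    ZPosition = gl_Vertex.z;

    // Standard fixed-function-equivalent vertex transform.
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```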

This is very strange, especially since, according to glxinfo, this set of Mesa drivers should support up to OpenGL 2.1. Has anyone else had this issue with the Mesa drivers?

If this is indeed a driver bug, you may be able to work around it with some boilerplate code.
Since you know that gl_Vertex is OK, and that more complex operations on it are OK too, try dotting it with a vector that extracts the component you need, e.g.:

uniform float zero;  // never set by the application, so it defaults to 0.0
ZPosition = dot(vec4(zero, zero, 1.0, zero), gl_Vertex);

Since your matrix computations produce correct results, this has a chance of working as well; Mesa may simply be mishandling the trivially simple case.
The point of the exercise is to show that there is a bug in the driver; this is hardly a good long-term workaround.
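Put together, the workaround shader might look like the sketch below. The key identity is that dot(vec4(0, 0, 1, 0), v) equals v.z, while routing the zeros through an unset uniform keeps the compiler from folding the expression back into a plain component read (names other than ZPosition are assumptions):

```glsl
uniform float zero;  // never set by the application, so it defaults to 0.0

varying float ZPosition;

void main()
{
    // dot((zero, zero, 1, zero), gl_Vertex) == gl_Vertex.z when zero == 0.0,
    // but the uniform makes the value opaque to the shader compiler.
    ZPosition = dot(vec4(zero, zero, 1.0, zero), gl_Vertex);

    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```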

Of course, since you didn’t show your full shader, it’s hard to blame the driver before more is known.
