No matrix-vector multiplication in fragment shader with ATI on Linux


I have a strange problem here. If I use this simple fragment shader:

varying vec4 shaded_color;

void main(void) {
   gl_FragColor = shaded_color;
}

everything works fine. But if I change it like this:

varying vec4 shaded_color;

void main(void) {
   mat4 test = mat4(
      1.0, 0.0, 0.0, 0.0,
      0.0, 1.0, 0.0, 0.0,
      0.0, 0.0, 1.0, 0.0,
      0.0, 0.0, 0.0, 1.0
   );
   vec4 testVec = test * gl_FragCoord;

   gl_FragColor = shaded_color;
}

all I get is a black window. It doesn't matter which matrix I multiply with which vector, or whether they are 2-, 3- or 4-dimensional. As soon as I multiply a matrix with a vector, the fragment shader seems to discard the fragment, but I don't get any error message.

I use Gentoo Linux with a Radeon 9600 Pro and the latest ATI drivers.

Any idea?

I was wrong: it only fails if I use gl_FragCoord as the vector. With vec4(1.0, 2.0, 3.0, 4.0) instead it works. But I really need gl_FragCoord…
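Since the literal vector works but gl_FragCoord does not, one thing that might be worth trying (purely a guess at a driver workaround, not a confirmed fix) is copying gl_FragCoord into a plain local variable before the multiplication:

```glsl
varying vec4 shaded_color;

void main(void) {
   // Copy the built-in into a local temporary first; a buggy compiler
   // may handle a temporary better than the built-in used directly.
   vec4 coord = gl_FragCoord;
   mat4 test = mat4(1.0);           // identity matrix
   vec4 testVec = test * coord;     // multiply using the copy
   gl_FragColor = shaded_color;
}
```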

I'm not sure (somebody should correct me if I'm wrong), but I think ATI supports neither gl_FragCoord nor the derivative functions (dFdx, dFdy, fwidth).


I didn't get what you are using testVec for… ATI should support gl_FragCoord, but doesn't support dFdx, dFdy etc.


There is no point to the calculation of testVec; it is only for testing.

I just tried it on an NVIDIA card (FX 5700), and there everything works fine. Seems to be a problem with ATI.

Well, this is no real proof. :wink:
The compiler should remove useless instructions and keep only this:

varying vec4 shaded_color;

void main(void) {
   gl_FragColor = shaded_color;
}

You need to source gl_FragCoord into an output to be sure.
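To make sure the multiplication actually survives optimization, its result has to feed into gl_FragColor. A minimal sketch (the 0.0001 scale factor is just an arbitrary way to keep the contribution visually negligible while forcing the compiler to keep the instruction):

```glsl
varying vec4 shaded_color;

void main(void) {
   mat4 test = mat4(1.0);                  // identity matrix
   vec4 testVec = test * gl_FragCoord;     // the multiply under test
   // Fold the result into the output so it cannot be eliminated
   // as dead code:
   gl_FragColor = shaded_color + 0.0001 * testVec;
}
```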

I just tried:

gl_FragColor = normalize(gl_FragCoord);

which produces a nice color gradient on NVIDIA, and a black window with ATI.
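As an aside, gl_FragCoord.xy is in window coordinates (pixels), so normalize() on the full vec4 gives values dominated by x and y. A more controlled gradient divides by the viewport size; here u_viewport is a hypothetical uniform the application would have to set itself:

```glsl
// u_viewport is an assumed uniform holding the window size in pixels.
uniform vec2 u_viewport;

void main(void) {
   // Map window-space x/y into [0,1] for a red/green gradient.
   gl_FragColor = vec4(gl_FragCoord.xy / u_viewport, 0.0, 1.0);
}
```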
