[GLSL] Error in matrix/vector multiplication

Hi!
I have this matrix:

400 0 0 0
0 300 0 0
0 0 0.5 0
400 300 0.5 1
And this vector:

0 -23.6964 -0.20002 0

The multiplication in the shader is column-major, so I should get these values:

x = 0*400 + (-23.6964)*0 + (-0.20002)*0 + 0*400 = 0
y = 0*0 + (-23.6964)*300 + (-0.20002)*0 + 0*300 = -7108.92
z = 0*0 + (-23.6964)*0 + (-0.20002)*0.5 + 0*0.5 = -0.10001
w = 0*0 + (-23.6964)*0 + (-0.20002)*0 + 0*1 = 0

But in my shader the resulting vector is null (0, 0, 0, 0):

tri.position[j] = viewportMatrix * t;
if (gl_GlobalInvocationID.x == x && gl_GlobalInvocationID.y == y &&
    gl_GlobalInvocationID.z == tindex && j == 1) {
    print(ivec2(0, 10), 1, vec4(1, 0, 0, 1), tri.position[j]);
}

Why?
When I perform a row-major multiplication on the CPU, I get the correct result.

There is no such thing as “row-major” or “column-major” matrix multiplication. There is just matrix multiplication; the math does the same thing regardless. It’s the data in your matrix that can be interpreted as “row-major” or “column-major”.
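For example, here is a minimal GLSL sketch of that distinction (the compute-shader wrapper and the variable names a/b/c are only there to make it a compilable illustration). The mat4 constructor consumes its arguments column by column, so feeding it the four written rows actually makes them columns, and which convention you get depends on which side of the matrix the vector sits:

#version 430
layout(local_size_x = 1) in;

void main() {
    // Each vec4 argument becomes a COLUMN of M, not a row.
    mat4 M = mat4(vec4(400.0,   0.0, 0.0, 0.0),    // column 0
                  vec4(  0.0, 300.0, 0.0, 0.0),    // column 1
                  vec4(  0.0,   0.0, 0.5, 0.0),    // column 2
                  vec4(400.0, 300.0, 0.5, 1.0));   // column 3

    vec4 v = vec4(0.0, -23.6964, -0.20002, 0.0);

    vec4 a = M * v;             // column-vector convention: a[i] = dot(row i of M, v)
    vec4 b = v * M;             // row-vector convention:    b[j] = dot(v, column j of M)
    vec4 c = transpose(M) * v;  // same result as b
}

Same data, same matrix multiplication; only the interpretation of the data differs.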

Also:

This matrix is transposed. The translational component is supposed to go in the 4th column, not the 4th row.
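For reference, transposing the posted matrix gives the conventional column-vector layout, with the translation (400, 300, 0.5) in the 4th column:

400 0 0 400
0 300 0 300
0 0 0.5 0.5
0 0 0 1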

Did you confuse the +/- symbol with the multiplication symbol? Also, why are there a bunch of leading zeros?

In any case, if you make your matrix data correct (i.e., transpose it from what you’ve written) and do the matrix multiplication you claim to want (i.e., right-multiply the vector into the matrix), you’re supposed to get 400 for the X component.

So it’s not GLSL that’s the problem here.

OK, I’ve found the problem: I had just forgotten to test whether w > 0 before the division in the shader:

if (t.w > 0)
    t /= t.w;

tri.position[j] = viewportMatrix * t;

Now it works!
SOLVED!

… why are you dividing by W? That’s pretty much never a thing a shader should be doing.

Because it’s a compute shader, and I need to go from world coordinates to viewport coordinates to compute the intersection of the triangle vertices with the ray (the ray is in viewport coordinates). To go from clip coordinates to viewport coordinates, you have to divide by w.

There is no default pipeline with a compute shader, so I have to do everything manually in the compute shader, even vertex interpolation.
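For completeness, a minimal sketch of that manual transform chain in a compute shader; the uniform and buffer names (worldToClip, viewportMatrix, Vertices, Results) and the layout are assumptions for illustration, not the poster’s actual code:

#version 430
layout(local_size_x = 64) in;

uniform mat4 worldToClip;     // projection * view
uniform mat4 viewportMatrix;  // NDC -> window coordinates

layout(std430, binding = 0) buffer Vertices { vec4 worldPos[]; };
layout(std430, binding = 1) buffer Results  { vec4 windowPos[]; };

void main() {
    uint i = gl_GlobalInvocationID.x;
    if (i >= uint(worldPos.length()))
        return;

    // World -> clip space.
    vec4 clip = worldToClip * worldPos[i];

    // Clip -> NDC: the perspective divide, only when w is positive
    // (points at or behind the eye would need clipping instead).
    if (clip.w > 0.0)
        clip /= clip.w;

    // NDC -> window/viewport coordinates.
    windowPos[i] = viewportMatrix * clip;
}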