I use Transform Feedback to speed up calculations on vertices.
In this scenario, I pass each vertex to the shader as three floats for the coordinates and one int for the color, and read the result back into a Java ByteBuffer containing the interleaved data.
Every vertex is given the color 0xFFFFFFFF.
On macOS the retrieved color is 0xFFFFFFFF as expected, whether the type passed to glVertexAttribPointer is GL_INT or GL_UNSIGNED_INT.
But on Windows, the retrieved color is 0xBF800000 when GL_INT is passed to glVertexAttribPointer, and 0x4F800000 when GL_UNSIGNED_INT is passed.
The problem seems to come from the shader's color input variable, which is passed straight through to the output variable: if I instead assign -1 to the output variable directly, I do get the expected value 0xFFFFFFFF back.
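For what it's worth, both Windows values decode as IEEE-754 float bit patterns: 0xBF800000 is the bit pattern of -1.0f, and 0x4F800000 is the bit pattern of 2^32, i.e. 0xFFFFFFFF rounded to the nearest float. This is just an observation about the values, easy to check in Java:

```java
public class BitPatterns {
    public static void main(String[] args) {
        // 0xBF800000 reinterpreted as a float is -1.0 (signed -1 converted to float)
        System.out.println(Float.intBitsToFloat(0xBF800000)); // -1.0
        // 0xFFFFFFFF (4294967295) rounded to the nearest float is 2^32,
        // whose bit pattern is 0x4F800000
        float f = (float) 0xFFFFFFFFL;
        System.out.println(Integer.toHexString(Float.floatToIntBits(f))); // 4f800000
    }
}
```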
This is what the relevant part of the shader looks like:
#version 330 core
in vec3 vertexStart;
in vec3 vertexEnd;
in int colorStart;
in int colorEnd;
out vec3 p0;
out int c0;
out vec3 p1;
out int c1;
out vec3 p2;
out int c2;
out vec3 p3;
out int c3;
...
void main() {
...
c0 = colorStart;
c1 = colorEnd;
c2 = colorEnd;
c3 = colorStart;
}
I therefore suspect the problem manifests itself in the colorStart and colorEnd input variables, since if I assign the output value directly (for example "c0 = -1;" instead of "c0 = colorStart;"), that is the value I read back.
The code executed is strictly identical on both platforms, since it is in Java.
The versions are OpenGL 3.3, on Windows 11 and macOS 13.
I’ve already spent quite some time on this problem without finding an explanation, let alone a solution.