After a night of debugging a GLSL demo program in Linux,
I finally found the very strange cause of the error:
The function glUniform1f() appears to be broken, as does
its ARB counterpart glUniform1fARB(). If I retrieve the
function pointer through the extension mechanism instead,
it works fine. However, I don't need to fetch anything
from OpenGL 2.1 for my program to compile and run: it all
seems to be included by default, and every other function
works as expected. It's just this single function,
glUniform1f(), that won't work as intended.
The failure mode is that the function silently writes a
garbage value to the uniform variable: the value seen in
the GLSL program is garbage, and so is the value read
back with glGetUniformfv() for that uniform.
The very similar function glUniform1i() works fine,
as does everything else related to shaders.
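For reference, here is a minimal sketch of the check that exposes the symptom. It assumes a current OpenGL 2.1 context and a successfully linked program object; the uniform name "myFloat" is hypothetical.

```c
/* Sketch of the failing check, assuming a current GL 2.1 context
 * and a linked program `prog` with a float uniform "myFloat"
 * (the uniform name is hypothetical). */
#include <GL/gl.h>

void check_uniform1f(GLuint prog)
{
    GLint loc = glGetUniformLocation(prog, "myFloat");

    glUseProgram(prog);
    glUniform1f(loc, 0.5f);   /* direct call through the linked symbol */

    GLfloat readback = 0.0f;
    glGetUniformfv(prog, loc, &readback);
    /* On the affected systems, `readback` is garbage rather than 0.5f.
     * The analogous glUniform1i() call on an int uniform reads back
     * correctly under the same conditions. */
}
```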
The error occurs on ATI and NVIDIA GPUs alike, and on
Ubuntu 9.04 and 9.10; I have not tested any other Linux
distribution. On Mac OS X the function works fine. On
Windows the point is moot, as I have to fetch almost
every function pointer through the extension mechanism
anyway.
I use GLFW for window handling, but nothing else,
i.e. no extension loader such as GLEW or GLee.
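For completeness, this is the workaround that currently works for me: calling through a pointer fetched with GLFW's glfwGetProcAddress() (which wraps glXGetProcAddress() on Linux) instead of calling the linked symbol directly. A sketch, assuming a current context:

```c
/* Sketch of the workaround: bypass the symbol exported by libGL and
 * call glUniform1f through a pointer fetched via the extension
 * mechanism. Assumes a current OpenGL context. */
#include <GL/gl.h>
#include <GL/glext.h>  /* PFNGLUNIFORM1FPROC */
#include <GL/glfw.h>   /* GLFW 2.x header */

static PFNGLUNIFORM1FPROC pglUniform1f;

void init_uniform1f_workaround(void)
{
    /* glfwGetProcAddress must be called after the context exists. */
    pglUniform1f = (PFNGLUNIFORM1FPROC) glfwGetProcAddress("glUniform1f");
}

void set_float_uniform(GLint loc, GLfloat value)
{
    /* This indirect call sets the uniform correctly, while a direct
     * call to glUniform1f() on the affected systems does not. */
    pglUniform1f(loc, value);
}
```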
I suspect a simple cut-and-paste typo somewhere in the
underlying library code or OpenGL bindings on Linux, but
I don't even know where to begin looking. Can anyone give
me a hint on where to look so I can find and, hopefully,
fix the error?