glUniform1f() seems broken - where is the error?

After a night of debugging a GLSL demo program in Linux,
I finally found the very strange cause of the error:

The function glUniform1f() seems to be broken, as
well as its ARB counterpart. If I retrieve the
function pointer through the extension mechanism,
it works OK. However, I don't need to fetch anything
from OpenGL 2.1 for my program to compile and run;
it all seems to be included by default, and every
other function seems to work fine. It's just this
single function, glUniform1f(), that won't work as intended.

The symptom is that the call silently stores a bad
value in the float uniform: the value seen in the GLSL
program is garbage, and so is the value retrieved by
glGetUniformfv() for that uniform. The very similar
function glUniform1i() works fine, as does everything
else related to shaders.
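
To make it concrete, the failing pattern is essentially this (the identifiers here are placeholders, not my actual code):

GLint loc = glGetUniformLocation(prog, "myFloat");
glUseProgram(prog);
glUniform1f(loc, 1.0f);             // reports no GL error...

GLfloat check = 0.0f;
glGetUniformfv(prog, loc, &check);  // ...but 'check' comes back as garbage

glUniform1i(glGetUniformLocation(prog, "myInt"), 1);  // the int variant works fine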

The error occurs with ATI and NVIDIA GPUs alike, on
both Ubuntu 9.04 and 9.10. I have not tested any other
Linux distribution. On Mac OS X the function seems to
work fine. On Windows the point is moot, since I have
to fetch almost every function pointer through the
extension mechanism there anyway.

I use GLFW for the window handling, but nothing else,
i.e. no extension loader like GLEW or GLee.
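
For reference, the setup is nothing unusual; it is roughly along these lines (a simplified GLFW 2.x sketch, not my exact code):

#include <GL/glfw.h>   // GLFW 2.x header, which pulls in the system GL/gl.h

glfwInit();
// Plain window, no context version hints, so I just get the default context
glfwOpenWindow(640, 480, 8, 8, 8, 8, 24, 8, GLFW_WINDOW);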

I suspect a simple cut-and-paste typo in some part of
the underlying library code or bindings for OpenGL in
Linux, but I don’t even know where to begin to look.
Can anyone give me a hint on where I should look to
find and hopefully fix the error?

I am on Ubuntu too. Could you post your code? I would like to see if I can replicate the problem on my machine.

I tried a quick check on my machine with my own code, and the following worked fine:


glUniform1f(glGetUniformLocation(shaderprogram, "zzz"), 0.99);

with these lines in my fragment shader:


uniform float  zzz;
...
color = vec4(vec3(zzz,zzz,zzz),1.0);

I verified this by changing 0.99 to 0.1 and seeing that the color changed accordingly.
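
You could also cross-check with the readback you described; something like this (an untested sketch):

GLfloat value = 0.0f;
GLint loc = glGetUniformLocation(shaderprogram, "zzz");
glUseProgram(shaderprogram);
glUniform1f(loc, 0.99f);
glGetUniformfv(shaderprogram, loc, &value);
printf("zzz readback = %f\n", value);   // should print 0.990000 if the call worked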

The difference may be that I used the following instead of gl.h:


// Ensure we are using OpenGL's core profile only
#define GL3_PROTOTYPES 1
#include <GL3/gl3.h>

...
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

I know this makes it GL3, but it may give a clue.

For reference, some of my configuration info:
Ubuntu 9.10 32bit
GLEW version 1.5.1
Reporting capabilities of display :0.0, visual 0x2b
Running on a GeForce 9600 GT/PCI/SSE2/3DNOW! from NVIDIA Corporation
OpenGL version 3.2.0 NVIDIA 190.18.05 is supported
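
Those last lines just come from the usual glGetString() startup queries, roughly like this (not my exact code):

printf("Running on a %s from %s\n",
       (const char *)glGetString(GL_RENDERER),
       (const char *)glGetString(GL_VENDOR));
printf("OpenGL version %s is supported\n",
       (const char *)glGetString(GL_VERSION));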
