Fragment out uint with GL_R8UI texture

Hi all!

I'm trying out some of the new features in OpenGL 3.x, which are really neat.

The problem I'm currently having is that I get strange values when I read back a texture (internal format GL_R8UI) with format GL_RED, when in the shader I have declared the output as

out uint var;

and set it with

var = uint(128);

for example, but the values I read back are somewhat random.

The shader has another output of type vec3 that works perfectly, and glGetFragDataLocation correctly reports indices 0 and 1 for the two outputs. The framebuffer is valid, since it reports complete and the vec3 attachment displays correctly. I've also disabled read clamping with glClampColor(GL_CLAMP_READ_COLOR, GL_FALSE) and set GL_PACK_ALIGNMENT to 1.
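For reference, the relevant part of my fragment shader looks roughly like this (the names and constant values are just placeholders):

```glsl
#version 130

out vec3 color_out;  // attachment 0: the vec3 output that works fine
out uint var;        // attachment 1: GL_R8UI, reads back garbage

void main()
{
    color_out = vec3(1.0, 0.5, 0.25);
    var = uint(128);
}
```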

So my question is: should I fall back to using a float instead? I'd really prefer a single integer here, because it's more suitable for the calculations I'm doing (it's not game related).

Thanks in advance!

EDIT: Floats didn't work much better either. Using glReadPixels with the framebuffer bound did work better than reading back from the texture; the values were at least consistent. They didn't match the values written in the shader though, so some sort of clamping/scaling is being applied.

I did find a thread talking about bugs in the driver.

I don't know if this helps, but… in your glTexImage?D() call, it looks like you're using GL_R8UI for the internalFormat parameter. In that case I think you also need to use GL_RED_INTEGER for the format parameter rather than GL_RED, and GL_UNSIGNED_BYTE for the type parameter.

Oh, right! I forgot to mention that: I get an error if I use GL_RED, so GL_RED_INTEGER is of course the correct enum to use :slight_smile:
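For anyone finding this later, here's a rough sketch of the calls involved (the texture dimensions, attachment index, and surrounding setup are made up; this assumes the GL_R8UI texture is attached to the bound framebuffer at color attachment 1):

```c
/* Allocate the single-channel unsigned-integer texture.
 * internalFormat is GL_R8UI; for integer textures the client-side
 * format must be GL_RED_INTEGER, not GL_RED. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8UI, width, height, 0,
             GL_RED_INTEGER, GL_UNSIGNED_BYTE, NULL);

/* ... attach to the FBO and render ... */

/* Read back with the framebuffer still bound. */
GLubyte *pixels = malloc(width * height);
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glReadBuffer(GL_COLOR_ATTACHMENT1);
glReadPixels(0, 0, width, height,
             GL_RED_INTEGER, GL_UNSIGNED_BYTE, pixels);
```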

I think there's a bug in the NVIDIA drivers, though: when I tried the same thing with a vec4 texture, the application crashed inside the driver.

glReadPixels does work with the vec4 texture when the framebuffer is bound, though. So I'm able to get the output now, case closed :slight_smile: