Sampling bit fields from a texture in a frag shader

I’m trying to query a texture’s denormalized values in a GLSL fragment shader, to use as a bit field (32 bits, with 8 bits in each of RGBA).

I’m using texelFetch (also tried texture() with NEAREST filtering).
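Stripped down, the shader side looks something like this (the uniform name and the bit test are just for illustration):

```glsl
#version 330 core

uniform usampler2D uBitField;   // bound to the GL_RGBA8UI texture
out vec4 fragColor;

void main() {
    // Fetch the raw integer texel; no normalization should happen here.
    uvec4 t = texelFetch(uBitField, ivec2(gl_FragCoord.xy), 0);

    // Reassemble the 32-bit field, R in the low byte.
    uint bits = (t.a << 24u) | (t.b << 16u) | (t.g << 8u) | t.r;

    // Example use: visualize bit 5.
    fragColor = vec4((bits & (1u << 5u)) != 0u ? 1.0 : 0.0);
}
```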

With a usampler2D and a GL_RGBA8UI internalformat, the sampled RGB values are always 0, and A comes back as gibberish.

I then tried GL_RGBA8 internalformat with a sampler2D and denormalized it myself. That worked.
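The working fallback looks roughly like this; the multiply by 255 is the “denormalize” step:

```glsl
#version 330 core

uniform sampler2D uBitField;    // GL_RGBA8 now; values arrive normalized in [0,1]
out vec4 fragColor;

void main() {
    vec4 v = texelFetch(uBitField, ivec2(gl_FragCoord.xy), 0);

    // Undo the normalization by hand: [0,1] back to 0..255 per channel.
    uvec4 t = uvec4(round(v * 255.0));
    uint bits = (t.a << 24u) | (t.b << 16u) | (t.g << 8u) | t.r;

    fragColor = vec4((bits & (1u << 5u)) != 0u ? 1.0 : 0.0);
}
```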

Questions:

  1. Is this the right way to go about this in the first place? The documentation states that internalformat is only a request, so I guess I’ll have to test on all of the hardware I’m trying to support? (See the format-query sketch after this list.)
  2. Why is the unsigned-integer internalformat behaving so strangely?
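For what it’s worth, on question 1: GL 4.3 (or GL_ARB_internalformat_query2) has glGetInternalformativ() for asking the driver about a format up front. A minimal sketch, untested on my side; GL_RGBA8UI is also on the list of required formats in core GL 3.0+, so it should be supported everywhere that matters:

```c
/* Ask whether GL_RGBA8UI is usable for 2D textures.
   Needs GL 4.3 or GL_ARB_internalformat_query2. */
GLint supported = GL_FALSE;
glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8UI,
                      GL_INTERNALFORMAT_SUPPORTED, 1, &supported);
if (supported != GL_TRUE) {
    /* fall back to the normalized GL_RGBA8 path */
}
```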

Ok. So you mean querying the integral (non-normalized) values in the texture. Denormalized numbers are something else (subnormal floats).

Ok. As for the “denormalized it myself” step: note that floatBitsToUint() just reinterprets a float’s IEEE-754 bit pattern, so it won’t undo the normalization. The built-in that does is packUnorm4x8() (core GLSL since 4.20, or GL_ARB_shading_language_packing), which converts a normalized vec4 straight into a packed 32-bit uint.
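A sketch of that, with the same placeholder uniform name as above:

```glsl
#version 420 core

uniform sampler2D uBitField;    // GL_RGBA8, normalized
out vec4 fragColor;

void main() {
    // packUnorm4x8 rounds each component * 255 and packs x into bits 0..7,
    // y into 8..15, z into 16..23 and w into 24..31.
    uint bits = packUnorm4x8(texelFetch(uBitField, ivec2(gl_FragCoord.xy), 0));
    fragColor = vec4((bits & (1u << 5u)) != 0u ? 1.0 : 0.0);
}
```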

That’s odd. Yes, afaik this is the right way. See Sampler (GLSL). There may be another bug in your code (possibly relating to how you are populating that GL_RGBA8UI texture) that’s causing these issues. Have you been Checking for OpenGL Errors?
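For example, a minimal checker to sprinkle around the texture setup (plain C; assumes your GL loader’s header is already included):

```c
#include <stdio.h>
/* assumes a GL header / loader (glad, GLEW, ...) is included above */

/* Drain and report every pending GL error; call after suspect GL calls,
   e.g. checkGLError("glTexImage2D"). */
static void checkGLError(const char *where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
}
```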

Feel free to post a short stand-alone test program that reproduces the problem, and I’m sure someone will be able to give you some feedback and possibly try it on their graphics drivers.


When you populate the texture with data (glTexImage2D / glTexSubImage2D etc.), are you using GL_RGBA_INTEGER as the (external) format? Using GL_RGBA will treat the data as normalised, which is a GL_INVALID_OPERATION with an integer internalformat, so the upload does nothing and the texture contents stay undefined.
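A sketch of an allocation/upload that should then sample correctly through a usampler2D (width, height and pixels are placeholders). Note also that integer textures are not filterable, so they need NEAREST filtering to be complete:

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* Integer textures are unfilterable: anything other than NEAREST
   (including the default mipmapped min filter on a texture with no
   mipmaps) leaves the texture incomplete, and incomplete textures
   sample as (0, 0, 0, 1). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* GL_RGBA_INTEGER is the crucial part; plain GL_RGBA with an integer
   internalformat raises GL_INVALID_OPERATION and stores nothing. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8UI, width, height, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, pixels);
```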
