I’m trying to read a texture’s unnormalized integer values in a GLSL fragment shader, to use each texel as a bit field (32 bits, with 8 bits in each of RGBA).
I’m sampling with texelFetch() (I also tried texture() with GL_NEAREST filtering).
With a usampler2D and a GL_RGBA8UI internalformat, the sampled RGB values are always 0, and A reads back as garbage.
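For reference, here’s a simplified sketch of the failing setup; the texture size, variable names, bit-packing order, and GLSL version are placeholders, not my exact code:

```c
/* Host-side sketch: allocate and fill the integer texture.
   Integer internal formats require a *_INTEGER pixel-transfer format
   and GL_NEAREST filtering, which is what I believe I'm doing here. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8UI, width, height, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, pixels);
```

And the shader side:

```glsl
#version 330 core
uniform usampler2D bits;
out vec4 fragColor;

void main() {
    // Fetch the raw 0..255 integer channels and reassemble the 32-bit field.
    uvec4 texel = texelFetch(bits, ivec2(gl_FragCoord.xy), 0);
    uint field = (texel.a << 24u) | (texel.b << 16u)
               | (texel.g << 8u)  |  texel.r;
    fragColor = vec4(0.0); // placeholder; real use of `field` omitted
}
```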
I then tried a GL_RGBA8 internalformat with a sampler2D and un-normalized the values myself (scaling the [0, 1] channels back up by 255). That worked.
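The working fallback looks roughly like this (again a sketch; the round() before converting is my own choice, to avoid truncation error):

```glsl
#version 330 core
uniform sampler2D bits;
out vec4 fragColor;

void main() {
    // Channels come back normalized to [0, 1]; undo the value/255 scaling.
    vec4 texel = texelFetch(bits, ivec2(gl_FragCoord.xy), 0);
    uvec4 bytes = uvec4(round(texel * 255.0));
    uint field = (bytes.a << 24u) | (bytes.b << 16u)
               | (bytes.g << 8u)  |  bytes.r;
    fragColor = vec4(0.0); // placeholder; real use of `field` omitted
}
```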
- Is this the right way to go about this in the first place? The documentation states that internalformat is only a request; do I really have to test every piece of hardware I’m trying to support? (There’s a query sketch after these questions.)
- Why is the uint internalformat behaving so strangely?
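On the first question: rather than testing blindly, I assume I can at least ask the driver what it actually allocated, something like:

```c
/* Host-side sketch: query the internal format the driver actually chose
   for level 0 of the currently bound texture. */
GLint actualFormat = 0;
glBindTexture(GL_TEXTURE_2D, tex);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT,
                         &actualFormat);
/* e.g. 0x8D7C == GL_RGBA8UI */
```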