I’m desperately trying to use an integer texture in my application, to retrieve pure integer values in the fragment shader.
I’ve created a 2D texture, and I write data to it with the following parameters passed to glTexImage2D and glTexSubImage2D:
internalformat = GL_RGB8UI format = GL_RGB_INTEGER type = GL_UNSIGNED_BYTE
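For reference, the upload code looks roughly like this (a sketch; `width`, `height`, `pixels`, and the sub-rectangle variables stand in for my real values):

```c
/* Allocate an unsigned-integer RGB texture: one byte per channel,
 * fetched as raw integers (no normalization) in the shader. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8UI,
             width, height, 0,
             GL_RGB_INTEGER, GL_UNSIGNED_BYTE, pixels);

/* Later updates go through the same format/type pair. */
glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, w, h,
                GL_RGB_INTEGER, GL_UNSIGNED_BYTE, subPixels);
```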
Then I’ve bound the texture to a usampler2D uniform in my shader, and I try to retrieve texels from it using:
uvec4 value = texelFetch(sampler, ivec2(...), 0);
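The relevant part of the fragment shader looks like this (simplified; the uniform name, the incoming coordinate, and the output are placeholders, and the final division is only there to visualize the result):

```glsl
#version 330 core

uniform usampler2D uSampler;  // integer sampler bound to the GL_RGB8UI texture

flat in ivec2 texCoord;       // integer texel coordinate
out vec4 fragColor;

void main()
{
    // texelFetch does no filtering: it returns the raw texel at the given mip level
    uvec4 value = texelFetch(uSampler, texCoord, 0);
    fragColor = vec4(value) / 255.0;
}
```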
However, the values retrieved are always 0, whatever I put in the texture.
I’ve checked in several ways that the data stored in the texture is correct — reading it back from the texture, using gDEbugger, etc. — and OpenGL raises no error, so I think I’m doing it right.
I’m also calling glGenerateMipmap after every upload, in case that has any impact.
For technical details, I’m running under GNU/Linux/Debian. GL info is:
vendor: NVIDIA Corporation
renderer: GeForce 9800M GTS/PCI/SSE2
version: 3.3.0 NVIDIA 290.10
shading_language_version: 3.30 NVIDIA via Cg compiler
And I’m using a manually created GLX context, requested explicitly with OpenGL 3.3 core profile support.
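The context is created along these lines (a sketch; framebuffer-config selection is omitted and `display`, `fbConfig` are placeholders for my real setup):

```c
/* Request an explicit OpenGL 3.3 core-profile context
 * via the GLX_ARB_create_context extension. */
static const int ctxAttribs[] = {
    GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
    GLX_CONTEXT_MINOR_VERSION_ARB, 3,
    GLX_CONTEXT_PROFILE_MASK_ARB,  GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
    None
};

GLXContext ctx = glXCreateContextAttribsARB(display, fbConfig,
                                            NULL /* no sharing */,
                                            True /* direct rendering */,
                                            ctxAttribs);
```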
If anybody has any idea what I’m doing wrong… it would be awesome.
Of course, the exact same code, done with a floating-point texture and a floating-point sampler (with the exact same bytes loaded up), works perfectly.