I am trying to access an integer 2D lookup table from a vertex shader, following code from the web, but I can't get any meaningful values.
The lookup table is defined as a 2D texture as follows:
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA16I_EXT, 16, 256, 0, GL_ALPHA_INTEGER_EXT, GL_INT, &tex);
In the shader I define an integer sampler:
uniform isampler2D intTex;
and try to read a value by:
int read(int i, int j) {
    return texelFetch2D(intTex, ivec2(j, i), 0).a;
}
The code compiles and links, yet I don’t get the expected value.
I suspect that what I'm doing is not according to the spec. I'm not sure whether it's even possible to access an integer texture
with integer coordinates (I'm quite confused reading the spec…).
Does your implementation support EXT_texture_integer? Which version of OpenGL does it support?
Note that while most of that extension was incorporated into core in OpenGL 3.0, support for integer alpha, luminance and intensity textures (e.g. GL_ALPHA16I_EXT) wasn’t included. If you want a single-channel texture in OpenGL 3.0, you use the red channel (e.g. GL_R16I).
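For a single-channel integer texture on a 3.0+ core context, the upload would look roughly like this (a sketch: the dimensions and data pointer are carried over from the original call, and the filtering lines reflect the spec requirement that integer textures use nearest filtering, otherwise sampling yields undefined results):

```c
/* GL_R16I replaces GL_ALPHA16I_EXT; GL_RED_INTEGER replaces GL_ALPHA_INTEGER_EXT. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16I, 16, 256, 0, GL_RED_INTEGER, GL_INT, &tex);

/* Integer textures cannot be linearly filtered; use nearest filtering. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```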
The GLSL side is covered by EXT_gpu_shader4, which was incorporated into GLSL 1.30. Quoting the specification:
Integer and unsigned integer functions of all the texture lookup functions described in this section are also provided, except for the “shadow” versions, using function overloading. Their prototypes, however, are not listed separately. These overloaded functions use the integer or unsigned-integer versions of the sampler types and will return an ivec4 or an uvec4 respectively, except for the “textureSize” functions, which will always return an integer, or integer vector.
IOW, you can use the texelFetch* functions with integer samplers.
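Putting that together, the shader lookup could look like this (a sketch assuming a GL_R16I texture and GLSL 1.30, where the function is plain texelFetch rather than the EXT_gpu_shader4 name texelFetch2D; intTex and the j/i argument order are taken from the question):

```glsl
#version 130

uniform isampler2D intTex;

// texelFetch takes unnormalized integer texel coordinates and a mip level;
// with an isampler2D it returns an ivec4.
int read(int i, int j)
{
    return texelFetch(intTex, ivec2(j, i), 0).r; // .r, not .a, for a GL_R16I texture
}
```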