My data is stored in an array:
GLuint *triIndex = new GLuint[COUNT];
COUNT is too large to define a 1D texture of that size, so I use a 2D texture instead.
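The mapping from a linear array index to 2D texel coordinates looks like this (a minimal sketch; TEX_WIDTH matches the 512-wide texture below, and the helper name is my own):

```c
#include <assert.h>

#define TEX_WIDTH 512

/* Map a linear array index to (x, y) texel coordinates
   in a TEX_WIDTH-wide 2D texture (row-major layout). */
static void index_to_texel(unsigned i, unsigned *x, unsigned *y)
{
    *x = i % TEX_WIDTH;
    *y = i / TEX_WIDTH;
}
```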
I define the texture like the following:
glGenTextures(1, &triIndex_tex);
glBindTexture(GL_TEXTURE_2D, triIndex_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8UI, 512, 312,
             0, GL_RGB, GL_UNSIGNED_INT, triIndex); //an error here
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
but an error occurs saying “invalid operation”. How can I solve this?
It might have something to do with glTexImage2D(). Neither GL_UNSIGNED_INT nor GL_UNSIGNED_BYTE works.
If you want to use integer internal formats, read EXT_texture_integer carefully: an integer internal format must be paired with the matching _INTEGER pixel transfer format, and integer textures do not support linear filtering.
So replace GL_RGB with GL_RGB_INTEGER, and GL_LINEAR with GL_NEAREST.
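A sketch of the corrected upload (assuming the texture is bound as before; with GL_RGB8UI each texel holds three unsigned integer components):

```c
/* Integer internal format: the pixel transfer format must be the
   _INTEGER variant, and filtering must be GL_NEAREST. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8UI, 512, 312,
             0, GL_RGB_INTEGER, GL_UNSIGNED_INT, triIndex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
```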
Though I have difficulty understanding the article you mentioned, the replacement works! Thank you very much.
Sorry, one more question…
It seems that I cannot sample this texture in the shader.
I use the following statement:
value = texture(myTexture, index).rgb;
but value is always zero. Could it be that GLSL cannot recognize the GL_RGB_INTEGER format?
You’ll need to use an integer sampler (usampler2D for an unsigned integer texture) to read defined data from an integer texture, as the documentation says.
Also, the coordinates you pass to texture() still have to be normalized to the range 0 to 1; outside that range you get the clamped edge values.
Consider using GL_TEXTURE_RECTANGLE instead of GL_TEXTURE_2D, which lets you use non-normalized texel coordinates.
You can also use texelFetch to index with integer arguments. I think this works for all sampler types.
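Putting it together, a sketch of the shader side (assuming the texture is bound to a usampler2D uniform named myTexture, and the hypothetical helper fetchIndex mirrors the 512-wide layout; texelFetch takes integer texel coordinates plus a mipmap level):

```glsl
#version 330 core

uniform usampler2D myTexture;  // integer sampler for GL_RGB8UI data

// Fetch the texel for a linear index, assuming a 512-wide layout.
uvec3 fetchIndex(int i)
{
    ivec2 texel = ivec2(i % 512, i / 512);
    // texelFetch uses integer coordinates: no filtering, no normalization.
    return texelFetch(myTexture, texel, 0).rgb;
}
```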