I’m trying to use image load/store with a GL_TEXTURE_2D_ARRAY. I created a GL_RGB8 2D texture array and tried to bind it to an image unit. I have bound a 1D texture to an image unit before and that works fine.
However, I could not get this texture array to work: glGetError returns GL_INVALID_VALUE. I checked the function parameters against the documentation, and as far as I can tell none of them violates the rules.
Here is the function call:
```cpp
// GL_MAX_IMAGE_UNITS on my GTX 1080 is 8
// camera_texture_id's value seems reasonable (like 26)
glBindImageTexture(3, camera_texture_id, 0, GL_TRUE, 0, GL_READ_WRITE, GL_RGB8);
```
and here is the texture creation:
```glsl
// uniform declaration in the shader
layout(rgba8) uniform image2DArray camera_tex;
```

```cpp
// Note: none of these lines generates an error
GLuint camera_texture_id;
glActiveTexture(GL_TEXTURE3);
glGenTextures(1, &camera_texture_id);
glBindTexture(GL_TEXTURE_2D_ARRAY, camera_texture_id);
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGB8, TEXTURE_WIDTH, TEXTURE_HEIGHT, 240);
```
I am completely out of ideas as to why this call generates that error.
Can anyone help me?
Thank you in advance.