glBindImageTexture with GL_TEXTURE_2D_ARRAY


I’m trying to use image load/store with a GL_TEXTURE_2D_ARRAY. I created a GL_RGB8 2D texture array and tried to bind it to an image unit. I have bound a 1D texture to an image unit before and it worked fine.

However, I could not get this texture array to work, and glGetError returns GL_INVALID_VALUE. As far as I can tell from the documentation, none of the function parameters violates the rules.
Here is the function call:

//GL_MAX_IMAGE_UNITS on my GTX1080 is 8
//camera_texture_id value seems reasonable (like 26)
glBindImageTexture(3, camera_texture_id, 0, GL_TRUE, 0, GL_READ_WRITE, GL_RGB8);

and here is the texture creation:

//uniform declaration in shader
layout(rgba8) uniform image2DArray camera_tex;
//Note: none of these lines generates an error
GLuint camera_texture_id;
glGenTextures(1, &camera_texture_id);
glBindTexture(GL_TEXTURE_2D_ARRAY, camera_texture_id);

I am completely out of ideas right now as to why it generates that error.
Maybe someone can help me?

Thank you in advance.

The issue is the GL_RGB8 format. As far as I know, image load/store only works with R, RG, and RGBA formats. In general it’s a good idea not to use RGB formats at all, except for the special formats that align naturally in memory.
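A minimal sketch of a working setup, assuming the format is switched to GL_RGBA8 and the storage is allocated before binding (the dimensions, variable names, and image unit here are illustrative, not from the original post):

```c
// Sketch: create a GL_RGBA8 2D array texture and bind it to image unit 3.
// width/height/layers are placeholder values.
GLuint camera_texture_id;
glGenTextures(1, &camera_texture_id);
glBindTexture(GL_TEXTURE_2D_ARRAY, camera_texture_id);

// Allocate immutable storage; GL_RGBA8 (unlike GL_RGB8) is a
// supported image load/store format.
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA8, width, height, layers);

// layered = GL_TRUE binds the entire array so the shader can index
// any layer; the format must match the shader's layout qualifier (rgba8).
glBindImageTexture(3, camera_texture_id, 0, GL_TRUE, 0,
                   GL_READ_WRITE, GL_RGBA8);
```

Note that the shader declaration would also need to change from `layout(rgba8)` on an RGB texture to an actual rgba8-backed image for the binding to be format-compatible.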

Thanks for your reply.

I changed the format from GL_RGB8 to GL_RGBA8 and the error is gone.

However, I have run into a new problem: memory. The array texture I’m trying to create is about 13xx * 15xx * 240, which adds up to around 2.12 GB of GPU memory (according to Nsight), and Nsight gives me an out-of-memory error when I try to view it. Currently I’m trying compressed texture formats, but I’m not sure whether a compressed format will work with image load/store and GL_READ_WRITE access (I do write to it in the shader). If you have any suggestions, I would be glad to hear them.

Compressed formats are not supported.

So the R, RG, and RGBA formats are supported, plus the special formats GL_R11F_G11F_B10F, GL_RGB10_A2, and GL_RGB10_A2UI.
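On the shader side, accessing the rebound texture with one of the supported formats would look roughly like this compute-shader sketch (binding point, names, and workgroup size are illustrative, assuming the GL_RGBA8 texture bound to image unit 3 as in the question):

```glsl
#version 430
layout(local_size_x = 8, local_size_y = 8) in;

// Binding 3 matches the glBindImageTexture call; rgba8 matches GL_RGBA8.
layout(binding = 3, rgba8) uniform image2DArray camera_tex;

void main() {
    ivec3 coord = ivec3(gl_GlobalInvocationID); // x, y, array layer
    vec4 texel = imageLoad(camera_tex, coord);  // read from the image
    imageStore(camera_tex, coord, texel);       // write back (GL_READ_WRITE)
}
```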