I have a 3D cube of data with one packed 64-bit value per texel that I need to transfer to the GPU so I can do bitwise operations on it.
Would this be the correct way to create the texture?
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA16UI_EXT, BW, BH, BD, 0, GL_RGBA, GL_UNSIGNED_INT, data);
What I don't quite understand is what happens if I do this in GLSL:
uint alpha = texture3D( cubeTexture, vec3( 0.1, 0.1, 0.1 ) ).a;
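From what I've read, unsigned integer textures may require an integer sampler type rather than a plain `sampler3D`, so I wonder if I should actually be writing something like this instead (just my guess from skimming the GLSL 1.30 spec, not verified):

```glsl
#version 130
uniform usampler3D cubeTexture;  // my guess: a GL_RGBA16UI texture needs a usampler3D
out uvec4 fragData;

void main() {
    // texelFetch takes integer texel coordinates and skips filtering entirely
    uint alpha = texelFetch(cubeTexture, ivec3(0, 0, 0), 0).a;
    fragData = uvec4(alpha, 0u, 0u, 0u);
}
```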
What is the size of 'alpha'? Is it a 16-bit unsigned int? That doesn't seem right. Does OpenGL convert the data I used when creating the texture to a 32-bit uint when sampling? If so, how can I get the last 10 bits?