GLSL texture data format

I defined the texture like the following:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, 512, 312, 

so each color channel of a texel is an unsigned integer. However, when I sample the texture in the shader with the following statement:

uvec3 value;
value = texture(myTexture, index).rgb;

an error is generated saying implicit cast from "vec3" to "uvec3". What's the problem here?

Declare the sampler as "uniform usampler2D myTexture;"
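
With that declaration, texture() returns a uvec4 instead of a vec4, so the sample can be assigned without any conversion. A minimal sketch using the names from the question (the texture coordinate "index" is assumed to be a vec2):

uniform usampler2D myTexture;
in vec2 index;

void main()
{
    uvec3 value = texture(myTexture, index).rgb;
    // value now holds the raw unsigned integer channels of the texel.
}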

The problem is that you are trying to assign a floating-point vector to an unsigned integer vector. With a plain sampler2D, texture() returns a vec4; you swizzle that down to a vec3 and then try to assign it to a uvec3, and GLSL has no implicit conversion between the two.

If you want unsigned integer values, use an integer sampler (usampler2D) as proposed by Ludde.
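
For completeness, a sketch of the matching C-side setup (the pointer name "data" is an assumption, standing in for your texel buffer). For a GL_RGB32UI internal format, the pixel transfer format must be GL_RGB_INTEGER with an integer type such as GL_UNSIGNED_INT. Note also that integer textures do not support linear filtering, so the filters must be set to GL_NEAREST or the texture is incomplete and sampling returns zero:

// Upload 512x312 texels of three unsigned 32-bit integers each.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, 512, 312, 0,
             GL_RGB_INTEGER, GL_UNSIGNED_INT, data);

// Integer textures require nearest-neighbor filtering.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);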