How do you create and display a 16-bit unsigned integer RGBA image in OpenGL?

I am learning OpenGL and have successfully created and displayed a GL_RGBA32F texture. Now I’m trying to create a GL_RGBA16UI texture, fill it with the color red in a compute shader and then display it on a screen quad. All I get is a black window and I have no idea what is wrong.

C++ snippet:

unsigned int texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16UI, 512, 512, 0, GL_RGBA, GL_UNSIGNED_SHORT, NULL);

glBindImageTexture(0, texture, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA16UI);

Compute Shader:

#version 430 core

layout (local_size_x = 1, local_size_y = 1, local_size_z = 1) in;

// image
layout (rgba16ui, binding = 0) uniform uimage2D img;

void main() {
    ivec2 texelCoord = ivec2(gl_GlobalInvocationID.xy);
    uvec4 texCol = uvec4(65535, 0, 0, 65535);
    imageStore(img, texelCoord, texCol);
}

Fragment Shader:

#version 430 core
out vec4 FragColor;
	
in vec2 TexCoords;
	
uniform usampler2D tex;
	
void main()
{             
    uvec4 texCol = texture(tex, TexCoords);
    FragColor = texCol;
}

Thank you in advance.

FragColor = texCol; just casts the integer values to floating-point values, so an integer value of 32768 becomes 32768.0f. You probably intended to normalize those values, which requires dividing by 65535.

I tried it; it’s still not working.
This is what I did:

FragColor = texCol / 65535;

OK, I probably should have been clearer: you need to divide by 65535.0. The .0 forces floating-point math so it doesn’t do integer division.
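
In other words, the end of the fragment shader’s main would look something like this (a sketch of the suggested fix, using the names from the shader posted above):

uvec4 texCol = texture(tex, TexCoords);
// convert the unsigned integer texel to float, then normalize 0..65535 to 0..1
FragColor = vec4(texCol) / 65535.0;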

A debug context should be reporting GL_INVALID_OPERATION here, and the texture will be left incomplete, because you didn’t use GL_RGBA_INTEGER. With an integer internal format such as GL_RGBA16UI, the pixel transfer format has to be an integer format; glTexImage2D rejects plain GL_RGBA and never defines the texture.
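
The allocation call should look something like this (the data pointer can stay NULL; the format/type pair is still validated):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16UI, 512, 512, 0, GL_RGBA_INTEGER, GL_UNSIGNED_SHORT, NULL);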

This is yet another common problem avoided by using glTexStorage.
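
For example, something along these lines (a sketch; glTexStorage2D needs GL 4.2 or ARB_texture_storage):

// allocates immutable storage for one mip level; no format/type pair to get wrong
glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA16UI, 512, 512);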

That’s still not working. I tried checking whether the texture is even being written to with this code in the fragment shader:

uvec4 texCol = texture(tex, TexCoords);
vec4 color = vec4(0.0, 0.0, 0.0, 1.0); // default: black
if (texCol.r != 65535u)
    color = vec4(1.0, 0.0, 0.0, 1.0);  // red if the expected value is missing

FragColor = color;

What I’m doing here is checking whether the red channel is 65535; if it isn’t, the fragment shader displays red, and indeed it does. I’m not sure how to interpret this: maybe the compute shader isn’t working properly, or maybe the fragment shader isn’t receiving the texture properly.

I changed the format to GL_RGBA_INTEGER and it’s still not working. I will try to set up debug output; maybe that will make things clearer.
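
This is roughly what I plan to add for the debug output (a sketch using the GL 4.3 debug callback; the context has to be created as a debug context for it to report everything):

// needs <iostream> and the usual GL loader header
void GLAPIENTRY debugCallback(GLenum source, GLenum type, GLuint id, GLenum severity,
                              GLsizei length, const GLchar* message, const void* userParam)
{
    std::cerr << "GL debug: " << message << std::endl;
}

// right after context creation
glEnable(GL_DEBUG_OUTPUT);
glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);
glDebugMessageCallback(debugCallback, nullptr);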

I kind of solved it. What I did was make a second texture, a GL_RGBA32F one, use a second compute shader to copy the pixel data into it, and then display that second texture.

Second shader code:

#version 430 core

layout (local_size_x = 1, local_size_y = 1, local_size_z = 1) in;

// images
layout (rgba16ui, binding = 0) uniform uimage2D original;
layout (rgba32f, binding = 1) uniform image2D img;

void main() {
	ivec2 texCoord = ivec2(gl_GlobalInvocationID.xy);
	uvec4 texel = imageLoad(original, texCoord);
	
	vec4 color = vec4(texel) / 65535.0; // normalize 0..65535 to 0..1
	
	imageStore(img, texCoord, color);
}
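
On the C++ side I bind both images before dispatching the second shader, roughly like this (a sketch; outputTexture stands for the new 512x512 GL_RGBA32F texture, the name is just for illustration):

glBindImageTexture(0, texture, 0, GL_FALSE, 0, GL_READ_ONLY, GL_RGBA16UI);
glBindImageTexture(1, outputTexture, 0, GL_FALSE, 0, GL_WRITE_ONLY, GL_RGBA32F);
glDispatchCompute(512, 512, 1);
// make the image writes visible before sampling the result in the fragment shader
glMemoryBarrier(GL_SHADER_IMAGE_ACCESS_BARRIER_BIT | GL_TEXTURE_FETCH_BARRIER_BIT);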

This feels like a hack, though, so if you have a better solution please say so.