Question about values of GL_RGBA32UI?

Hello guys~
In my application, I need to store 32-bit unsigned integers in an RGBA texture. The maximum value should be 4,294,967,295.

I create a texture through:
Code:
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB,0,GL_RGBA32UI,texSize,texSize,0,GL_RGBA_INTEGER,GL_UNSIGNED_INT,data);

where ‘data’ is the source unsigned integer array.

And I read back the texture data through:
Code:
glReadPixels(0, 0, texSize, texSize,GL_RGBA_INTEGER,GL_UNSIGNED_INT,result);

Then, when I print out the ‘result’ array, I get the right numbers (exactly the same as the ‘data’ array).

THE PROBLEM IS:
If I use:
Code:
glClearColor(1, 1, 1, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

and then read back the data again, every value I get is ‘1065353216’.

To make the problem clearer: if I use glClear() or glColor4ui() to render to the texture, the maximum value I can get back is ‘1065353216’, not ‘4,294,967,295’.
Why does this happen? And how can I solve it?
Thank you very much!

The whole code is pasted below.

The code is very simple:

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <GL/glew.h>
#include <windows.h>
#include <GL/glut.h>

typedef unsigned char BYTE;

int main(int argc, char **argv) {
int texSize = 4;

int i;

//to store read back data
GLuint result[4*texSize*texSize];

//to store source data
GLuint* data = (GLuint*)malloc(4*texSize*texSize*sizeof(GLuint));
for (i=0; i<texSize*texSize*4; i++)
{
    data[i] = 1; //whatever values
}

glutInit (&argc, argv);
glutCreateWindow("TEST1");
glewInit();

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0.0,texSize,0.0,texSize);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glViewport(0,0,texSize,texSize);

// Create FBO
GLuint fb;
glGenFramebuffersEXT(1,&fb);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT,fb);

//create texture buffer
GLuint fboTex;

glGenTextures (1, &fboTex);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB,fboTex);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB,GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB,GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB,GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB,GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB,0,GL_RGBA32UI,texSize,texSize,0,GL_RGBA_INTEGER,GL_UNSIGNED_INT,data);

//create render buffer
GLuint rboId;
glGenRenderbuffersEXT(1, &rboId);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rboId);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT, texSize, texSize);

//bind FBO with render buffer and texture buffer
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,GL_COLOR_ATTACHMENT0_EXT,GL_TEXTURE_RECTANGLE_ARB,fboTex,0);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT,GL_DEPTH_ATTACHMENT_EXT,GL_RENDERBUFFER_EXT,rboId);


glClearColor(1, 1, 1, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Read back the texture data
glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);
glReadPixels(0, 0, texSize, texSize,GL_RGBA_INTEGER,GL_UNSIGNED_INT,result);

//output data
for (i=0; i<texSize*texSize*4; i++)
    printf("%u\n", result[i]);

free(data);
glDeleteFramebuffersEXT (1,&fb);

glDeleteTextures(1,&fboTex);

return 0;

}

You asked for a non-normalized color format. RGBA32UI is an unsigned integral format, which means you aren’t supposed to get MAX_UINT when you give a color of 1.0f.

Now, I’m not sure how a floating-point color is supposed to be converted into an integral format for rendering. I’m guessing by the look of the number that your implementation is simply converting the bit-pattern of the floating-point color into an integer. So you’re getting what you would get if you did a “reinterpret_cast<unsigned int>(1.0f);”
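
You can verify that guess with a quick standalone check: this little C program copies the raw bits of 1.0f into an unsigned int and, on an IEEE 754 platform, prints exactly 1065353216 (0x3F800000), the value you are reading back:

Code:
#include <stdio.h>
#include <string.h>

int main(void) {
    float f = 1.0f;
    unsigned int bits;

    /* copy the raw bit pattern of 1.0f into an unsigned int */
    memcpy(&bits, &f, sizeof(bits));

    /* prints 1065353216 (0x3F800000) on IEEE 754 platforms */
    printf("%u\n", bits);
    return 0;
}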

What you really want is glClearBufferuiv. This function takes uints, which will allow you to clear a uint buffer to the color you actually want.
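
A minimal sketch of how that fits into the FBO setup from your code (assuming your driver exposes the GL 3.0 entry point glClearBufferuiv): replace the glClearColor/glClear pair with something like this.

Code:
//clear draw buffer 0 (the integer texture attached to the bound FBO)
//with real unsigned integer values instead of a float color
GLuint clearValue[4] = { 4294967295u, 4294967295u, 4294967295u, 4294967295u };
glClearBufferuiv(GL_COLOR, 0, clearValue);

//the depth attachment can still be cleared the usual way
glClear(GL_DEPTH_BUFFER_BIT);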

Thank you for your excellent answer, Alfonse. Now I understand the reason.

In my application, I want to render integer numbers into a texture, and the numbers can be pretty huge (maybe millions); that’s why I used 32-bit uints.

Actually, the numbers are the IDs of every polygon in the scene. I want to store some of these IDs in a texture channel and then do some calculations on the GPU via shaders.

So, do you have any ideas on how to render uint colors into the texture (not just clear it with glClearBufferuiv)?

Thanks again!

I’ve searched for a whole day and found nothing…

Read the EXT_gpu_shader4 and EXT_texture_integer specs again.

The shader needs to output a uvec4, not a vec4.
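
Roughly along these lines (a sketch only: the names fragID and polygonID are made up, error checking is omitted, and it assumes your driver exposes GLSL 1.30 / the GL 3.0 entry points alongside EXT_gpu_shader4):

Code:
//fragment shader that writes an unsigned integer ID, not a float color
const char *fragSrc =
    "#version 130\n"
    "uniform uint polygonID;\n"                   //ID set from the application
    "out uvec4 fragID;\n"                         //unsigned integer output, not vec4
    "void main() {\n"
    "    fragID = uvec4(polygonID, 0u, 0u, 0u);\n"
    "}\n";

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fragSrc, NULL);
glCompileShader(fs);

GLuint prog = glCreateProgram();
glAttachShader(prog, fs);
//route the uvec4 output to color number 0, i.e. the integer texture
//attached to GL_COLOR_ATTACHMENT0 of the FBO
glBindFragDataLocation(prog, 0, "fragID");
glLinkProgram(prog);

glUseProgram(prog);
glUniform1ui(glGetUniformLocation(prog, "polygonID"), 4294967295u);
//...draw your geometry; every fragment written stores the ID in the texture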

Hi, thanks very much for the reference.

So I have to use a shader to achieve that?
And I cannot get it right with vertex colors and the default fixed-function pipeline?

Yes, for both.

OK, thank you all very much! Best regards!