Framebuffer Integer Texture Attachment

hi, i’m trying to set up a framebuffer with several texture attachments.
2 of those texture attachments should store integer values (a material ID etc.), but i can’t get it right.
i want to read the current framebuffer values back to the application at the cursor position; that works if i use vec4 as the fragment shader output and GL_RGBA as the texture format.

the problem is: how can i read back integer values correctly?

the fragment shader is trivial:


#version 330 core

layout (location = 0) out vec4 Fragment0;
layout (location = 1) out int  Fragment1;
layout (location = 2) out int  Fragment2;

void main(void)
{
	Fragment0 = vec4(1, 0, 0, 1);
	Fragment1 = 2;      // any constant integer value, for testing
	Fragment2 = 500;
}

the framebuffer setup:
i have 3 color attachments plus a depth attachment:
the 1st is just an RGBA texture for color,
the 2nd/3rd textures are those integer textures,
and the 4th texture is the depth texture


// initialize 4 textures for framebuffer:
// set filter parameters
for (unsigned int i = 0; i < 4; i++)
{
	glBindTexture(GL_TEXTURE_2D, framebuffer.textures[i]);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
	glBindTexture(GL_TEXTURE_2D, 0);
}

// allocate memory:

// color texture
glBindTexture(GL_TEXTURE_2D, framebuffer.textures[0]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);

// objectID texture
glBindTexture(GL_TEXTURE_2D, framebuffer.textures[1]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, width, height, 0, GL_RED, GL_INT, 0);

// faceID texture
glBindTexture(GL_TEXTURE_2D, framebuffer.textures[2]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, width, height, 0, GL_RED, GL_INT, 0);

// depth texture
glBindTexture(GL_TEXTURE_2D, framebuffer.textures[3]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, width, height, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0);

glBindTexture(GL_TEXTURE_2D, 0);

// attach to framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,	framebuffer.textures[0], 0);
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,	framebuffer.textures[1], 0);
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT2,	framebuffer.textures[2], 0);
glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,	framebuffer.textures[3], 0);
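
(side note, not in the original snippet: a completeness check right after attaching catches setup mistakes early)

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
	std::cerr << "framebuffer incomplete: 0x" << std::hex << status << std::endl;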

when rendering:
i’m binding my framebuffer, setting the draw buffers, and rendering a simple triangle with the shader.
when using the same RGBA format for each color attachment (and vec4 as the fragment shader output), it works.
but my problem is that i can’t draw integers into those textures; i don’t know what internal/external format to use and what arguments to pass to glReadPixels(…) :frowning:


// render:
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

unsigned int drawbuffers[] = {
		GL_COLOR_ATTACHMENT0, 		// color
		GL_COLOR_ATTACHMENT1, 		// objectID
		GL_COLOR_ATTACHMENT2, 		// faceID
};
glDrawBuffers(sizeof(drawbuffers) / sizeof(unsigned int), drawbuffers);

// clear buffers
glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);


	// render
	glUseProgram(program);
	glBindVertexArray(vertexarray);
	glDrawArrays(GL_TRIANGLES, 0, 3);
	glBindVertexArray(0);
	glUseProgram(0);


	
	// read all 3 pixels from the color attachments
	int pixel0 = 0, pixel1 = 0, pixel2 = 0;

	int x = Main::Instance().cursor()->Position().x;
	int y = Main::Instance().cursor()->Position().y;
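	// note (not in the original code): glReadPixels uses a bottom-left origin,
	// so if the cursor position is reported with a top-left origin (as most
	// window systems do), flip it first: y = height - y - 1;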

	glReadBuffer(GL_COLOR_ATTACHMENT0);
	glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &pixel0);

	glReadBuffer(GL_COLOR_ATTACHMENT1);
	glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &pixel1);

	glReadBuffer(GL_COLOR_ATTACHMENT2);
	glReadPixels(x, y, 1, 1, GL_RED, GL_INT, &pixel2);

	std::cout << pixel0 << "       " << pixel1 << "       " << pixel2 << std::endl;

> glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, width, height, 0, GL_RED, GL_INT, 0);

That creates a normalised texture (with an unspecified number of bits). For an integer texture, you need e.g.


glTexImage2D(GL_TEXTURE_2D, 0, GL_R8I, width, height, 0, GL_RED_INTEGER, GL_INT, 0);

(or GL_R16I or GL_R32I, depending upon the desired number of bits).

The type parameter specifies the type of the data used to initialise the texture. It has no effect upon the texture’s internal format. If the data parameter is null (and you aren’t reading from a pixel buffer object), the type parameter is largely ignored (but not entirely: if type refers to one of the packed formats, it must have the correct number of components for format, even if data is null).
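
To illustrate that point (my sketch, not from the post): the internal format fixes the texture’s storage once, while format/type only describe whatever client data is handed over, at creation time or later:

glBindTexture(GL_TEXTURE_2D, framebuffer.textures[1]);

// storage is 32-bit signed integer regardless of the type parameter; data is null here
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32I, width, height, 0, GL_RED_INTEGER, GL_INT, 0);

// a later upload describes its own client data, e.g. 16-bit ints into the same texture
std::vector<short> ids(width * height, 0);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_RED_INTEGER, GL_SHORT, ids.data());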

I have a similar thing: besides doing deferred shading, I also render each object’s assigned value to an extra texture (to know which object I’m looking at).

I use this function to read color from the framebuffer:

// Get the color under the mouse cursor
unsigned char pixelRGBarray[4] = { 0 };
unsigned int * returnControlValue = (unsigned int *) &pixelRGBarray[0];

if (this->ownerWindow->windowControlUnit->cuCursorEnabled == 1) {
	// take the control value from the cursor position (y flipped to GL's bottom-left origin)
	glReadPixels(ownerWindow->mouseX, (ownerWindow->height - ownerWindow->mouseY - 1), 1, 1, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8_REV, &pixelRGBarray[0]);
} else {
	// take the control value from the center of the screen
	glReadPixels(ownerWindow->width / 2, ownerWindow->height / 2, 1, 1, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8_REV, &pixelRGBarray[0]);
}

// alpha would otherwise offset the integer used as the object ID
pixelRGBarray[3] = 0;

// Check the list of control objects
if ((*returnControlValue) != 0)
{
	if (this->listOfControlObjects.count(*returnControlValue) == 1) {
		*(this->listOfControlObjects.at(*returnControlValue)->controlRegisterAddr) = 1;
	}
}

Here I basically create an array of 4 chars and also treat it as an int; then I load the value from the framebuffer at the pixel location and set alpha to 0 (because it would otherwise offset the integer that I use as the object ID).
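
A minimal sketch of that packing idea (my own illustration with a hypothetical helper name; it assumes the ID fits into the 24 bits of RGB, with alpha unused):

// fragment shader side (for reference), splitting the ID into 8-bit channels:
//   FragColor = vec4(float(id & 0xFF), float((id >> 8) & 0xFF), float((id >> 16) & 0xFF), 0.0) / 255.0;

// CPU side: reassemble the ID from the bytes returned by glReadPixels
unsigned int DecodeObjectID(const unsigned char rgba[4])
{
	return rgba[0] | (rgba[1] << 8) | (rgba[2] << 16);	// alpha deliberately ignored
}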

Keep in mind:
- you have to use the 8888_REV format (I’m pretty sure it’s a Windows thing)
- you can only read pixels from the 1st texture that is attached to your framebuffer (not sure if that means attachment 0 or just the 1st activated one)
- floats are accurate enough to represent 256 levels of a color, so you will not have any issues loading an int value and reading it back from the buffer
- make sure you don’t apply

Also, I’m pretty sure the “integer textures” are only integer textures until you load them into OpenGL; the system will then convert them to RGBA. So if you give the system a texture with 1 char per pixel, it will give you an RGBA texture where the RGB channels are equal.

> Also, I’m pretty sure the “integer textures” are only integer textures until you load them into OpenGL; the system will then convert them to RGBA. So if you give the system a texture with 1 char per pixel, it will give you an RGBA texture where the RGB channels are equal.

Neither of these statements is true. Your code simply doesn’t create proper integer textures.

thanks for your answers, especially to GClements :slight_smile: it works now!

to initialize the texture for 32-bit integers:


glTexImage2D(GL_TEXTURE_2D, 0, GL_R32I, width, height, 0, GL_RED_INTEGER, GL_INT, 0);

to read back the drawn 32-bit integers:


int pixel = 0;
glReadBuffer(GL_COLOR_ATTACHMENT1); // wherever the texture is attached ...
glReadPixels(x, y, 1, 1, GL_RED_INTEGER, GL_INT, &pixel);
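
One more detail (my addition, not from the thread): the spec leaves glClear’s result undefined for integer color buffers, so clearing the integer attachments explicitly with glClearBufferiv is the safe route. The draw-buffer indices below follow the glDrawBuffers order used earlier:

// clear draw buffers 1 (objectID) and 2 (faceID) to -1, i.e. "no object"
const int clearValue[4] = { -1, 0, 0, 0 };
glClearBufferiv(GL_COLOR, 1, clearValue);
glClearBufferiv(GL_COLOR, 2, clearValue);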

ps: @CaptainSnugglebottom
in previous days i also used a little workaround: i converted all integers (4 byte-sized components) to vec4 color values and then read back just 4 GL_UNSIGNED_BYTE values (format: GL_RGBA); that works too