Problem with using multiple offscreen framebuffers

I want to use multiple framebuffers sequentially. (For example: render the scene into the first buffer → apply the HDR shader → blit the first buffer into the second → apply the FXAA shader → output to the renderbuffer.)

Here’s what I do in code:

unsigned int hdrFBO;
unsigned int fxaaFBO;
unsigned int RBO;

InitializeBuffers();

loop
{
    // Bind first buffer
    glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);

    // Clear screen
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render start
    glEnable(GL_DEPTH_TEST);
    glViewport(0, 0, 624, 480);
    DrawScene();
    glDisable(GL_DEPTH_TEST);
    // Render finish

    // Apply post-process shader
    hdrShader.use(); // <- 1

    // Unbind first buffer
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Blit buffers
    glBindFramebuffer(GL_READ_FRAMEBUFFER, hdrFBO);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fxaaFBO);
    glBlitFramebuffer(0, 0, 624, 480, 0, 0, 624, 480, GL_COLOR_BUFFER_BIT, GL_NEAREST);

    // Bind second buffer
    glBindFramebuffer(GL_FRAMEBUFFER, fxaaFBO);
    // Apply anti-aliasing shader
    fxaaShader.use(); // <- 2
    // Unbind second buffer
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Blit to renderbuffer
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fxaaFBO);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, RBO);
    glBlitFramebuffer(0, 0, 624, 480, 0, 0, 624, 480, GL_COLOR_BUFFER_BIT, GL_NEAREST);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Draw post-processed screen quad
    glBindVertexArray(quadVAO);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, RBO); // use the now resolved color attachment as the quad's texture
    glDrawArrays(GL_TRIANGLES, 0, 6);
}

With two framebuffers, only the last post-process shader (2) is applied.
When I use just one of these buffers together with the renderbuffer, everything works correctly.

As I understand it, the blit call should copy the contents of the first buffer into the second, so that everything after “glBindFramebuffer(GL_READ_FRAMEBUFFER, fxaaFBO);” would be applied to the already post-processed HDR frame from the first framebuffer.
The problem is that applying the FXAA shader seems to overwrite the frame while ignoring the HDR result, producing a frame with anti-aliasing but without HDR.
If I remove the “fxaaShader.use();” line, the renderbuffer receives the frame with HDR applied from the first buffer.

I don’t understand where the problem is, can somebody help me?

Shaders don’t get applied to framebuffers. A rendering operation writes to a framebuffer, but it writes to the draw framebuffer, not the read framebuffer. But since you rebound fxaaFBO to just GL_FRAMEBUFFER (which binds to both draw and read), that’s not your problem.

But my first sentence seems to be applicable here: shaders don’t get applied to framebuffers. You can use a shader to render to a framebuffer. But unless fxaaShader.use() invokes a rendering command (and if it does, it is horribly misnamed), you never actually render with the shader. So obviously, nothing in fxaaFBO gets rendered to.

It’s also unclear exactly what the point of the blit operation is. You render to one or more textures in hdrFBO. You should then presumably perform a rendering operation using those textures. But you never do that, as far as I can tell.
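For reference, a single post-processing pass normally looks like the sketch below rather than a blit: bind the destination framebuffer, bind the *texture* from the previous pass, select the shader, and then actually draw. This is only a sketch; hdrColorTex is a placeholder name for whatever color texture is attached to hdrFBO.

```cpp
// Sketch of one post-process pass (placeholder names, not the OP's code):
// reads hdrColorTex (the texture attached to hdrFBO), writes into fxaaFBO.
glBindFramebuffer(GL_FRAMEBUFFER, fxaaFBO);   // destination of the pass
glClear(GL_COLOR_BUFFER_BIT);

hdrShader.use();                              // select the program...
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, hdrColorTex);    // source: previous pass's color texture

glBindVertexArray(quadVAO);
glDrawArrays(GL_TRIANGLES, 0, 6);             // ...and render with it; this draw is
                                              // the step that "applies" the shader
```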

The .use() function just calls glUseProgram().

Do I have to render the screen quad after every shader .use()?

EDIT: I’ve managed to get it working by binding the FBO and drawing the screen quad after each shader .use(), but now it causes some weird artifacts/flickering.

You’ve misunderstood. Shaders define how rendering works. If you say “use this program”, that doesn’t really do anything until you subsequently say “render”. That rendering operation will use the program currently in use.

It’s like setting a variable but never reading from it.
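In GL terms, the “variable” is the context’s current-program state, and a draw call is the only thing that “reads” it. A sketch, using the thread’s own names:

```cpp
fxaaShader.use();                 // glUseProgram: writes the current-program state
// ...nothing has been rendered yet; the state has only been set...
glBindVertexArray(quadVAO);
glDrawArrays(GL_TRIANGLES, 0, 6); // the "read": this draw uses whatever program is current
```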

Okay, I think I got it now.

I believe this is roughly the right order:

{
    // Select first buffer
    glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);

    // Clear
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render scene
    Render();

    // Select first shader
    hdrShader.use();

    // Render screen quad with first shader (offscreen)
    glBindVertexArray(quadVAO);
    glBindTexture(GL_TEXTURE_2D, hdrFBO);
    glDrawArrays(GL_TRIANGLES, 0, 6);

    // Blit buffers
    glBindFramebuffer(GL_READ_FRAMEBUFFER, hdrFBO);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fxaaFBO);
    glBlitFramebuffer(0, 0, 624, 480, 0, 0, 624, 480, GL_COLOR_BUFFER_BIT, GL_NEAREST);

    // Select second buffer
    glBindFramebuffer(GL_FRAMEBUFFER, fxaaFBO);

    // Select second shader
    fxaaShader.use();

    // Render screen quad with second shader (offscreen)
    glBindTexture(GL_TEXTURE_2D, fxaaFBO);
    glDrawArrays(GL_TRIANGLES, 0, 6);

    glBindFramebuffer(GL_READ_FRAMEBUFFER, fxaaFBO);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, RBO);
    glBlitFramebuffer(0, 0, 624, 480, 0, 0, 624, 480, GL_COLOR_BUFFER_BIT, GL_NEAREST);

    // Select/use screen buffer
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Draw screen quad
    glBindVertexArray(quadVAO);
    glBindTexture(GL_TEXTURE_2D, RBO);
    glDrawArrays(GL_TRIANGLES, 0, 6);
}

The buffer chain seems to work and the quad is rendered using all shaders, though I’m pretty sure I’m doing something wrong in the first two “render screen quad” steps. Artifacts still occur, showing the quad’s vertices. Maybe it’s somehow drawing multiple screen quads?
Shouldn’t the first two “glDrawArrays(GL_TRIANGLES, 0, 6);” calls render offscreen?

There is no way that both “glBindFramebuffer(GL_FRAMEBUFFER, fxaaFBO);” and “glBindTexture(GL_TEXTURE_2D, fxaaFBO);” can work. Either fxaaFBO is a framebuffer object or it is a texture object.

In fact, I see several places where you are binding something you think is an “FBO” as a texture. You can’t do that; FBOs and textures are not the same thing. You attach textures to FBOs, but they are ultimately separate objects.
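A minimal setup sketch that makes the distinction concrete (all names here are placeholders; GL_RGBA16F is a typical choice for an HDR color attachment): the FBO and the texture get separate ids from separate glGen* calls, and glFramebufferTexture2D is what ties them together.

```cpp
GLuint hdrFBO = 0, hdrColorTex = 0;

glGenFramebuffers(1, &hdrFBO);   // framebuffer object: a container of attachments
glGenTextures(1, &hdrColorTex);  // texture object: the actual image storage

glBindTexture(GL_TEXTURE_2D, hdrColorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, 624, 480, 0, GL_RGBA, GL_FLOAT, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, hdrColorTex, 0); // attach the texture to the FBO

// Later: glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO) to render INTO the attachment,
// glBindTexture(GL_TEXTURE_2D, hdrColorTex) to sample FROM it in the next pass.
```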

Finally, it’s working!
I had to bind textures with glBindTexture (instead of FBO ids) and clear the depth buffer after rendering the scene.
Everything works and the artifacts have disappeared.
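For anyone finding this thread later, the loop that ended up working looks roughly like this. hdrColorTex and fxaaColorTex are my placeholder names for the color textures attached to hdrFBO and fxaaFBO; disabling the depth test before the quad passes serves the same purpose as clearing the depth buffer.

```cpp
// Pass 1: render the scene into hdrFBO
glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_DEPTH_TEST);
DrawScene();
glDisable(GL_DEPTH_TEST); // the fullscreen quads must not depth-test against the scene

// Pass 2: HDR tone mapping, hdrColorTex -> fxaaFBO
glBindFramebuffer(GL_FRAMEBUFFER, fxaaFBO);
glClear(GL_COLOR_BUFFER_BIT);
hdrShader.use();
glBindVertexArray(quadVAO);
glBindTexture(GL_TEXTURE_2D, hdrColorTex); // the texture, not the FBO id
glDrawArrays(GL_TRIANGLES, 0, 6);

// Pass 3: FXAA, fxaaColorTex -> default framebuffer (the screen)
glBindFramebuffer(GL_FRAMEBUFFER, 0);
fxaaShader.use();
glBindTexture(GL_TEXTURE_2D, fxaaColorTex);
glDrawArrays(GL_TRIANGLES, 0, 6);
```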

Thanks for explaining shaders and FBOs.