FBO depth buffer confusion

Hi, I want to pass the depth buffer into a shader, like a texture for example, but I'm still a bit confused about how the depth buffer (texture) works with an FBO.

First of all I found these two different approaches:

  
glGenTextures(1, &depthTextureId);
glBindTexture(GL_TEXTURE_2D, depthTextureId);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, windowWidth, windowHeight, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, 0);


glGenFramebuffersEXT(1, &fboId);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fboId);

glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, depthTextureId, 0);


and second:


glGenRenderbuffersEXT(1, &fbo_depth_renderbuffer);
// Adjust the depth renderbuffer to the right size
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, fbo_depth_renderbuffer);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, windowWidth, windowHeight);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);

// Attach the depth renderbuffer to framebuffer 0
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, fbo_depth_renderbuffer);

1.) Can someone explain to me what the difference is?

2.) When I bind an FBO that the scene should be rendered to, can I then, after the scene rendering occurs, find the depth values of that rendered scene in the texture attached to that FBO? Or is there anything special that I should set up before this rendering?

  1. In the first code snippet the depth buffer data is stored in a texture object. In the second one, it is stored in a renderbuffer object. The main difference is that you can fetch from a texture in a shader, but not from a renderbuffer.

In other words, if you want to use the depth data in a shader, that is, fetch its content or simply map the texture onto a mesh, you need to store the depth data in a texture object.
If you don’t want to use the depth buffer content for anything other than removing hidden surfaces, a renderbuffer is fine.

  2. Yes, the depth data is in the texture attached to the FBO’s depth attachment point.

The first renders to a depth texture, the second renders to a depth renderbuffer.

Depth textures can be bound to texture units / shader samplers for use in subsequent passes. Depth renderbuffers cannot (AFAIK).
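For instance, a second pass can sample the depth texture like any other texture. A minimal fragment shader sketch (the sampler name `depthTex` and the varying `uv` are assumptions; note that for a GL_DEPTH_COMPONENT texture, GL_TEXTURE_COMPARE_MODE must be GL_NONE, its default, for plain sampling to work):

```glsl
uniform sampler2D depthTex; // the depth texture from the first pass
varying vec2 uv;            // full-screen texture coordinates

void main()
{
    // Sampling a depth texture returns the window-space depth in [0,1]
    float depth = texture2D(depthTex, uv).r;
    gl_FragColor = vec4(vec3(depth), 1.0);
}
```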

2.) When I bind an FBO that the scene should be rendered to, can I then, after the scene rendering occurs, find the depth values of that rendered scene in the texture attached to that FBO? Or is there anything special that I should set up before this rendering?

Both. For the latter, just basic stuff you might guess: if you’re writing to an FBO that only has a depth buffer attachment, disable color writes and stencil writes/tests, enable depth writes/tests, clear the depth buffer, and set your viewport to your texture resolution, the usual stuff.
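Put together, the setup for such a depth-only pass might look like this (a sketch, assuming the EXT entry points from the snippets above; the glDrawBuffer/glReadBuffer calls are there because an FBO with no color attachment is otherwise reported incomplete on many implementations):

```cpp
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fboId);

// No color attachment, so tell GL we won't draw or read color
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // no color writes
glDisable(GL_STENCIL_TEST);                          // no stencil test/writes
glEnable(GL_DEPTH_TEST);                             // depth test on
glDepthMask(GL_TRUE);                                // depth writes on

glViewport(0, 0, windowWidth, windowHeight);         // match the texture size
glClear(GL_DEPTH_BUFFER_BIT);

// ... render the scene ...

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
```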

I’m finally getting somewhere :slight_smile: thanks a lot!

OK, so I’ve decided to use a depth texture instead of a depth renderbuffer. I’ve got it all set up. My texture is always the same size as my screen resolution (i.e. my viewport size), I clear my depth buffer and enable depth writes, and I don’t use the stencil buffer, so stencil writes should be turned off. However, color writes are turned on, because in one pass of my engine I draw the whole scene with some masks on it into the framebuffer, and I try to display the depth texture from this pass.

But I’m getting somewhat strange results. You can see that there are some grey pixels, but only for the torus knot that is really, really near the camera; all the other pixels are white. And I have a feeling that this is not correct. Can you tell me what could be wrong?
my scene - on the right side is part of the torus knot that is very close to the camera

depth texture from the very same look on the scene

all the other pixels are white. And I have a feeling that this is not correct

that is correct, easily verified with this code (+ move the cursor around the scene):

float depth;
glReadPixels(cursorx, cursory, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);
cout << depth << endl;

most values are going to be > 0.99 (i.e. white); to see the image darker, use something like

gl_FragColor.xyz = vec3(pow(depth, 64.0));

And to complete what zed said, push the near plane a bit further from the camera and pull the far plane as close to the camera as possible. You will get better results, especially from moving the near plane further.

Actually, depths are not stored in a linear manner inside the depth buffer (or depth textures). Yes, they vary from 0.0 to 1.0, but the distribution is non-linear, with most of the precision concentrated near the near plane. To get the linear depth you need to calculate it using:


linear_depth = (2.0 * znear) / (zfar + znear - exp_depth * (zfar - znear));

znear and zfar are the camera’s near and far plane distances. exp_depth is the depth value you read from the depth buffer.