setup for fbo with depth/stencil renderbuffer

I'm starting to feel uncomfortable about asking 2-3 questions a day, and here's another one already ^^

I have a multipass shadow volumes algorithm and a camera attached to one of the light sources. This camera renders to an FBO (which has a depth renderbuffer attached to it), and the FBO's target texture is placed on the front wall. As you can see, the shadows aren't visible. I realized that I have to add stencil support to my FBO, and the EXT_packed_depth_stencil extension seems to be what I need. But after reading the spec I'm even more confused: in the sample code they create a texture to store the depth and stencil information, but I thought I had to store this information in a renderbuffer? Thanks!

If I understand this correctly, you can use either renderbuffers or textures, just like with a regular FBO.

OK, but I don't want a second texture; all I want to add is stencil buffer support.

Then use a renderbuffer for it :slight_smile: Just initialize it as DEPTH_STENCIL instead of DEPTH_COMPONENT.
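A minimal sketch of what that could look like, assuming an active GL context and the EXT_framebuffer_object and EXT_packed_depth_stencil extensions; `fbo`, `rb`, `width`, and `height` are placeholder names:

```c
/* Sketch: one packed depth/stencil renderbuffer attached to an FBO.
   Assumes an active GL context plus the EXT_framebuffer_object and
   EXT_packed_depth_stencil extensions; fbo/width/height are placeholders. */
GLuint rb;
glGenRenderbuffersEXT(1, &rb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH24_STENCIL8_EXT,
                         width, height);

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
/* The SAME renderbuffer is attached to both attachment points. */
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                             GL_RENDERBUFFER_EXT, rb);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT,
                             GL_RENDERBUFFER_EXT, rb);
```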

Hm, doesn't work.
OK, if I create an FBO with a depth texture, it works fine. But if I substitute:

glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT16, m_uiWidth, m_uiHeight, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, 0 );

with:

glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT, m_uiWidth, m_uiHeight, 0, GL_RGBA, GL_INT, 0 );

and:

glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, m_pkTarget->GetId(), 0 );

with:

glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, m_pkTarget->GetId(), 0 );
glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT, GL_TEXTURE_2D, m_pkTarget->GetId(), 0 );

… which is what the spec proposes, then I get a black texture. I just realized that I even get a black texture when I try to use a 24-bit depth texture instead of a 16-bit one… is it possible that my good old Radeon 9800 Pro simply can't handle more than 16 bits of depth?

You should be seeing a GL_INVALID_ENUM error from this call:

glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT, m_uiWidth, m_uiHeight, 0, GL_RGBA, GL_INT, 0 );
When the internalformat parameter is GL_DEPTH24_STENCIL8_EXT, the format must be GL_DEPTH_STENCIL_EXT (not GL_RGBA) and the type must be GL_UNSIGNED_INT_24_8_EXT.

The previous poster is absolutely right )))
See this: http://www.opengl.org/registry/specs/EXT/packed_depth_stencil.txt

Hm, OK, but in the same document they do this:

// Setup depth_stencil texture (not mipmap)
glBindTexture(GL_TEXTURE_2D, tex_depthstencil);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT, 512, 512, 0, GL_RGBA, GL_INT, NULL);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,
GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, tex_depthstencil, 0);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,
GL_STENCIL_ATTACHMENT_EXT, GL_TEXTURE_2D, tex_depthstencil, 0);

Hm, whatever - it still doesn't work :stuck_out_tongue:

That is a bug on their part; I remember there was a post on this forum saying they had made a mistake in the specification.
Try using this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT, 512, 512, 0, GL_DEPTH_STENCIL_EXT, GL_UNSIGNED_INT_24_8_EXT, NULL);

This should work anyhow.

Sorry for the confusion. Jackis is correct that this is a bug in the spec example which was fixed in a later version of the spec (version #12, dated Sept 26, 2005). Unfortunately the registry still has an older version (#11, Sept 20, 2005). I’ll make sure the registry gets updated with version #12. Jackis’ revised example matches the example in version #12.

Thank you, but this didn't help either. To repeat my question: is it possible that my good old Radeon 9800 Pro simply can't handle more than 16 bits of depth?

Did you check GL for any errors?
Is your framebuffer complete after everything you've done?
Scan GL for errors via glGetError() before and after that glTexImage2D() call; if everything is okay, check the FBO for completeness by verifying that glCheckFramebufferStatusEXT() returns GL_FRAMEBUFFER_COMPLETE_EXT.
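The two checks above could be sketched like this (assuming an active GL context and the EXT_framebuffer_object extension; the FBO is assumed to be bound already):

```c
/* Sketch: check for GL errors and FBO completeness right after setup.
   Assumes an active GL context with the currently bound FBO. */
#include <stdio.h>

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    fprintf(stderr, "GL error after texture setup: 0x%04X\n", err);

GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
if (status != GL_FRAMEBUFFER_COMPLETE_EXT)
    fprintf(stderr, "FBO incomplete, status: 0x%04X\n", status);
```

On drivers that lack packed depth/stencil support, the expected symptom is an incomplete framebuffer status such as GL_FRAMEBUFFER_UNSUPPORTED_EXT rather than a GL error.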

Unfortunately, I've only done this on NVIDIA cards; they really do support it (24-bit depth, 8-bit stencil), and I had no problems.

BTW: what exactly do you need the stencil for? As I understood it, you render a simple shadow map; you get depth, and that's all you need! Or, if you are using shadow volumes, then why do you need to place the camera at the light's position?

BTW^2: did you verify that your driver supports the EXT_packed_depth_stencil extension?

BTW^3: read this - it's Eric Lengyel's blog, all about ATI and their FBO depth/stencil support. See the very first issue, dated May '06. I don't think there has been any progress on it since then ((
http://www.terathon.com/eric/blog.html

is it possible that my good old radeon9800pro is simply not able to handle more than 16bits depth?
No. Every graphics card of note (aside from 3dfx ones) has been capable of 24-bit depth buffers ever since the TNT and TNT2.

I did a quick search and it seems to be true: ATI simply does not support this extension oO
Well, I guess this is why I've stuck with NVIDIA these last few years…

2 JeffJ

BTW, the registry copy of EXT_packed_depth_stencil is still dated revision #11 (((

Thank you for all this information! So it seems there's no way to get FBOs working with depth/stencil on an ATI card at the moment, right? It's time to switch to NVIDIA, I guess -.-

I think there is only one way to solve this problem on ATI: render your scene to the standard framebuffer (created with stencil bitplanes) and then copy the result into a texture via glCopyTexImage2D. It seems you'll have no choice if you really need a stencil buffer.
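A sketch of that fallback, assuming the window framebuffer was created with a stencil buffer and that `tex`, `width`, and `height` are placeholders for your texture object and viewport size:

```c
/* Sketch of the fallback path: render stencil shadow volumes into the
   normal window framebuffer, then copy the color result into a texture.
   Assumes a GL context whose default framebuffer has stencil bitplanes;
   tex/width/height are placeholders. */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);   /* back to the window */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

/* ... render the scene with the shadow volume passes here ... */

/* Copy the lower-left width x height region of the framebuffer
   into level 0 of the bound texture. */
glBindTexture(GL_TEXTURE_2D, tex);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 0, 0, width, height, 0);
```

The copy is an extra transfer each frame, but it avoids FBO depth/stencil attachments entirely. If the texture is already allocated, glCopyTexSubImage2D would avoid reallocating its storage every frame.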