Depth-stencil texture gets corrupted after reading

I have a strange problem with reading from a depth-stencil texture.

I tried to add stencil testing to my already working deferred rendering engine, so I created an FBO with the following packed depth-stencil attachment.

glGenTextures(1, &depthBufferTexID);
glBindTexture(GL_TEXTURE_2D, depthBufferTexID);

// packed 24-bit depth + 8-bit stencil texture
/*1*/ glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, width, height, 0,
                   GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);

I attached the depth-stencil as a texture instead of a renderbuffer so that I can read directly from the depth texture in the lighting pass (to reconstruct the vertex position, etc…).

Actually, the only things I changed in my whole source code are the lines marked /*1*/ and /*2*/.
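
(The /*2*/ line isn't quoted above; it is the call that attaches this texture to the FBO's depth-stencil attachment point. A minimal sketch of it, assuming the FBO is already bound; with the older EXT_framebuffer_object path you would attach to GL_DEPTH_ATTACHMENT_EXT and GL_STENCIL_ATTACHMENT_EXT separately instead:)

// sketch of /*2*/: attach the same texture as the packed depth-stencil attachment
/*2*/ glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                             GL_TEXTURE_2D, depthBufferTexID, 0);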

But after reading from the depth-stencil texture (to reconstruct the vertex position in the lighting pass), the depth-stencil texture appears to get corrupted.

Any depth-test-enabled rendering performed after the depth-stencil texture is read appears to have weird holes in it; it is very hard to describe (picture below).

[picture of the corruption]

If I comment out the texture-reading pass (the deferred lighting pass on the background geometry), the subsequent rendering passes appear correct.

[picture of the correct behavior]

I don't think there is anything in the rendering process that could cause this. If I remove lines /*1*/ and /*2*/ and go back to using a depth-only attachment, everything works again.
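
(The working depth-only version looks roughly like this; the exact enums are from memory:)

// depth-only variant (works fine): no stencil bits, plain depth texture
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthBufferTexID, 0);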

Note that in the whole process I never perform a single stencil-related operation.

Has anyone ever encountered something like this?

Tested on an ATI HD 4670 with the Catalyst 10.5 driver.

Thanks in advance.


Just tried the ATI Catalyst 10.6 driver and it is still the same: weird holes on any object rendered after the depth-stencil texture is read.

My brother's GeForce 9800 GT doesn't have this problem; any object rendered after the problematic depth-stencil texture read is drawn correctly.

I guess I could avoid this by making a copy of the depth buffer (using blitting) just for reading, or by using my own depth format, but I would like to know the cause of the problem since this is very weird.
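
(The blit workaround would look roughly like this; readFBO holds the real depth-stencil and copyFBO a second depth texture of the same size, both names hypothetical:)

// copy the depth values into a second FBO so the lighting pass can sample
// the copy while the original stays attached for depth testing
glBindFramebuffer(GL_READ_FRAMEBUFFER, readFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, copyFBO);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_DEPTH_BUFFER_BIT, GL_NEAREST);  // depth blits require GL_NEAREST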


I decided to continue by writing linear Z into a separate buffer (GL_R32F format), using that instead of the old depth-stencil texture, and attaching the depth-stencil as a renderbuffer.

This way I can use the stencil test and depth reading in an FBO without any error on ATI; the setup is sketched below.
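
(Roughly like this; linearZTexID and depthStencilRBID are just my names here, the FBO is assumed to be bound already, and the geometry-pass shader writes the linear view-space depth into the R32F target:)

// linear Z color target, written by the geometry-pass shader
glGenTextures(1, &linearZTexID);
glBindTexture(GL_TEXTURE_2D, linearZTexID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, width, height, 0,
             GL_RED, GL_FLOAT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                       GL_TEXTURE_2D, linearZTexID, 0);

// depth-stencil as a renderbuffer: used only for depth/stencil tests,
// never sampled in a shader
glGenRenderbuffers(1, &depthStencilRBID);
glBindRenderbuffer(GL_RENDERBUFFER, depthStencilRBID);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                          GL_RENDERBUFFER, depthStencilRBID);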

But I would still like to know what causes the error after reading the depth-stencil texture on ATI, since using a separate float buffer to store Z/depth is a little slower than the depth-texture solution.

We can't reproduce the problem right now.
Could you please narrow the problem down and paste the complete code for us?
Do you use lighting in your application?

Thanks for your reply; I will try to create a test application.

It might take a while, though, since I have already modified that part of the application to use the linear-Z method, and I also have a deadline for my main job.

I think this may (again) be a stupid mistake of mine. And yes, I use shaders for lighting (both the deferred and forward parts; no mixing with fixed-function code).

Thanks :slight_smile: