A problem which has been bugging me for ages is being able to ‘see’ the depth buffer.
For a while now, my deferred rendering engine has been using a depth_stencil texture as the depth buffer. Whilst this works just fine on nVidia hardware, it has come to light that the stencil part does not work properly on the ATI Radeon 4850.
So, instead of depth_stencil textures, I now use a depth_stencil render buffer. Everything renders just fine.
I now need to ‘see’ the depth and display it (for debugging purposes) as a texture. When the depth buffer was a texture, I had difficulty displaying its contents because it was only ever black when I bound it and applied it to a 2D quad on my GUI front end.
Now that I’m using a render buffer, I am trying to use glCopyPixels (assuming NV_copy_depth_to_color exists, which it does), but that just gives black as well. I’ve tried specifying GL_COLOR as the operation, as well as GL_DEPTH_STENCIL_TO_RGBA_NV, but nothing - just black.
Has anyone else managed to get glCopyPixels to work and display the depth buffer?
One solution is to write depth into a texture attached to one of your colour attachment points when performing deferred shading. Just write the depth value into the RGB components; it will then be displayable on a screen-aligned quad.
In the MRT, one texture (GL_RGBA16F) is used to store x, y, z in the RGB channels and ‘depth’ in alpha. The ‘depth’ is:
gl_FragData[0] = vec4(ecPos4.xyz, gl_Position.z / gl_Position.w);
The trouble is that when the MRT texture is bound as a texture, the alpha channel is either all black or solid white.
Ideally, I want to use glCopyPixels to convert and ‘normalise’ the depth so I can get the scene as a nice greyscale texture.
I don’t really understand why glCopyPixels is convenient for your purpose.
Anyway, the alpha channel should be:
0.5 * (gl_Position.z/gl_Position.w) + 0.5
This way the values you write are in the [0, 1] range, matching the depth buffer. After the perspective division, z values are in the [-1, 1] range, so negative values are just displayed as black.
You can also write a shader that uses this texture and read the alpha component to display it as a grayscale in your debug display.
I made the shader alterations as suggested, so the ‘alpha’ MRT target is now 0.5 * (gl_Position.z/gl_Position.w) + 0.5
However, it still does not display a nice greyscale texture.
I don’t want to write a shader because this would be slower than the fixed-function pipeline and involves an extra rendering pass (full-screen quad). Ideally, I would like to just set some glTexEnv parameter (like GL_LUMINANCE), bind the texture, and render my GUI display element.
glCopyPixels is convenient because I don’t have to write a shader to read the depth buffer pixel by pixel as a separate pass, converting the format from depth to RGB. Rendering a full-screen quad also incurs state changes with the projection matrix, etc., and is just not an elegant substitute for glCopyPixels.
OK, I see, but I don’t know which method is more elegant, since NV_copy_depth_to_color does not seem widely supported. I agree this extension offers a very straightforward way to display the depth buffer contents.
I have no more ideas. In the end you might have to use the method I suggested, just to check whether your hardware (and especially your drivers) correctly supports the named extension, or use glReadPixels.
Anyway, I am pretty sure the shader method is as fast as glCopyPixels + NV_copy_depth_to_color, since the shader is trivial and you can render directly to your GUI with a quad - and in the end, who cares about performance in a debug view?
Answering my own post (eh?)
To use glCopyPixels you must first set up a 2D projection, since the raster position is transformed by the modelview and projection matrices exactly like a vertex.
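For anyone hitting the same wall, the setup looks roughly like this - a sketch only, assuming a current GL context, a window of size w × h, and driver support for NV_copy_depth_to_color (the variable names are mine):

```c
/* Push an identity modelview and a simple 2D projection so the
 * raster position lands at the bottom-left of the window, then
 * copy depth/stencil to the colour buffer. */
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(0.0, w, 0.0, h, -1.0, 1.0);   /* 2D projection */
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();

glRasterPos2i(0, 0);                  /* now inside the view volume */
glCopyPixels(0, 0, w, h, GL_DEPTH_STENCIL_TO_RGBA_NV);

glMatrixMode(GL_MODELVIEW);
glPopMatrix();
glMatrixMode(GL_PROJECTION);
glPopMatrix();
```

If the raster position falls outside the view volume it is marked invalid and glCopyPixels silently does nothing, which is why the result was solid black before.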
Now glCopyPixels with GL_DEPTH_STENCIL_TO_RGBA_NV works perfectly - I can now see both the depth and the stencil contents.