I’m writing a deferred shading system in WebGL.
My g-buffer is a framebuffer object with the following attachments:
- Color: floating-point texture, packing screen-space normals, colors, depth, etc.
- Depth: renderbuffer
I pack the depth into the floating-point texture because my system doesn’t support depth textures.
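For reference, a minimal sketch of such a packing shader. This is not my actual shader — the varying/uniform names (`vNormal`, `vViewDepth`, `uFar`) and the channel layout are assumptions for illustration:

```glsl
precision highp float;

varying vec3 vNormal;     // view-space normal (assumed varying)
varying float vViewDepth; // linear view-space depth (assumed varying)
uniform float uFar;       // far-plane distance, used to normalize depth

void main() {
    vec3 n = normalize(vNormal);
    // Pack normal xy and normalized linear depth into the float target.
    // n.z can be reconstructed later from n.xy for view-space normals.
    gl_FragColor = vec4(n.xy, vViewDepth / uFar, 1.0);
}
```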
After filling the g-buffer, I do the light accumulation pass onto the default framebuffer, and this works fine.
After that, however, I want to draw some transparent objects using traditional forward shading, and I want them to be occluded by the deferred-shaded objects. The problem is that my default framebuffer’s depth buffer doesn’t contain any useful information.
One solution is this: while drawing the transparent objects, sample the g-buffer in the fragment shader, compare the depth stored there with the fragment’s own depth, and discard the fragment if it fails this depth test (by setting alpha = 0 or something).
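A sketch of what that manual depth test could look like in the transparent objects’ fragment shader — again, the names (`uGBuffer`, `uViewportSize`, `vViewDepth`, `uFar`) and the assumption that normalized depth lives in the `.z` channel are hypothetical:

```glsl
precision highp float;

uniform sampler2D uGBuffer;   // g-buffer float texture, depth assumed in .z
uniform vec2 uViewportSize;   // to turn gl_FragCoord into texture coords
uniform float uFar;           // same far-plane scale used when packing
varying float vViewDepth;     // this fragment's linear view-space depth

void main() {
    vec2 uv = gl_FragCoord.xy / uViewportSize;
    float storedDepth = texture2D(uGBuffer, uv).z * uFar;
    // Discard if an opaque deferred-shaded surface is in front of us
    // (small bias to avoid z-fighting on shared edges).
    if (vViewDepth > storedDepth + 0.001) {
        discard;
    }
    gl_FragColor = vec4(1.0); // placeholder for the forward-shaded color
}
```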
This seems inefficient: we’re sampling a full vec4 from a float texture just to read one component, and it precludes any early fragment depth testing.
What I would like to do is take my g-buffer’s depth renderbuffer and “attach it” to the default framebuffer, or something to that effect.
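As far as I know, WebGL doesn’t allow attaching anything to the default framebuffer — attachment calls only work on application-created FBOs. So “something to that effect” might mean rendering the forward pass into a second FBO that shares the g-buffer’s depth renderbuffer, then getting that to the screen. A hedged sketch of the sharing part, where `gl`, `gBufferDepthRB`, `sceneColorTex` are assumed to already exist:

```javascript
// Reuse the g-buffer's existing depth renderbuffer on a second framebuffer,
// instead of allocating a new depth attachment for the forward pass.
const forwardFBO = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, forwardFBO);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, sceneColorTex, 0);
gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                           gl.RENDERBUFFER, gBufferDepthRB);
if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !== gl.FRAMEBUFFER_COMPLETE) {
  console.warn('forward FBO incomplete');
}
```

With this, the transparent pass gets hardware depth testing against the opaque geometry’s depth for free, at the cost of an extra pass to present the result to the default framebuffer.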