I’d like to know how to set up depth renderbuffers for multiple render layers.
For a single-layer render you do something like:
glNamedRenderbufferStorage(renderbuffer, GL_DEPTH_COMPONENT32F, width, height);
glNamedFramebufferRenderbuffer(framebuffer, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, renderbuffer);
How does that look if you have multiple layers via a geometry shader?
Btw. when should you use GL_DEPTH_COMPONENT32F (float) and when GL_DEPTH_COMPONENT32 (integer)? Are the representable depth values distributed differently?
You can’t use renderbuffers with layered rendering; you need to use a layered texture (e.g. a 2D array texture) instead.
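A sketch of what that could look like, assuming the same OpenGL 4.5 DSA style as the glNamed* calls above (`width`, `height`, `layers` and `framebuffer` are placeholder variables):

```c
GLuint depthTex;
glCreateTextures(GL_TEXTURE_2D_ARRAY, 1, &depthTex);
glTextureStorage3D(depthTex, 1, GL_DEPTH_COMPONENT32F, width, height, layers);

/* Attaching the texture without naming a specific layer (i.e. with
   glNamedFramebufferTexture rather than glNamedFramebufferTextureLayer)
   makes the attachment layered; the geometry shader then routes each
   primitive to a layer by writing gl_Layer. */
glNamedFramebufferTexture(framebuffer, GL_DEPTH_ATTACHMENT, depthTex, 0);
```

Any colour attachments have to be layered as well (array textures with the same layer count), or the framebuffer will be incomplete with GL_FRAMEBUFFER_INCOMPLETE_LAYER_TARGETS.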
> Btw. when should you use GL_DEPTH_COMPONENT32F (float) and GL_DEPTH_COMPONENT32 (int)? Is the internal set of numbers distributed in a different way?
The first one stores depth values as 32-bit floating-point, the second one as 32-bit unsigned normalised integers (i.e. a depth of 0.0 is stored as 0 while a depth of 1.0 is stored as 2^32-1).
If you construct the projection transformation so that depth is 1.0 at the near plane and 0.0 at the far plane (“reversed Z”), the loss of depth precision with distance is offset by the increase in floating-point resolution close to zero. Also, the NV_depth_buffer_float extension allows depth values outside of the range 0.0 to 1.0.
Thank you for the quick answer.
I also found the following page from NVIDIA; it may enlighten other people having similar problems.