Depth minmax?

My scenes have an extreme depth dynamic range. You can see objects a couple of centimeters away and vast terrain hundreds of miles off at the same time. I’m trying every trick in the book (depth sorting, glDepthRange, depth clamping) to maximize my depth resolution, but it’s just never enough.
It would be very handy if I could query the minimum and maximum Z of the fragments that have been rendered. It would be nice if it worked like occlusion queries, i.e. I could create multiple queries. I could then use that information to set the near plane for the next frame, to maximize my depth buffer resolution.
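In the absence of such a query, one (expensive) workaround is to read the depth buffer back with `glReadPixels(…, GL_DEPTH_COMPONENT, GL_FLOAT, …)` and scan it on the CPU. A minimal sketch of the scan, assuming the readback has already happened; the function name and the clear-value check are illustrative:

```c
#include <stddef.h>

/* Scan a depth buffer (values in [0,1], as returned by glReadPixels
 * with GL_DEPTH_COMPONENT / GL_FLOAT) for the nearest and farthest
 * fragments actually written.  Pixels still at the clear value (1.0)
 * are skipped so that empty sky does not drag the maximum out. */
static void depth_minmax(const float *depth, size_t count,
                         float *out_min, float *out_max)
{
    float mn = 1.0f, mx = 0.0f;
    for (size_t i = 0; i < count; ++i) {
        float d = depth[i];
        if (d >= 1.0f)
            continue;          /* untouched / cleared pixel */
        if (d < mn) mn = d;
        if (d > mx) mx = d;
    }
    *out_min = mn;
    *out_max = mx;
}
```

The readback stalls the pipeline, so in practice you would do it asynchronously (e.g. via a PBO) and accept one or two frames of latency, which is fine for steering the next frame’s near plane.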

I know you’re not asking for advice, and I’m not sure if this helps you, but what I’ve done in the past to work around problems like this is render one scene far away and one scene up close. Essentially, there are two cameras: camera one has a near plane of 1000 and an infinite far distance, and camera two has a near plane of about 0.5 and a far distance of just over 1000. By rendering the scene with camera one and then with camera two, you get quite a bit more depth precision. This can be extended to any number of cameras, but be aware that problems with transparency will arise; I solved those by getting creative with the stencil buffer.
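To see why the split buys precision, here is a small model (not from the post) of what the two passes do to depth values: each camera’s standard perspective depth is remapped by `glDepthRange` into its own half of the window-space depth range, so near-pass fragments always depth-test in front of far-pass fragments:

```c
/* Window-space depth of an eye-space distance z under a standard
 * perspective projection with [near_p, far_p], remapped into the
 * glDepthRange window [range_lo, range_hi].  The actual rendering
 * would bracket each pass with glDepthRange(range_lo, range_hi)
 * and the matching projection matrix; this just models the math. */
static float window_depth(float z, float near_p, float far_p,
                          float range_lo, float range_hi)
{
    /* Standard perspective depth: NDC z in [-1, 1]. */
    float ndc = (far_p + near_p) / (far_p - near_p)
              - 2.0f * far_p * near_p / ((far_p - near_p) * z);
    float d01 = 0.5f * ndc + 0.5f;               /* map to [0, 1] */
    return range_lo + d01 * (range_hi - range_lo);
}
```

With the numbers above, the far pass uses `window_depth(z, 1000, far, 0.5, 1.0)` and the near pass `window_depth(z, 0.5, 1001, 0.0, 0.5)`: each pass spends the full precision of its half of the depth buffer on its own slice of the scene.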

  • Kevin B

Yes, this is a good solution, and we already use it (hence the use of DepthRange), but it still doesn’t help us place the near plane for the “nearest scene”. Usually we push the near plane out to the nearest bounding sphere, but what do you do when the camera is inside the nearest bounding sphere? Currently, we just set the near plane “close enough”, enable depth clamping, and pray. :slight_smile: If we had a depth min/max query, we could halve the near distance until all Z values were above some threshold.
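The halving idea sketched as a per-frame feedback loop; the threshold values here are illustrative assumptions, not from the post:

```c
/* Hypothetical adaptive near plane: given the minimum window-space
 * depth observed last frame (from a depth min/max query or a
 * depth-buffer readback), adjust the near distance for the next
 * frame.  Thresholds are placeholders for tuning. */
static float adapt_near(float near_p, float min_depth_seen)
{
    if (min_depth_seen < 0.1f)       /* geometry hugging the near plane */
        return near_p * 0.5f;        /* halve, as described above */
    if (min_depth_seen > 0.5f)       /* precision being wasted up front */
        return near_p * 2.0f;        /* push the near plane back out */
    return near_p;                   /* comfortable margin: leave it */
}
```

With depth clamping still enabled, a frame where the near plane is momentarily too far out degrades gracefully instead of clipping, so the loop can converge over a few frames.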