DirectX LinearizeDepth in OpenGL?

Heya all, I am trying to implement an Nvidia DirectX demo in OpenGL (and GLSL) and ran into the following call in their sample code:

m_TechniqueLinearizeD = m_Effect->GetTechniqueByName( "LinearizeDepth" );

This effect is then applied to their depth buffer before the buffer is used in the shader code. Is anyone familiar with what this effect does?

Obviously it is attempting to linearize the depth buffer in some fashion, but I need to reproduce this effect in OpenGL… has anyone seen this done before? Suggestions?

p.s. I tried Google and MSDN; neither has any useful hits for "LinearizeDepth".

While I am at it, there are also these calls…

m_TechniqueND = m_Effect->GetTechniqueByName( "ResolveND" );
m_TechniqueD = m_Effect->GetTechniqueByName( "ResolveD" );

Anyone seen these ported to OpenGL before?

Those are just "techniques" defined in an effect file (*.fx). You can name them whatever you want, so "LinearizeDepth" has nothing to do with D3D or OpenGL. Take a look at the effect file: the technique "LinearizeDepth" simply states which shader to use and which API states to set. All of that can be ported directly to OpenGL.
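For what it's worth, "linearizing" a depth buffer usually means inverting the perspective projection so each texel holds eye-space depth instead of the nonlinear hyperbolic value the hardware stores. A minimal GLSL fragment-shader sketch of that remapping might look like this (the uniform names `depthTex`, `near`, and `far` are my own placeholders, not anything from NVIDIA's demo — check their .fx file for the exact math they use):

```glsl
#version 120

uniform sampler2D depthTex; // depth buffer bound as a texture (assumed name)
uniform float near;         // camera near-plane distance (assumed uniform)
uniform float far;          // camera far-plane distance (assumed uniform)

varying vec2 texCoord;

void main()
{
    // Depth-buffer values are in [0,1]; remap to NDC z in [-1,1]
    // (OpenGL's default depth-range convention).
    float d    = texture2D(depthTex, texCoord).r;
    float zNdc = 2.0 * d - 1.0;

    // Invert the standard perspective projection to recover
    // linear eye-space depth.
    float zEye = (2.0 * near * far) / (far + near - zNdc * (far - near));

    // Normalize back to [0,1] over the view range for storage/display.
    gl_FragColor = vec4(vec3((zEye - near) / (far - near)), 1.0);
}
```

You'd render a full-screen quad with this shader into another texture, then sample that in the rest of the demo's shaders, which is presumably what the D3D technique does in a single effect pass.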

D3D's effect system really sits on top of D3D; it is not a feature of D3D itself, it is a convenience library.

The same goes for "ResolveD/ND".


Ahh much thanks Jan!!