Displacement Maps and Lighting

So implementing displacement mapping via a vertex program is quite straightforward, but lighting it is proving to be tricky.

Assuming one has a working vertex shader for displacing a surface… anyone have a good plan on how to light it?

Specifically, my normal is now wrong, and I don’t have any way to fix it from within the vertex/fragment shader (as each vertex shader invocation only sees one of the three vertices that make up the triangle).

Static lighting is not an option… I need correct (or at least fairly close) normals for the rest of my fragment program.

Bump-mapping methods (i.e. perturbing the normal based on the rate of change of the displacement map) seem to work well as long as the displacement is small and gradual. However, once you move the surface a significant amount (especially if you do it sharply, like a faceted surface), the bump-mapping techniques perform poorly.

I guess what I am really looking for is a hack to calculate a new surface normal inside one of my shaders. Has anyone read anything interesting on this?

What is the formula for your displacement function?

If it is mathematically well described, you can evaluate the gradient function in the vertex shader and pass its normalized value as the correct normal.

There is no displacement function… it’s a user-provided texture (i.e. I have zero control over what is in it). So no luck evaluating it.

Also, I should mention that a single texture can be applied to multiple surfaces in the scene, so no surface-specific information can be stored in the texture (e.g. I can’t store, say, the normal in RGB and the displacement in A).

How do you displace your vertex?

If you displace it in the direction of your vertex normal vector by the displacement value stored in the texture, then you can try doing something like tangent-space per-pixel bump mapping, but in a per-vertex manner.

Yeah, I tried that, and it works well until you start working with faceted surfaces with larger displacements (which, unfortunately, I need to support).

I am basically using the following concept to perform the displacement…

http://www.ozone3d.net/tutorials/vertex_displacement_mapping.php
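The core of that displacement concept can be sketched like this (a CPU-side sketch in the spirit of that tutorial, not its actual code; `displace_vertex` is a name of my own choosing): sample a height from the displacement map and push the vertex along its own normal.

```c
typedef struct { float x, y, z; } vec3;

/* Push a vertex along its (unit) normal by the height sampled from the
   displacement map, times a user-chosen scale -- the same math as
   "newPosition = position + normal * height * scale" in a vertex shader. */
vec3 displace_vertex(vec3 position, vec3 normal, float height, float scale) {
    float d = height * scale;
    vec3 r = { position.x + normal.x * d,
               position.y + normal.y * d,
               position.z + normal.z * d };
    return r;
}
```

Note that nothing here updates the normal: after displacement the vertex still carries its pre-displacement normal, which is exactly the lighting problem under discussion.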

Originally posted by PickleWorld:
Also, I should mention that a single texture can be applied to multiple surfaces in the scene, so no surface-specific information can be stored in the texture (e.g. I can’t store, say, the normal in RGB and the displacement in A).
You can store the normal in the texture and then follow the rules of any tangent-space normal map to make it work wherever, and on whatever, you place the texture. Your to-be-displaced mesh just needs per-vertex normals and tangent vectors. Also, it is possible to encode the full 3-vec normal in the alpha of the texture.
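The tangent-space trick above amounts to rotating the stored normal by the per-vertex TBN basis. A minimal sketch, assuming the tangent, bitangent, and normal are orthonormal (the function name is mine):

```c
typedef struct { float x, y, z; } vec3;

/* Transform a tangent-space normal (as decoded from the texture) into the
   surface's frame: n_out = n.x * T + n.y * B + n.z * N, where T, B, N are
   the mesh's per-vertex tangent, bitangent, and normal (assumed orthonormal).
   This is what makes one normal map reusable across different surfaces. */
vec3 tangent_to_surface(vec3 n, vec3 t, vec3 b, vec3 nrm) {
    vec3 r = { n.x * t.x + n.y * b.x + n.z * nrm.x,
               n.x * t.y + n.y * b.y + n.z * nrm.y,
               n.x * t.z + n.y * b.z + n.z * nrm.z };
    return r;
}
```

With the identity basis the texture normal passes through unchanged; on a rotated surface the same texel yields a correspondingly rotated normal.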

OK, PickleWorld, I think I’ve got your problem.

You do the displacement, and it seems to be very coarse, yes?

So, you could do the following:

  1. Implement bilinear filtering in the vertex texture fetch, as described in that paper.
  2. With bilinear filtering in place, try to increase your mesh’s tessellation; since geometry shaders are still in the future, you should take care of proper subdivision in order to get smooth results from the interpolation.

Hope that helps…

PS: By the way, some screenshots would help make the problem clear ))