Lighting terrain issue (diffuse and spec because of normals)

Hi,

I am creating a procedural terrain and I have some lighting issues, as you can see here: https://imgur.com/a/fadvHt9.

I have a grid of heights and a gradient function for each point, based on an interpolation function and smoothing functions. I can also render the normals, and they look correct. The problem is the way normals are interpolated for each triangle: in my case it should be bilinear interpolation. So the interpolated normal is not correct, though not far from the correct one if there are enough triangles (~16000 in this case).

Here is my fragment shader:
#version 430 core

out vec4 out_color;
  
in vec2 uv_frag; // the input variable from the vertex shader (same name and same type)  
in vec3 frag_pos;
in vec3 frag_norm;
in vec3 frag_color;

uniform vec3 frag_eye;

void main()
{
    vec3 normal = normalize(frag_norm);

    vec3 light_color = vec3(1.0f, 1.0f, 0.0f);//vec3(0.1f, 0.35f, 0.1f);
    vec3 light_pos = vec3(0.0f, 100.0f, 0.0f);

    vec3 light_dir = normalize(light_pos - frag_pos);
    vec3 view_dir = normalize(frag_eye - frag_pos);
    vec3 reflect_dir = reflect(-light_dir, normal);

    float light_strength = 100.0f;
    float ambient_weight = 100.0f;
    float diffuse_weight = 1.0f;
    float specular_weight = 0.5f;

    float ambient = ambient_weight/distance(light_pos, frag_pos);
    float diffuse = diffuse_weight * max(dot(normal, light_dir), 0.0f);
    float spec = specular_weight * pow(clamp(dot(view_dir, reflect_dir), 0.0, 1.0), 2.0);

    float color_normal_ratio = atan(normal.y / sqrt(normal.x * normal.x + normal.z * normal.z));

    color_normal_ratio = abs(color_normal_ratio);
    color_normal_ratio /= 1.57079632679;

    vec3 colorB = color_normal_ratio > 0.5 ? vec3(0.0,0.33,0.1) : vec3(0.36,0.30,0.16);

    vec3 color = light_strength*((ambient+diffuse+spec)/(ambient_weight + diffuse_weight + specular_weight))*light_color*colorB;

    out_color = vec4(color, 1.0);
} 
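To check the individual terms outside the GPU, the same lighting math can be reproduced on the CPU. Below is a minimal Python sketch using the shader's constants; the helper names (`normalize`, `reflect`, `lighting`) are mine, not part of the project:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(i, n):
    # GLSL reflect(I, N) = I - 2 * dot(N, I) * N
    d = dot(n, i)
    return tuple(ic - 2.0 * d * nc for ic, nc in zip(i, n))

def lighting(frag_pos, normal, eye,
             light_pos=(0.0, 100.0, 0.0),
             ambient_weight=100.0, diffuse_weight=1.0, specular_weight=0.5):
    # Mirrors the shader's ambient / diffuse / specular terms.
    normal = normalize(normal)
    light_dir = normalize(tuple(l - p for l, p in zip(light_pos, frag_pos)))
    view_dir = normalize(tuple(e - p for e, p in zip(eye, frag_pos)))
    reflect_dir = reflect(tuple(-c for c in light_dir), normal)
    dist = math.sqrt(sum((l - p) ** 2 for l, p in zip(light_pos, frag_pos)))
    ambient = ambient_weight / dist
    diffuse = diffuse_weight * max(dot(normal, light_dir), 0.0)
    spec = specular_weight * max(dot(view_dir, reflect_dir), 0.0) ** 2
    return ambient, diffuse, spec
```

For a fragment at the origin with an up-facing normal and the eye at the light position, this gives ambient = 1.0, diffuse = 1.0 and spec = 0.5, which makes it easy to compare against a single rendered vertex.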

The three lighting components can be seen separately here: https://imgur.com/a/JiDre2A
Both the specular and the diffuse components contribute to the problem.

My model matrix is always the identity, so there shouldn't be a problem with the transpose of the inverse of the model matrix's upper-left 3x3.

Thank you !

PS : In the pictures, the values are exaggerated compared to reality because of the missing weights.


Can I see diffuse only, weighted a bit (details are lost in your images) and with a fixed color_normal_ratio (the color borders are distracting)? Showing the vertices would be nice too (render them a second time as GL_POINTS, with a flat color). And showing the vertex shader is mandatory; there may be problems there as well.

Here is the vertex shader:
#version 430 core

layout (location = 0) in vec3 vertex;
layout (location = 1) in vec3 normal;

out vec3 frag_pos;
out vec3 frag_norm;
out vec3 frag_color;


uniform mat4 modelview;
uniform mat4 projection;


void main()
{ 
    frag_pos = vertex;
    frag_norm = normal;

    

    gl_Position = projection*modelview*vec4(frag_pos, 1.0);
}

I updated the pictures at this link: https://imgur.com/a/JiDre2A
The shaders are not cleaned up yet; some variables are unused, etc.

Am I right that the grid-like pattern is the problem?
The code seems OK; it is probably the data that is wrong. You use quads (even if you split them into triangles manually), but the hardware interpolation is done per triangle, so it can't interpolate between 4 arbitrary values. Try adding a vertex at the center of each quad and calculating its normal manually (by averaging the normals of all 4 vertices).

I am not sure I understood correctly what you said. I have a function that gives me the height for an (x, z) value and another one for the gradient. Then I create the grid for the renderer using the height function; I could also have chosen another mesh, like a spider web.

The bilinear function is only used to get the height from the 4 vertices of the containing quad in the grid (https://en.wikipedia.org/wiki/Bilinear_interpolation). But this is not the same grid as the renderer's.

The problem I have is that the edges look brightly lit.

Could you explain again, please? I didn't understand: my bilinear interpolation can interpolate between 4 values, so I don't see why a 5th point would solve this.

Sorry and thank you !

PS : https://stackoverflow.com/questions/35604947/opengl-3d-terrain-lighting-artefacts
It seems to be the same problem, except that I already normalize the normal in the fragment shader.

PS 2 : I updated https://imgur.com/a/JiDre2A and now you can see that the fragment normals are not correct (the linear interpolation into the fragment shader produces some weird results).

PS 3 : Maybe the problem is just resolution, because of the difference between linear and bilinear interpolation. I could try to interpolate the normals bilinearly in the fragment shader, but for the moment I have posted a new picture at the link above. If I reduce the scene dimensions it is as if I increased the resolution, and the resulting picture does not look so bad; what do you think?

How are you generating your vertex normals?

For a height map, you’d typically use something like:

float dx = height[y][x-1] - height[y][x+1]; // the normal leans away from the ascending slope
float dy = height[y-1][x] - height[y+1][x];
float dz = 2*grid_size;
vec3 norm = normalize(vec3(dx, dy, dz));
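The same central-difference idea can be sketched in Python for a whole height grid (a rough sketch, assuming a grid of at least 2x2; borders are clamped, z is treated as the up axis, and the function name is mine):

```python
import math

def heightmap_normals(height, grid_size=1.0):
    # Per-vertex normals via central differences, clamped at the borders.
    # height[y][x] is the terrain height; z is treated as the up axis.
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            x0, x1 = max(x - 1, 0), min(x + 1, w - 1)
            y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
            # Slopes dh/dx and dh/dy over the actual sample span.
            sx = (height[y][x1] - height[y][x0]) / ((x1 - x0) * grid_size)
            sy = (height[y1][x] - height[y0][x]) / ((y1 - y0) * grid_size)
            # Normal of the surface z = h(x, y) is (-dh/dx, -dh/dy, 1), normalized.
            length = math.sqrt(sx * sx + sy * sy + 1.0)
            row.append((-sx / length, -sy / length, 1.0 / length))
        normals.append(row)
    return normals
```

A flat grid yields straight-up normals, and a ramp rising along x yields normals leaning toward -x, which is a quick way to check the sign convention.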

Another thing which can cause artefacts is if the vertical resolution of the height map is too coarse (e.g. if the height map is stored as a monochrome image with 8 bits per pixel, meaning that you only have 256 discrete height values). If the quantisation error is significant, this will introduce visible creases along the grid lines.

Well, let A, B, C, D be the corners of the containing quad, vA, vB, vC, vD the corresponding gradient vec2s, and P = (x, z). Write nX = (P - X) . vX for each corner X.

Let
Fx(x, z) = nB - nA,
Fz(x, z) = nD - nA,
Fxz(x, z) = nA + nC - nB - nD,
Gx = B.x - A.x and Gz = D.z - A.z.

(r, s) is (x, z) translated to A: (r, s) = P - A.

The interpolation formula I use is
F(x, z) = nA + Fx(x,z) r/Gx + Fz(x,z) s/Gz + Fxz(x,z) rs/(Gx Gz)

and its gradient is
grad(F)(x, z) = vA + (vB - vA) r/Gx + Fx(x,z) . (1/Gx, 0) + (vD - vA) s/Gz + Fz(x,z) . (0, 1/Gz) + (vA + vC - vB - vD) rs/(Gx Gz) + Fxz(x,z) . (s/(Gx Gz), r/(Gx Gz)),

so if grad(F)(x, z) = g, the normal is n = normalize(-g.x, 1.0, -g.z).
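As a sanity check, those value and gradient formulas can be sketched in Python and the analytic gradient compared against finite differences. The names and the unit-cell corner layout A=(0,0), B=(Gx,0), C=(Gx,Gz), D=(0,Gz) are my assumptions:

```python
def dot2(a, b):
    return a[0] * b[0] + a[1] * b[1]

def cell_interp(p, corners, grads, Gx, Gz):
    # corners in order A=(0,0), B=(Gx,0), C=(Gx,Gz), D=(0,Gz);
    # grads are the per-corner gradient vectors vA..vD.
    A, B, C, D = corners
    vA, vB, vC, vD = grads
    nA = dot2((p[0] - A[0], p[1] - A[1]), vA)
    nB = dot2((p[0] - B[0], p[1] - B[1]), vB)
    nC = dot2((p[0] - C[0], p[1] - C[1]), vC)
    nD = dot2((p[0] - D[0], p[1] - D[1]), vD)
    r, s = p[0] - A[0], p[1] - A[1]
    Fx, Fz = nB - nA, nD - nA
    Fxz = nA + nC - nB - nD
    value = nA + Fx * r / Gx + Fz * s / Gz + Fxz * r * s / (Gx * Gz)
    # Analytic gradient: every term above depends on p, both through (r, s)
    # and through the nX dot products (whose gradients are the vX vectors).
    gx = (vA[0] + (vB[0] - vA[0]) * r / Gx + Fx / Gx
          + (vD[0] - vA[0]) * s / Gz
          + (vA[0] + vC[0] - vB[0] - vD[0]) * r * s / (Gx * Gz)
          + Fxz * s / (Gx * Gz))
    gz = (vA[1] + (vB[1] - vA[1]) * r / Gx
          + (vD[1] - vA[1]) * s / Gz + Fz / Gz
          + (vA[1] + vC[1] - vB[1] - vD[1]) * r * s / (Gx * Gz)
          + Fxz * r / (Gx * Gz))
    return value, (gx, gz)
```

Comparing (gx, gz) with central finite differences of `value` over the cell is a quick way to confirm the derivative terms were not dropped or sign-flipped.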

I use 10 octaves of Perlin noise with 2^k frequencies. In the last picture I posted there are 9 chunks (3x3), so there are around 3000 values.

I was using the method you mentioned, but I had issues at the border of the grid because a normal would change when a neighbouring chunk was generated (going from the mean of 2 triangle normals to 4), and I would have liked to avoid this discontinuity.

Also, I wanted to use the gradient function to generate other things on my terrain, like rivers.

Barycentric interpolation is what is used to compute frag_norm in the fragment shader from the frag_norm written by the vertex shader. The vertex shader is executed once per vertex, calculating frag_norm. Then, for each face (triangle), the fragment shader is executed once per pixel; the frag_norm it gets is an interpolation of the frag_norm values calculated at the triangle's vertices. The interpolation is effectively linear along triangle edges.

Note that you can disable interpolation using the flat qualifier; that won't give you what you need in this case, but it may be useful for debugging.

Now, take a look at this drawing:
(drawing: triangulation)

Triangles are what is rendered. On the left, you see how GL_QUADS are usually rendered, and how one usually splits a quad manually. But as interpolation is done on a per-triangle basis, the normal at P is the average of those at A and C; B and D are effectively ignored. That may cause a darker appearance near P.
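This can be checked numerically: at the centre of a quad split along the diagonal A-C, the barycentric weight of B inside triangle (A, B, C) is exactly zero, so the interpolated normal there is just the average of the normals at A and C. A small Python sketch (the function name and corner layout are mine):

```python
def barycentric(p, a, b, c):
    # Barycentric weights of point p in triangle (a, b, c), 2D.
    (x, y), (x1, y1), (x2, y2), (x3, y3) = p, a, b, c
    d = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / d
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / d
    return w1, w2, 1.0 - w1 - w2

# Unit quad A(0,0), B(1,0), C(1,1), D(0,1), split along the diagonal A-C.
# The quad centre P lies on that shared edge, so in triangle (A, B, C)
# the weight of B vanishes: the rasterizer only sees A and C there.
A, B, C, D = (0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)
P = (0.5, 0.5)
wA, wB, wC = barycentric(P, A, B, C)
```

Here wA = wC = 0.5 and wB = 0, so whatever normal B (or D) carries has no influence at the quad centre.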

What I suggest is adding an additional point and splitting each quad into 4 triangles instead of just 2, as you can see on the right. You will have to compute the normal for Q manually, but you will be able to do that correctly by averaging the normals of A, B, C, and D.
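A rough Python sketch of that splitting step, one quad at a time (function names and the return layout are my own choices):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def split_quad(verts, norms):
    # verts/norms: 4 corners A, B, C, D in winding order.
    # Adds a centre vertex Q whose position and normal are the averages
    # of the 4 corners, and emits 4 triangles sharing Q.
    centre = tuple(sum(v[i] for v in verts) / 4.0 for i in range(3))
    centre_n = normalize(tuple(sum(n[i] for n in norms) / 4.0 for i in range(3)))
    vertices = list(verts) + [centre]
    normals = list(norms) + [centre_n]
    q = 4  # index of the centre vertex
    triangles = [(0, 1, q), (1, 2, q), (2, 3, q), (3, 0, q)]
    return vertices, normals, triangles
```

Every triangle now touches Q, so the interpolated normal at the quad centre takes all four corner normals into account instead of only two.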

Another alternative is to compute the normal directly in the fragment shader instead of relying on interpolation (that may in fact give a much better result). But it needs more data than the fragment shader has: at least the heights at each quad vertex. Shaders can't access that directly (except perhaps the tessellation stages), but if you choose this route, my suggestion is to store the height map or a normal map (possibly at a higher resolution than your vertex grid) in a texture, and sample that in the fragment shader.

OK, so the first method is equivalent to increasing the resolution with a better distribution of points, but it does not really solve the problem. I'll try the texture method. Thank you :slight_smile: