Hello all,

I’ve been trying to implement normal mapping in my custom OpenGL shader. The result I’m getting is *almost* there, but it looks like the `Point Light` is not illuminating the mesh evenly.

For context, I recently added lighting, added a `Point Light` and a rectangle with a texture to do regular lighting, and got this result:

This is the desired result: the `Point Light` (it’s the small yellow cube above the rectangle) illuminates the rectangle with the brick texture evenly.

I then wanted to add a normal map, so instead of using the vertex normals, I’d use normals from another texture. The result I’m getting is this:

The result is *almost* there: the normal map is being applied, and the bricks almost look 3D. The only problem is that one corner of the mesh remains dark, even though the light is exactly in the middle of the rectangle, so I’d expect the illumination to be even.

In case it helps, here’s a gif that shows the Point Light moving in an `8` pattern above the rectangle:

My current attempt is to take the normals from the normal map and convert them from Tangent Space to View Space (where the lighting is already computed).
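
As a side note, this conversion relies on first remapping the sampled texel from the `[0, 1]` range that textures store into the `[-1, 1]` range a direction vector needs. A quick plain-Python sketch of just that remap (illustrative only, not part of the shader):

```python
# Normal maps store vector components as colours in [0, 1];
# lighting math needs them back in [-1, 1], hence "texel * 2 - 1".
def unpack(texel):
    return tuple(c * 2.0 - 1.0 for c in texel)

# A neutral "straight up" texel (the typical light-blue of a normal map,
# roughly RGB (128, 128, 255), i.e. about (0.5, 0.5, 1.0)):
print(unpack((0.5, 0.5, 1.0)))  # (0.0, 0.0, 1.0) -> tangent-space "up"
```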

Here is the `Vertex Shader` I’m using:

```
// vertex attributes
in vec4 position;
in vec3 colour;
in vec3 normal;
in vec2 texCoord;
in vec3 tangent;
in vec3 bitangent;

// output to fragment shader
out vec4 outColour;
out vec2 outTexCoord;
out vec3 outNormal_view;
out vec3 outLightPos_view;
out vec3 outPos_view;
out mat3 outTBN;

// transformation matrices
uniform mat4 matrix_projection;
uniform mat4 matrix_view;
uniform mat4 matrix_model;

// light position in world space
uniform vec3 lightPos;

void main() {
    // vertex position in clip space
    gl_Position = matrix_projection * matrix_view * matrix_model * position;

    // convert light and vertex positions to view space and pass them to the fragment shader
    outLightPos_view = vec3(matrix_view * vec4(lightPos, 1.0));
    outPos_view = vec3(matrix_view * matrix_model * position);

    // convert vertex normals to view space and pass them to the fragment shader
    outNormal_view = mat3(transpose(inverse(matrix_view * matrix_model))) * normal;

    // TBN matrix to convert normals from tangent space to view space
    vec3 normal_view    = normalize(vec3(matrix_view * matrix_model * vec4(normal,    0.0)));
    vec3 tangent_view   = normalize(vec3(matrix_view * matrix_model * vec4(tangent,   0.0)));
    vec3 bitangent_view = normalize(vec3(matrix_view * matrix_model * vec4(bitangent, 0.0)));
    outTBN = mat3(tangent_view, bitangent_view, normal_view);

    outColour = vec4(colour, 1.0);
    outTexCoord = texCoord;
}
```

And here is the `Fragment Shader`:

```
// input from vertex shader
in vec4 outColour;
in vec2 outTexCoord;
in vec3 outNormal_view;
in vec3 outLightPos_view;
in vec3 outPos_view;
in mat3 outTBN;

// texture and its normal map
uniform sampler2D image;
uniform sampler2D image_normal;

void main() {
    // sample the normal map, remap from [0, 1] to [-1, 1], then convert to view space
    vec3 norm = avdl_texture(image_normal, outTexCoord).rgb;
    norm = normalize(norm * 2.0 - 1.0);
    norm = normalize(outTBN * norm);

    // light values in view space
    vec3 lightColour = vec3(1.0, 1.0, 1.0);
    vec3 lightDir = normalize(outLightPos_view - outPos_view);

    // `outNormal_view` uses the vertex normals and works as expected
    // `norm` uses the normal map normals, and it shows the dark corner
    vec3 normal_view = outNormal_view;
    //vec3 normal_view = norm;

    float diff = max(dot(normal_view, lightDir), 0.0);
    vec3 diffuse = diff * lightColour;

    gl_FragColor = vec4(outColour.rgb * diffuse, 1.0) * texture2D(image, outTexCoord);
}
```

I’m still learning, but I can confirm that in this specific scenario the rectangle vertices have the Normal `(0, 1, 0)`, Tangent `(1, 0, 0)` and Bitangent `(0, 0, 1)`. From what I understand, this is how they should look on a flat surface that points up, but I’m not sure if I’m messing something up in the `TBN` calculations.
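
Since the question is whether these T/B/N values combine correctly, here is a quick CPU-side sanity check of the same math in plain Python (illustrative only; `tbn_mul` and `cross3` are hypothetical helpers mirroring GLSL’s column-wise `mat3(t, b, n)` constructor and `cross`):

```python
# T, B, N as given for the flat rectangle pointing up.
T = (1.0, 0.0, 0.0)  # tangent
B = (0.0, 0.0, 1.0)  # bitangent
N = (0.0, 1.0, 0.0)  # normal

def tbn_mul(t, b, n, v):
    """Multiply tangent-space vector v by mat3(t, b, n), columns as in GLSL."""
    return tuple(t[i] * v[0] + b[i] * v[1] + n[i] * v[2] for i in range(3))

def cross3(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# A "flat" normal-map texel unpacks to (0, 0, 1) in tangent space; pushing it
# through the TBN matrix should reproduce the geometric normal (0, 1, 0).
print(tbn_mul(T, B, N, (0.0, 0.0, 1.0)))  # (0.0, 1.0, 0.0)

# Handedness check: under the common convention B = cross(N, T),
# this basis comes out left-handed, since cross(N, T) = -B here.
print(cross3(N, T))  # (0.0, 0.0, -1.0)
```

So the matrix does map a flat texel onto the correct normal; note though that the basis is left-handed under the `B = cross(N, T)` convention, and whether that matters depends on how the bitangents and the normal map’s V axis were generated.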

For reference, I’ve been using the tutorial from here: LearnOpenGL - Normal Mapping - the texture and normal map were taken directly from there, for learning purposes.