Question about normals calculation


I have another question concerning the calculation of normals. In my system I am calculating normals for various models using smoothing groups.

The idea is pretty basic and standard from what I have seen from people facing similar tasks.

  1. Iterate the triangles and calculate surface normals
  2. Count how many surfaces a vertex belongs to and sum all the surface normals each vertex belongs to
  3. Normalize the sum for each vertex.
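The three steps above can be sketched roughly like this (a minimal sketch; the `Vec3` type and function names are illustrative, not from the actual system):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// indices holds 3 vertex indices per triangle.
std::vector<Vec3> computeVertexNormals(const std::vector<Vec3>& verts,
                                       const std::vector<unsigned>& indices) {
    std::vector<Vec3> normals(verts.size(), {0.0f, 0.0f, 0.0f});
    for (std::size_t t = 0; t + 2 < indices.size(); t += 3) {
        unsigned i0 = indices[t], i1 = indices[t + 1], i2 = indices[t + 2];
        // Step 1: surface normal of the triangle.
        Vec3 n = cross(verts[i1] - verts[i0], verts[i2] - verts[i0]);
        // Step 2: add it to every vertex the triangle touches.
        normals[i0] = normals[i0] + n;
        normals[i1] = normals[i1] + n;
        normals[i2] = normals[i2] + n;
    }
    // Step 3: normalize each accumulated sum.
    for (Vec3& n : normals) n = normalize(n);
    return normals;
}
```

Note that the explicit count from step 2 is not actually needed, since the final normalization removes any scale from the sum.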

This, along with a point light (calculated in the vertex shader), seems to render some test .3ds models beautifully.

A problem came up when I tried to render some glyphs. I have the vertices of each glyph (a letter) and the triangles that should be formed, but no normals. So I set about calculating them the same way I did for the other models.

The results can be seen in the screenshots below. Do you think I might have done something wrong, or is calculating normals this way not a solution that fits every model?

1 view of ‘G’

2 view of ‘G’

1 view of ‘F’

2 view of ‘F’

You can easily spot my problem, especially visible on the ‘G’ above. Some of the triangles that were created in order to triangulate the glyph’s outline are still visible and stand out really badly. This happens from most viewing angles, but from some angles, as you can see, the glyph is rendered nicely. Is this a problem with my normals calculation? Or could it be something totally different that I am missing?

I also tried a different normals calculation, quite similar to the algorithm I mentioned above, with one added difference: I only added a surface’s normal to a vertex’s normal if that vertex had not already received a contribution from that surface’s plane. That way each vertex could receive at most one normal per plane in its sum. But that did not work out either.

Thanks for your time. Any help is appreciated.

Edit: I would like to clarify something. Each glyph consists of one set of points, duplicated along the z axis, with additional triangles formed in order to connect the original set of points with its extrusion along the z axis.

The problem lies with the surface normals of these extra triangles getting added in the normals calculation. This is what produces the output from the screenshots above.

If I do not include these extra triangles (let’s call them extrusion triangles) in the normals calculation, the glyphs appear beautifully, with the exception of when they are viewed sideways, where there is only blackness. I presume this is due to a normal of 0. You can see what I mean below.

‘G’ rendered nicely since the extrusion triangles are not taken into account

‘G’ again with the same calculation but this time looking at the extrusion triangles and seeing only blackness.

Well, the problem seems to be that you’re including every vertex exactly once and treating each vertex the same across multiple polygons.
The best way is pretty much the method you described, but with one change in step 2.

instead of “sum all the surface normals each vertex belongs to”

it should be “sum all the surface normals each vertex belongs to where the dot product with the original one is greater than 0.71f”
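If I read the suggestion correctly, the modified step 2 might look something like this (a sketch; “the original one” is taken here to be the normal of the face that owns the corner, and the names are illustrative):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3  normalize(Vec3 v) {
    float l = std::sqrt(dot(v, v));
    return {v.x / l, v.y / l, v.z / l};
}

// Sum only the adjacent face normals within ~45 degrees of the face
// that owns this corner: dot of unit normals > 0.71 (~cos 45 degrees).
Vec3 cornerNormal(Vec3 ownerFaceNormal,
                  const std::vector<Vec3>& adjacentFaceNormals,
                  float cosThreshold = 0.71f) {
    Vec3 owner = normalize(ownerFaceNormal);
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (Vec3 fn : adjacentFaceNormals) {
        if (dot(owner, normalize(fn)) > cosThreshold)  // skip sharp edges
            sum = add(sum, fn);
    }
    return normalize(sum);
}
```

With this filter, an extrusion triangle meeting the front face at roughly 90 degrees would no longer pollute the front face’s vertex normals.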

Hello zeoverlord and thanks for the answer.
Indeed as you accurately point out the problem could be solved if I duplicated all the vertices and treated the “extruded” triangles as a different model with different normals. But that would require me to duplicate all the data in the VBOs for all the glyphs and I would like to avoid it if possible.

You point out a method but maybe it’s because of the lack of sleep that it is not totally clear to me. Can you help me disambiguate it a bit? You say

where the dot product with the original one is greater than 0.71f

The dot product of what with what? The dot product of the vertex with the original surface normal? By what you wrote I am not sure what to apply the dot product to.

I realize that a dot product greater than 0.71f means an angle less than 45 degrees between whatever vectors it is you meant. Could you please clarify?

In order to smooth with smoothing groups, you take the normalized sum of the contributing faces only.

Sometimes you do need to duplicate vertices, typically where you have a smoothing group edge. You can avoid this in some cases, but to keep your code simple, it is a good idea to first compute “corner” normals. By corner normal I mean you compute a normal for each corner of each face, as if the vertices were duplicated. After these have been computed, you merge all corners that have matching values, so you don’t duplicate identical vertices into the vertex buffer.

Each corner belongs to exactly one face. Each corner has a number of contributing faces, one of which is the corner’s owner face, and you can compute the angle between the normals of two faces.

If your smoothing group is defined with a max smoothing angle, you compute the corner normal as the normalized sum of face normals around the corner, but skip those faces whose angle to the owner face is larger than the max smoothing angle. The dot product is the best tool for comparing angles between normals, as it returns the cosine of the angle between normalized vectors. Thus you can use the cosine of the max smoothing angle as the threshold.

For these glyphs you probably want a max smoothing angle less than 90 degrees, and as the faces are flat 45 degrees could be a safe value.
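The merge step described above could be sketched like this (an illustrative sketch, not the actual code; exact float comparison is fine here because merged corners were computed from the same inputs):

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <tuple>
#include <vector>

struct Vec3 { float x, y, z; };

// Key packs (position, normal) so corners with matching values compare equal.
using Key = std::tuple<float, float, float, float, float, float>;

// Merge corners whose position and computed normal match exactly, so
// identical vertices are not duplicated in the vertex buffer.
void mergeCorners(const std::vector<Vec3>& cornerPos,
                  const std::vector<Vec3>& cornerNrm,
                  std::vector<Vec3>& outPos,
                  std::vector<Vec3>& outNrm,
                  std::vector<unsigned>& outIndex) {
    std::map<Key, unsigned> seen;
    for (std::size_t i = 0; i < cornerPos.size(); ++i) {
        Key k{cornerPos[i].x, cornerPos[i].y, cornerPos[i].z,
              cornerNrm[i].x, cornerNrm[i].y, cornerNrm[i].z};
        auto it = seen.find(k);
        if (it == seen.end()) {
            unsigned idx = static_cast<unsigned>(outPos.size());
            seen.emplace(k, idx);
            outPos.push_back(cornerPos[i]);
            outNrm.push_back(cornerNrm[i]);
            outIndex.push_back(idx);
        } else {
            outIndex.push_back(it->second);  // reuse the merged vertex
        }
    }
}
```

Corners on a smooth surface collapse back into shared vertices, while corners on a sharp edge (same position, different normals) stay duplicated, which is exactly the effect you want.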

if I duplicated all the vertices and treated the “extruded” triangles as a different model with different normals. But that would require me to duplicate all the data in the VBOs for all the glyphs and I would like to avoid it if possible.

Duplication of vertices (and their attributes, including vertex normals) on ‘sharp edges’ is very common: everyone does it all the time. You should only worry about this if geometry throughput becomes a bottleneck in your application.

As others mentioned, you can calculate the angle between each pair of triangle normal vectors, and if they differ by more than ~45 degrees (you may need to fine-tune this value per glyph), ignore their influence when accumulating the average for the vertex normals.

And to clarify: in some modeling software, sharp edges can be manually defined through the ‘smoothing groups’ that tksuoran mentions. Essentially this is a group identifier (often implemented as a bit flag) per triangle, which allows the algorithm to determine whether two neighboring triangles should share their vertex normals, or whether the vertices should be duplicated so that a sharp edge appears.

Using smoothing groups may be more convenient for rendering 3d characters, as it may avoid the need to tweak the angle criteria I mentioned above.

Imagine a closed cylinder primitive. The top cap may have smoothing group ID 1, the cylindrical part ID 2 and the bottom cap ID 3. When accumulating the average vertex normals, you only add up the triangle normals of the neighboring triangles that have matching IDs.
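For the cylinder example, the matching-ID accumulation could be sketched like this (illustrative names; group IDs treated as bit flags, as described above):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
struct Face { Vec3 normal; unsigned groupId; };  // groupId as bit flags

// Two faces smooth across their shared edge only if their group bits overlap.
bool sharesGroup(unsigned a, unsigned b) { return (a & b) != 0u; }

// Vertex normal at a corner of 'owner': sum the normals of neighboring
// faces that share a smoothing group with the owner, then normalize.
// 'neighbors' is assumed not to contain the owner itself.
Vec3 smoothedNormal(const Face& owner, const std::vector<Face>& neighbors) {
    Vec3 sum = owner.normal;
    for (const Face& f : neighbors)
        if (sharesGroup(owner.groupId, f.groupId))
            sum = {sum.x + f.normal.x, sum.y + f.normal.y, sum.z + f.normal.z};
    float l = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
    return {sum.x / l, sum.y / l, sum.z / l};
}
```

So a vertex on the rim of the top cap (group 1) ignores the cylindrical side faces (group 2), keeping the cap flat and the rim sharp.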

Here’s a related article I wrote on the subject that may give you some ideas:
(But you should ignore the surface area/triangle corner angle part until you’ve got the basics right.)

Furthermore, you should be able to construct smoothing groups for the extrusion part by using the glyph point info (assuming that’s available), e.g. assign a unique ID to each Bezier curve segment. And of course, the front and back sides should have their own unique IDs so that they never collide with the IDs assigned to the extruded parts.

Thanks for your insight, guys. This approach makes sense. I will get to coding a solution right now. I will code the corner, or weighting-by-angle, solution as you both refer to it, but I will also code the duplication of vertices. Special thanks to remdul for the article; it was enlightening.

I will have to see which would be more preferable, the reason being that in my system the user should be able to use lots of glyphs, even from different fonts, so duplicating each glyph’s vertices might become a bottleneck. I will have to think about that.

And remdul, I do have the glyph point data, along with everything else inside the .ttf file, available to me through a .ttf file reading library I had made in the past, so the easiest way (for the glyphs case only, of course) would be to just use that.