Nice to be here.
I’m still pretty new to the OpenGL graphics pipeline, but comfortable enough to work with vertex and fragment shaders, FBOs, and a few rendering techniques on scenes with up to a handful of models.
I’ve gotten to a point where I’m interested in animated mesh deformation. The mesh could be animated by texture displacements, by modifiers (bend, taper, etc.), or by some other math operations.
The problem is that after I’ve done my deformation I don’t have a clear understanding of how to recalculate the normals. The methods I have tried gave an unwanted faceted look.
Two example deformation scenarios would be (both done in the VS):
- A plane displaced by an animated noise.
- A cylinder bending into a C form.
The OpenGL SuperBible doesn’t cover this topic. From all of my web searches I’ve found the term “per-vertex normals” in the Lighthouse3D articles. (Can’t include link.) The article suggests averaging the face normals of all faces that share the vertex.
It seems like a pretty good way to go. The key to this method is knowing the data of the neighbouring faces, so I looked at the geometry shader, but it has no built-in function that returns neighbour data. A few other hints around the web suggest it’s somehow possible, but I found no proper guide.
Could someone describe a bit better what I’m searching for, or maybe point me in the right direction?
While writing this I realised it might be possible to store the neighbouring face data in extra vertex attributes, but it seems a bit heavy for each vertex to carry and manipulate copies of data that already exists elsewhere in the mesh.
Thank you in advance,