Normal Map generation?

Hi, I'm not sure if this is advanced or not,

but basically I'm trying to find out how I can actually go about generating my own normal maps for some software I'm writing.

Every time I google it I just find NVIDIA's tools, CrazyBump, etc. I'd like to have a go at actually creating my own normal maps,

either from a high-poly model using OpenGL,

or from an RGB image.


I'm not sure what the question is, really. Do you want to know how to calculate the normals, or how to create a texture that will hold normals that you have already calculated?

Yes, you know the normal maps that you use for, well, normal mapping. The blueish-colored textures.

Well, I want to be able to create my own without using a tool, as in calculate the normal map myself.

How they are created is more my question, I guess.


You didn’t get my question. I know what a normal map is. What I meant was: do you need help with the ‘mathematics’ of generating a normal map (i.e., do you know how to calculate a normal?), or the OpenGL functions required to simply transfer the normals onto a texture?

If it’s the former, read this:

Ah right, sorry. What I am trying to understand is the algorithm or process to create a normal map texture.

If I create a mesh and set its normals, how can I then make a normal map texture for that mesh? (Maybe a.k.a. texture baking?)

So yes, maybe maths, and functions? But I'm not asking for someone to just tell me all that; I'm 100% willing to read and learn, I'm just not really “getting” how a normal map texture is generated.

For instance, when I used to make them, I would grayscale my texture in Photoshop and use NVIDIA's plugin to create a normal map from that texture. What is the plugin doing?

And is that the same process as creating them from a 3D mesh?


The grayscale-to-normal-map approach is simpler than proper normal baking from a highly detailed model.

Basically it is roughly like an edge detection filter.
For each source texel, find the difference with the next horizontal texel:

  • no difference: the normal points straight up
  • 255 units difference: maximum tilt to the left
  • -255: maximum tilt to the right

So to sum up:
normal.x = sin(arctan(-difference.x / texelscale))

Same with the vertical direction:
normal.y = sin(arctan(-difference.y / texelscale))

And of course the z part, to make the normal vector unit length:
normal.z = sqrt(1.0 - normal.x² - normal.y²)
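A minimal sketch of that filter in Python (the `texel_scale` default and the wrap-around sampling at the edges are my own assumptions, not anything the tools are guaranteed to do):

```python
import math

def height_to_normal(height, texel_scale=255.0):
    """Convert a grayscale heightmap (list of rows, values 0-255)
    into per-texel unit normals using the finite-difference scheme above."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # difference with the next horizontal / vertical texel
            # (wrapping at the edges, as for a tiling texture)
            dx = height[y][(x + 1) % w] - height[y][x]
            dy = height[(y + 1) % h][x] - height[y][x]
            nx = math.sin(math.atan(-dx / texel_scale))
            ny = math.sin(math.atan(-dy / texel_scale))
            # z chosen so the vector has unit length
            nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
            row.append((nx, ny, nz))
        normals.append(row)
    return normals

def encode_rgb(n):
    # remap each component from [-1, 1] to [0, 255]; a flat normal
    # (0, 0, 1) becomes the familiar blueish (128, 128, 255)
    return tuple(int(round((c * 0.5 + 0.5) * 255)) for c in n)
```

That last remapping is why normal maps look blue: most texels point roughly straight up, so the blue channel sits near 255.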

Doing a normal map from a high-res 3D mesh is more involved: one has to cast rays from each low-res triangle to the nearest high-res polygons, retrieve the normal there, then store it in the normal texture. I admit I don't know much about how to do that in practice.

If I create a mesh and set its normals, how can I then make a normal map texture for that mesh? (Maybe a.k.a. texture baking?)

That isn’t enough information to generate a normal map. The tools you referred to create normal maps using:

1: A high-quality mesh, with far too many vertices to use in a real-time application.

2: A low-quality version of the high-quality mesh.

3: Texture coordinates for each vertex of the low-quality mesh. This tells the tool how to find a particular texel for a point on the low-quality mesh.
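As a toy illustration of the ray-cast step only (not what the tools actually do internally, and skipping the UV lookup and tangent-space conversion entirely): for one surface point on the low-poly mesh, shoot a ray along its normal at a high-poly triangle and read back the interpolated normal at the hit point.

```python
import math

def ray_triangle(origin, direction, v0, v1, v2):
    """Moller-Trumbore ray/triangle intersection.
    Returns (t, u, v) hit distance and barycentrics, or None on a miss."""
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1],
                             a[2]*b[0]-a[0]*b[2],
                             a[0]*b[1]-a[1]*b[0])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < 1e-9:          # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    return (dot(e2, q) * inv, u, v)

def sample_normal(hit, n0, n1, n2):
    """Interpolate the high-poly vertex normals at the hit point."""
    _, u, v = hit
    w = 1.0 - u - v
    n = tuple(w*a + u*b + v*c for a, b, c in zip(n0, n1, n2))
    length = math.sqrt(sum(c*c for c in n))
    return tuple(c / length for c in n)
```

A real baker loops this over every texel of the low-poly UV layout, tests against all nearby high-poly triangles, keeps the closest hit, and encodes the sampled normal into the texture.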