Normal Maps

I’ve got two objects: one high-res, the other low-res. The texture can be applied to both and fits perfectly. Now I want to store n.x, n.y, and n.z in a normal map for per-pixel lighting. A simple question: how do I convert the floats from the normals into bytes for storage in the R, G, and B channels of the texture?

Is there something else I could store in the normal map’s alpha channel (something I could use for per-pixel precomputation)?

What is a good size for such a texture? I think I will have to interpolate for the texels that don’t correspond to a vertex in the high-res object. How is this done? Simple linear interpolation?

Hope you can help me. Thanks in advance.
Tom

This is really more of an algebra question.

Think about it: you want to convert a clamped floating-point value to a byte. The floats should all lie in [-1, 1], since you’ll want a unit normal vector at each point.

The float values range over [-1, 1]; the byte values should range over [0, 255]. So, mathematically, we want a linear function f(x) with f(-1) = 0 and f(1) = 255. Solving f(x) = ax + b for those two constraints gives a = b = 127.5, i.e. f(x) = 255(x + 1)/2. A simple way to approximate this (by no means the only way) is:

f(x) = floor(127(x + 1))

…or to be more precise:

f(x) = floor(255(x + 1)/2)

Of course, if you’re dealing with “signed bytes”, as OpenGL likes to call them, you can just use…

f(x) = 127x

…for a fairly decent approximation. Which method you choose depends on what you’re working on.
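
Here’s a minimal sketch of the precise mapping in C++ (my own illustration, not code from this thread; Normal and RGB8 are made-up helper types):

#include <cmath>

struct Normal { float x, y, z; };          // made-up helper type
struct RGB8   { unsigned char r, g, b; };  // one byte per channel

// f(x) = floor(255 * (x + 1) / 2), with a defensive clamp in case
// the normal isn't perfectly unit length.
unsigned char encodeComponent(float x)
{
    if (x < -1.0f) x = -1.0f;
    if (x >  1.0f) x =  1.0f;
    return (unsigned char)std::floor(255.0f * (x + 1.0f) / 2.0f);
}

RGB8 encodeNormal(const Normal& n)
{
    RGB8 c = { encodeComponent(n.x), encodeComponent(n.y), encodeComponent(n.z) };
    return c;
}

// Decoding reverses the mapping: x = (b / 255) * 2 - 1.
float decodeComponent(unsigned char b)
{
    return (b / 255.0f) * 2.0f - 1.0f;
}

Note that with floor, only x = 1 exactly maps to 255; if that bothers you, round instead of flooring.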

Thanks for the answer. I’m using r = 1 + 127*nv.x and it is doing fine (I think I worried too much about precision loss). But the bigger problem is the interpolation for the texels that don’t have a corresponding vertex with a normal in the high-res model. On bigger triangles, linear interpolation tends to become “stairs-like” in close-ups. What kind of interpolation should I use for this problem?
The second problem I have is that I currently preview the textures and decide on their resolution by eye. Any idea how to compute the needed resolution from the vertex density and the size of the model?

Tom
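
For the interpolation problem, one common approach (a sketch of what I’d try, not necessarily what you’re doing now) is to sample the high-res surface per texel: find the triangle the texel falls in, blend the three vertex normals using the texel’s barycentric coordinates, and renormalize before quantizing to bytes. Plain linear blending yields a vector shorter than unit length away from the vertices, and quantizing that unnormalized result makes the banding worse. Vec3 and interpolateNormal are made-up names:

#include <cmath>

struct Vec3 { float x, y, z; };

// Blend three vertex normals at barycentric coordinates (u, v, w),
// with u + v + w = 1, then renormalize so the result is unit length.
Vec3 interpolateNormal(const Vec3& n0, const Vec3& n1, const Vec3& n2,
                       float u, float v, float w)
{
    Vec3 n;
    n.x = u * n0.x + v * n1.x + w * n2.x;
    n.y = u * n0.y + v * n1.y + w * n2.y;
    n.z = u * n0.z + v * n1.z + w * n2.z;
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
    return n;
}

For the resolution question, I don’t know of an exact formula; a rough rule of thumb (my assumption, nothing established) is to pick the resolution so that even the smallest high-res triangle covers a few texels, then round up to a power of two, which OpenGL requires for texture dimensions. minEdgeUV and texelsPerEdge below are hypothetical parameters:

// minEdgeUV: shortest triangle edge of the high-res mesh, measured
// in [0, 1] UV units. texelsPerEdge: how many texels that edge
// should span (a quality knob; 4 or so is a plausible starting point).
int estimateResolution(float minEdgeUV, int texelsPerEdge)
{
    int res = (int)std::ceil(texelsPerEdge / minEdgeUV);
    int p = 1;                 // round up to the next power of two,
    while (p < res) p <<= 1;   // which OpenGL expects for textures
    return p;
}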

I meant r = 127 * (nv.x+1).

Tom