I am just starting to use fragment programs and have no idea about the following: when computing dot products for per-pixel lighting, somewhere in the vertex/fragment processing pipeline values must be remapped between the ranges [-1…1] and [0…1]. The coordinates of normalized vectors lie in the first range, while the normal vectors coming from the normal map (and the normalization cube map) are stored in the second.

But where does this happen? Which vertex/fragment program instructions work on which ranges? Is it like ARB_texture_env_dot3, where the dot product texture combine function automatically rescales the vectors before computing the dot product? That would mean the parameters passed to the fragment program (and before that to the vertex program) would have to be rescaled into the range [0…1].

Please help me… I'm totally confused about this and cannot find any info on it.
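To make concrete what I mean by rescaling, here is the math as I understand it, sketched in Python (the helper names are my own, not from any API):

```python
def compress(n):
    # pack components from [-1, 1] into [0, 1] for storage in a texture
    return [(c + 1.0) * 0.5 for c in n]

def expand(t):
    # unpack a texel from [0, 1] back to [-1, 1] before the lighting math
    return [c * 2.0 - 1.0 for c in t]

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

# a unit normal pointing along +z, stored in a texture and recovered
stored = compress([0.0, 0.0, 1.0])   # -> [0.5, 0.5, 1.0]
normal = expand(stored)              # -> [0.0, 0.0, 1.0]
light  = [0.0, 0.0, 1.0]
print(dot3(normal, light))           # prints 1.0
```

So my question is basically: which part of the pipeline is responsible for the `expand` step, and do I have to do it myself in the fragment program?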