Hi guys. I’m trying to shine a diffuse light onto an object (viewed from directly above, so the scene is essentially 2D) and save the rendered result to a file. I’ve written a prediction algorithm to determine the exact position of the light source, but it always goes off course after a while. The algorithm renders a set of test light positions, compares each render byte by byte against the original image to get a numeric difference, selects the closest-scoring point, and then refines the search within the segment that point represents.
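The byte-by-byte comparison step can be sketched as a sum of absolute differences over the two pixel buffers. This is just an illustrative sketch; the function name and signature are mine, not from your code:

```c
#include <stddef.h>

/* Numeric difference between two images stored as raw byte buffers
   (e.g. RGB8). Lower result = closer match. Hypothetical helper. */
unsigned long image_difference(const unsigned char *a,
                               const unsigned char *b,
                               size_t nbytes)
{
    unsigned long diff = 0;
    for (size_t i = 0; i < nbytes; ++i)
        diff += (a[i] > b[i]) ? (unsigned long)(a[i] - b[i])
                              : (unsigned long)(b[i] - a[i]);
    return diff;
}
```

You would call this once per test light position and keep the position with the smallest score. Note that this metric is only as discriminative as the lighting itself: if intensity barely changes with distance, many positions will score almost identically.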
So, how accurate is the lighting model employed by OpenGL? Some test points further away from the original light source score better than points that are actually closer. Also, is there some way to make the lighting vary more distinctly with distance, so that the light spans a wider range of intensities and falls off smoothly as the distance increases?
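On the fall-off question: fixed-function OpenGL attenuates positional lights by the factor 1 / (kc + kl·d + kq·d²), where d is the distance from the light to the vertex, and the coefficients are set with `glLightf` using `GL_CONSTANT_ATTENUATION`, `GL_LINEAR_ATTENUATION`, and `GL_QUADRATIC_ATTENUATION` (the default is kc = 1, kl = kq = 0, i.e. no fall-off at all, which would make your distance search nearly flat). A minimal sketch of the formula, with the GL calls shown as comments:

```c
/* OpenGL's attenuation factor for positional lights:
 *   1 / (kc + kl*d + kq*d*d)
 * In fixed-function GL the coefficients are set like this:
 *   glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION,  kc);
 *   glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION,    kl);
 *   glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, kq);
 */
float attenuation(float kc, float kl, float kq, float d)
{
    return 1.0f / (kc + kl * d + kq * d * d);
}
```

With a non-zero linear or quadratic term, intensity degrades monotonically with distance, which should make your image-difference scores vary more sharply between candidate positions.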

Thanks for any help. I’m very close to recovering the original source position, but the search always seems to deviate near the end.

OpenGL’s fixed-function pipeline uses a per-vertex lighting model. If you create a single large quad and place a light very close to its center, the quad will actually appear dark. Why? Because the light direction makes a steep angle with the normal at each of the four vertices, and those dim vertex colours are then interpolated across the face. The standard OpenGL pipeline works much better on tessellated geometry, so if you want a more accurate scene, subdivide the faces a lot. Alternatively, if you write your own lighting in a vertex/fragment program and evaluate it per pixel, you will get accurate results without having to tessellate your surfaces.