I’m new to OpenGL and I’m trying to write a function that calculates surface normals so I can add lighting to my program. I’ve looked at a bunch of tutorials (NeHe, gametutorials) and they all say pretty much the same thing: take two edge vectors of the triangle, compute their cross product, and normalize the result. I’ve tried this and it doesn’t seem to work correctly. Can somebody please help? I’m desperate…

Here’s my function to get the surface normal:

```c
void VectorNormal(point_t p1, point_t p2, point_t p3)
{
    point_t cross, v1, v2;
    float len;

    /* Edge vectors from p1 */
    v1[0] = p3[0] - p1[0];
    v1[1] = p3[1] - p1[1];
    v1[2] = p3[2] - p1[2];
    v2[0] = p2[0] - p1[0];
    v2[1] = p2[1] - p1[1];
    v2[2] = p2[2] - p1[2];

    /* Cross product v1 x v2 */
    cross[0] = (v1[1] * v2[2]) - (v1[2] * v2[1]);
    cross[1] = (v1[2] * v2[0]) - (v1[0] * v2[2]);
    cross[2] = (v1[0] * v2[1]) - (v1[1] * v2[0]);

    /* Normalize to unit length */
    len = (float)sqrt((cross[0] * cross[0]) +
                      (cross[1] * cross[1]) +
                      (cross[2] * cross[2]));
    cross[0] /= len;
    cross[1] /= len;
    cross[2] /= len;

    glNormal3fv(cross);
}
```

`point_t` is a typedef for an array of 3 floats. Can anyone see what’s wrong with this code? Thanks for all your help!
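For completeness, the typedef is essentially this (I’m reproducing it from memory, so treat it as approximate):

```c
/* point_t as described above: an array of 3 floats */
typedef float point_t[3];
```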