Lighting problem with .3ds files

Hi,
I have a problem with my models: they aren't lit completely. Actually, the model I am testing is lit correctly except on several parts. I have checked some of the normals I compute and they are right.
Here is what my program does:

- I compute the weights for the normals
- I compute the normals
- I multiply the vertices by the transformation matrix
- I multiply the computed normals by the 3x3 part of the transformation matrix
- I swap z and -y for the normals and the vertices (see the sketch just below)
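
For the last step, this is roughly what I do (a sketch with a hypothetical helper; it assumes the vertices and normals are flat x,y,z float arrays like in the code below; 3ds is z-up, OpenGL is y-up):

 // Swap from z-up (3ds) to y-up (OpenGL): (x, y, z) -> (x, z, -y).
// data is a flat array of x,y,z triplets (vertices or normals).
void loader3ds::swapZUpToYUp(float *data, unsigned short count)
{
	for(unsigned short i=0 ; i<count ; i++)
	{
		float y = data[i*3+1] ;
		float z = data[i*3+2] ;
		data[i*3+1] = z ;	// new y = old z
		data[i*3+2] = -y ;	// new z = -old y
	}
}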

When I multiply the vertices by the transformation matrix, my model appears in a place where it is not supposed to appear (farther away with one of my model files).

Here is the code for the multiplication by the transformation matrix:

 void loader3ds::multiplyMatrixVertices(float *tMatrix, float *vertices, unsigned short *nbVertices)
{
	float verticesTemp[MAX_VERTICES*3] ;

	// tMatrix is assumed to hold the 12 floats of the .3ds mesh matrix:
	// a 3x3 rotation part (row stride 3) followed by the translation at
	// indices 9..11. The 4th coordinate of a point is 1.
	for(int i=0 ; i < *nbVertices ; i++)
	{
		for(int j=0 ; j<3 ; j++)
		{
			verticesTemp[i*3+j] = tMatrix[j*3+0] * vertices[i*3+0] +
					      tMatrix[j*3+1] * vertices[i*3+1] +
					      tMatrix[j*3+2] * vertices[i*3+2] +
					      tMatrix[9+j] * 1 ;
		}
	}

	for(int i=0 ; i < *nbVertices*3 ; i++)
	{
		vertices[i] = verticesTemp[i] ;
	}
}

void loader3ds::multiplyMatrixNormals(float *tMatrix, float *normals, unsigned short *nbVertices)
{
	float normalsTemp[MAX_VERTICES*3] ;

	// Only the 3x3 rotation part (indices 0..8) is applied, no translation.
	// Note: if the matrix contains scaling, the normals have to be
	// renormalised afterwards.
	for(int i=0 ; i < *nbVertices ; i++)
	{
		for(int j=0 ; j<3 ; j++)
		{
			normalsTemp[i*3+j] = tMatrix[j*3+0] * normals[i*3+0] +
					     tMatrix[j*3+1] * normals[i*3+1] +
					     tMatrix[j*3+2] * normals[i*3+2] ;
		}
	}

	for(int i=0 ; i < *nbVertices*3 ; i++)
	{
		normals[i] = normalsTemp[i] ;
	}
}
 

The part which computes the per-vertex normals:

 // nIndice is the index of the normal, from 0 to nbVertices-1
for(nIndice=0 ; nIndice < myObjects[counterObj].nbVertices ; nIndice++)
{
	for(i=0 ; i < myObjects[counterObj].nbPoly ; i++)//polygon per polygon
	{
		for(j=0 ; j< 3; j++)//the three vertices of each polygon
		{
			if(myObjects[counterObj].myIndices[i][j]==nIndice)
			{
				// accumulate the face normal, weighted by the angle at this vertex
				myObjects[counterObj].perVertexNormals[nIndice][0] += myObjects[counterObj].perFaceNormals[i].x*angle[i][j] ;
				myObjects[counterObj].perVertexNormals[nIndice][1] += myObjects[counterObj].perFaceNormals[i].y*angle[i][j] ;
				myObjects[counterObj].perVertexNormals[nIndice][2] += myObjects[counterObj].perFaceNormals[i].z*angle[i][j] ;
			}
		}
	}
}
 

You don’t seem to divide the vertex normals by the number of planes that use them, so you don’t take the average, you take the sum.

The effect of incorrect vertex normals would be patches that are too dark and patches that are too bright.

Either divide the vertex normals by the number of surfaces, or renormalise the normals. (The effect should be the same)
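
Something like this, for instance (just a sketch, reusing the perVertexNormals array and nbVertices from your post; needs <math.h> for sqrtf):

 // Renormalise every accumulated per-vertex normal.
for(int nIndice=0 ; nIndice < myObjects[counterObj].nbVertices ; nIndice++)
{
	float *n = myObjects[counterObj].perVertexNormals[nIndice] ;
	float len = sqrtf(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]) ;
	if(len > 0.0f)
	{
		n[0] /= len ;
		n[1] /= len ;
		n[2] /= len ;
	}
}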

Sorry, I forgot to say that I already normalise. I didn't post the code, but I do it.

I don’t understand what you’re trying to do. The simple way to load .3ds models correctly is to load just the models, not a full map or the animation.
When you simply load a model, you can get rid of the whole transformation process.

Also make sure your triangles are wound correctly and that the normals point in the right direction.
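
While debugging, something like this can help rule out winding and normal problems (a sketch, assuming the old fixed-function pipeline):

 glEnable(GL_NORMALIZE);                          // let GL renormalise the normals
glFrontFace(GL_CCW);                             // CCW is the default; try GL_CW to test the winding
glDisable(GL_CULL_FACE);                         // rule out back-face culling hiding faces
glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE); // light both sides of every face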

I have swapped the y and z coordinates of the normals, and the part which wasn't lit is lit now, except for the sides of the model I display (the lighting there is strange).
The orientation of the faces is apparently the same, so I don't understand why it works for some parts of the mesh and not for the others.
If I multiply the normals of this mesh by the transformation matrix I get the same result (in this case).
I'm a little lost, so if you can help me I would be glad.

One thing I’ve had to do when translating from my own “z is up” coordinate system to OpenGL’s is to also swap the unit vectors.

E.g. to get from my rotation matrix’ X/Y/Z unit vectors + translation to an OpenGL transformation matrix, I had to construct the OGL matrix like this:

 
             / Xx Zx Yx Tx \
OGL matrix = | Xz Zz Yz Tz |
             | Xy Zy Yy Ty |
             \ 0  0  0  1  /

X is X unit vector,
Y is Y unit vector,
Z is Z unit vector,
T is the translation vector - keep it at 0 when you convert a normal,
{vector}x is x-coordinate of vector,
{vector}y is y-coordinate etc.
 

Maybe you need to do something similar when you convert your vertices and normals.
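
In code that would be roughly this (a sketch with a hypothetical buildOglMatrix; it assumes X, Y, Z and T are float[3] in my “z is up” system, and that the result is uploaded with glLoadMatrixf/glMultMatrixf, which take column-major order):

 // Build the OpenGL matrix described above, column by column.
void buildOglMatrix(const float X[3], const float Y[3],
                    const float Z[3], const float T[3], float m[16])
{
	// column 0 = X axis, written as (Xx, Xz, Xy)
	m[0]  = X[0]; m[1]  = X[2]; m[2]  = X[1]; m[3]  = 0.0f;
	// column 1 = Z axis
	m[4]  = Z[0]; m[5]  = Z[2]; m[6]  = Z[1]; m[7]  = 0.0f;
	// column 2 = Y axis
	m[8]  = Y[0]; m[9]  = Y[2]; m[10] = Y[1]; m[11] = 0.0f;
	// column 3 = translation
	m[12] = T[0]; m[13] = T[2]; m[14] = T[1]; m[15] = 1.0f;
}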

Hi,

I have tried two implementations of the normal computation and I have the same problem. I have displayed the normals and they point towards the bottom of the screen. I get the same result whether I multiply the normals by the matrix or not.
What can I do?

PS: when I do the multiplication I just swap the y and z coordinates of the result, do you think that's right?
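
In case it helps, this is roughly how I display the normals (a sketch; it assumes the object's vertices are also stored as a flat x,y,z array, here called myObjects[counterObj].vertices):

 glDisable(GL_LIGHTING);
glBegin(GL_LINES);
for(int nIndice=0 ; nIndice < myObjects[counterObj].nbVertices ; nIndice++)
{
	float *p = &myObjects[counterObj].vertices[nIndice*3] ;
	float *n = myObjects[counterObj].perVertexNormals[nIndice] ;
	glVertex3f(p[0], p[1], p[2]) ;                 // start at the vertex
	glVertex3f(p[0]+n[0], p[1]+n[1], p[2]+n[2]) ;  // end one normal-length away
}
glEnd();
glEnable(GL_LIGHTING);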

I’m not sure - my situation is different from yours. I keep my own data in “z is up” coordinates in memory, and only convert when I issue my data to OpenGL.
(One concession I did make was this: my vector class has been declared with X,Z,Y coordinates in that order, so I can just pass my vectors to OpenGL in vertex arrays.)
And that works like a charm. But I don’t invert any coordinates.
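
Something like this, conceptually (a sketch of the idea, not my actual class):

 // Members declared in x, z, y order, so a "z is up" vector can be
// handed to OpenGL (y is up) directly, e.g. in a vertex array:
// glVertexPointer(3, GL_FLOAT, sizeof(Vec3), &vectors[0].x);
struct Vec3
{
	float x, z, y ;
};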
At an early stage I played around with an inverted coordinate, and (again when dealing with matrices) I found I had to invert the unit vector as well.

So you’d get something like:

  /  Xx  Zx -Yx \
  |  Xz  Zz -Yz |
  \ -Xy -Zy  Yy /

Yy would be inverted back.

But again, I’m not inverting right now, so I’m not sure how correct that was. It seemed to work.

I created a 3ds importer a while back and then ditched it because the format is quite ugly to read correctly… many of the docs are sketchy… Anyway, I suffered from things like weird sub-object transformations and flipped normals on some surfaces… You might want to consider another format… or using a 3rd party lib…

Sorry I can’t be of more help…