Smooth shading triangles

I have a problem with smooth shading when using glBegin(GL_TRIANGLES);

I am just drawing two triangles that make up a plane. The vertices
of both triangles are listed CCW, and I call glVertex3f and glNormal3f
for each vertex with the correct position and normal. This works
fine when doing flat shading.

However, when I use smooth shading, there is a hard edge between
the two triangles (i.e. the diagonal in the plane). Why aren’t
these two triangles being shaded together? It seems they are being
shaded by themselves.

When I copy the above plane into a vertex buffer and draw it
with glDrawElements, the entire plane is smooth shaded correctly.

Why does glDrawElements work and glBegin(GL_TRIANGLES) doesn’t?

I really need to use glBegin(GL_TRIANGLES) instead.

Are you using the same indices, in the index array, for the two vertices that the triangles share?
If so, you might be the victim of a floating-point error. In immediate mode, the vertices of both triangles will most likely be transformed and lit separately, and that includes the vertices both triangles share, and their normals. Even if they have the same coordinates, after going through the transformations they might end up in slightly different positions, and/or their normals might point in slightly different directions.
When using a vertex array, and especially glDrawElements(), two vertices that have the same index (the shared vertices of your two triangles) will be transformed and lit only once; when the second triangle is rendered, the previous results are reused, so the same lit color is used for the shared vertices in both triangles.
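
For reference, a minimal sketch of the indexed path for a two-triangle plane, assuming client-side arrays (the coordinates and normals here are just illustrative per-vertex values):

// The four corners of the plane, each stored exactly once
GLfloat verts[12] = { -1,  1, 0,   -1, -1, 0,    1,  1, 0,    1, -1, 0 };
GLfloat norms[12] = { -0.666667f,  0.666667f, 0.333333f,
                      -0.408248f, -0.408248f, 0.816497f,
                       0.408248f,  0.408248f, 0.816497f,
                       0.666667f, -0.666667f, 0.333333f };

// Corners 1 and 2 appear in both triangles, so each is transformed and lit once
GLubyte idx[6] = { 0, 1, 2,   1, 3, 2 };

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, verts);
glNormalPointer(GL_FLOAT, 0, norms);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, idx);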

That’s the only explanation I can come up with…

No, my problem isn’t with glDrawElements.

It is with glBegin(GL_TRIANGLES).

I’ve seen many typical demos and examples that proudly show off
smooth shading by using a single triangle with its corners
colored red, green, and blue, drawn using GL_TRIANGLES.

But they never show a smooth-shaded quad using GL_TRIANGLES.
I wonder why… because it doesn’t work?

I don’t think you understood what Dodger said. When you draw something with vertex arrays, the lighting for each vertex is calculated only once, even if you use the vertex for more than one triangle. However, when you draw each triangle with glVertex3f(…) etc., the color is recalculated every time you send the vertex: when you draw it for the first triangle it gets one color, and when you draw it again for the next triangle the color has to be figured out all over again.

Since this uses floating-point arithmetic, the value isn’t precise, so each time you calculate the color for that vertex, you get a slightly different result. Likewise, each time you transform the vertex, you get a slightly different result, which means that even though the two vertices should be in the same place, they aren’t exactly in the same place (close enough that you can’t see the difference, but two different numbers go into the lighting calculations). This is why the same vertex comes out a slightly different color in two different triangles.

Now, I’m not sure why you don’t want to use vertex arrays, but if you want to draw the two triangles in immediate mode (calling glVertex3f, glNormal3f, etc.), draw them as a triangle strip, as sketched below. That might also solve the problem.
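
For example, a sketch of the same plane as a strip (the normals are illustrative per-vertex values; note that each normal is set before the vertex it belongs to):

glBegin(GL_TRIANGLE_STRIP);
glNormal3f( -0.666667f,  0.666667f, 0.333333f );  glVertex3f( -1,  1, 0 );
glNormal3f( -0.408248f, -0.408248f, 0.816497f );  glVertex3f( -1, -1, 0 );
glNormal3f(  0.408248f,  0.408248f, 0.816497f );  glVertex3f(  1,  1, 0 );
glNormal3f(  0.666667f, -0.666667f, 0.333333f );  glVertex3f(  1, -1, 0 );
glEnd();

The shared vertices are sent only once, so they can only be lit once.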

“Since this uses floating-point arithmetic, the value isn’t precise, so each time you calculate the color for that vertex, you get a slightly different result.”

That is not true. It may be using floating-point arithmetic, but IEEE floating-point computations guarantee (as does the OpenGL specification) that if you send binary-identical vertex data you will get a binary-identical result. There is error, but that error is not random.

The problem is more likely that when you draw the second triangle you aren’t using the same colors at the shared vertices as you did for the first triangle. Show us your drawing code.

OK, sorry, I should have said “may get a different result,” which would be the result of a bug in whichever chip ends up doing the calculation.

I don’t think it has to do with floating-point errors.
And I’m not complaining that the colors are wrong or off.
The shading is being done with OpenGL lighting and vertex
normals, not vertex colors. I guess the example I gave
above was a bad comparison.

I think it has to do with OpenGL not knowing that these
two triangles need to be shaded together, not by themselves.

What I am trying to say is that the shading does not cross the
edge where these two triangles meet:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

FLOAT LightAmbient[4] = { 0,0,0,1 };
FLOAT LightDiffuse[4] = { 1,1,1,1 };
FLOAT LightPosition[4] = { 0,0,1,0 };

glEnable(GL_LIGHTING);
glEnable(GL_LIGHT1);

glLightfv( GL_LIGHT1, GL_AMBIENT, LightAmbient );
glLightfv( GL_LIGHT1, GL_DIFFUSE, LightDiffuse );
glLightfv( GL_LIGHT1, GL_POSITION, LightPosition );

glEnable( GL_LIGHTING );

///////////////

glShadeModel( GL_SMOOTH );

glBegin(GL_TRIANGLES);

glVertex3f( -1,  1, 0 );  glNormal3f( -0.666667f,  0.666667f, 0.333333f ); 
glVertex3f( -1, -1, 0 );  glNormal3f( -0.408248f, -0.408248f, 0.816497f );
glVertex3f(  1,  1, 0 );  glNormal3f(  0.408248f,  0.408248f, 0.816497f );

glVertex3f( -1, -1, 0 );  glNormal3f( -0.408248f, -0.408248f, 0.816497f );
glVertex3f(  1, -1, 0 );  glNormal3f(  0.666667f, -0.666667f, 0.333333f );
glVertex3f(  1,  1, 0 );  glNormal3f(  0.408248f,  0.408248f, 0.816497f );

// etc...

glEnd();

This is supposed to be one face of a box, pointing in the +z direction.
The normal values are per-vertex normals that were pre-computed beforehand.

If I dump this data directly into a vertex buffer, it shades correctly.
But if I try to render it triangle by triangle like above, it doesn’t.

And, I really must use GL_TRIANGLES.

Yeah, of course GL doesn’t know that it’s the same vertex for two triangles. That’s because you haven’t “told” it to use the same vertex for both triangles. The way you do that in GL is with vertex arrays, triangle strips, or triangle fans.

Your main problem, though, is that the normals aren’t correct. If both triangles are facing the same direction, all the vertices should have the exact same normal. If they don’t, the lighting won’t be correct. It looks like you are averaging the normals between the different faces of the cube, which is fine if the faces are only at slightly different angles, but when they are 90 degrees from each other you’ll get some weird lighting… (see the sketch below).
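
To illustrate (a minimal sketch; the single face normal is an assumption for a perfectly flat +z face): since glNormal3f just sets the current normal state, one call can serve all six vertices:

glBegin(GL_TRIANGLES);
glNormal3f( 0, 0, 1 );      // face normal of the +z plane, reused by every vertex below
glVertex3f( -1,  1, 0 );
glVertex3f( -1, -1, 0 );
glVertex3f(  1,  1, 0 );
glVertex3f( -1, -1, 0 );
glVertex3f(  1, -1, 0 );
glVertex3f(  1,  1, 0 );
glEnd();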

If both triangles are facing the same direction, all the
vertices should have the exact same normal.

No, the data is correct; as I have said before, it renders
correctly from the vertex buffer. They are vertex normals, not
face normals.

All I want to know is whether it is possible to render a smooth-shaded
object using GL_TRIANGLES. If not, I will forget it.

Originally posted by Syslock:
glBegin(GL_TRIANGLES);

glVertex3f( -1,  1, 0 );  glNormal3f( -0.666667f,  0.666667f, 0.333333f ); 
glVertex3f( -1, -1, 0 );  glNormal3f( -0.408248f, -0.408248f, 0.816497f );
glVertex3f(  1,  1, 0 );  glNormal3f(  0.408248f,  0.408248f, 0.816497f );
glVertex3f( -1, -1, 0 );  glNormal3f( -0.408248f, -0.408248f, 0.816497f );
glVertex3f(  1, -1, 0 );  glNormal3f(  0.666667f, -0.666667f, 0.333333f );
glVertex3f(  1,  1, 0 );  glNormal3f(  0.408248f,  0.408248f, 0.816497f );


This is incorrect. You need to supply the normal before you supply the vertex.

Instead of
glVertex3f( -1, 1, 0 ); glNormal3f( -0.666667f, 0.666667f, 0.333333f );

you should have

glNormal3f( -0.666667f, 0.666667f, 0.333333f ); glVertex3f( -1, 1, 0 );

The reason this works with vertex arrays is that you just supply the pointers and OpenGL orders them itself. When you use glBegin, otherwise known as immediate mode, you have to order the information yourself: glNormal, glColor, and glTexCoord each set current state, which is captured at the moment glVertex is called, so glVertex should always come last. Specify the color, the normal, and the texture coordinate before you call glVertex.
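
Applied to the plane you posted, the corrected immediate-mode block would read:

glBegin(GL_TRIANGLES);

glNormal3f( -0.666667f,  0.666667f, 0.333333f );  glVertex3f( -1,  1, 0 );
glNormal3f( -0.408248f, -0.408248f, 0.816497f );  glVertex3f( -1, -1, 0 );
glNormal3f(  0.408248f,  0.408248f, 0.816497f );  glVertex3f(  1,  1, 0 );

glNormal3f( -0.408248f, -0.408248f, 0.816497f );  glVertex3f( -1, -1, 0 );
glNormal3f(  0.666667f, -0.666667f, 0.333333f );  glVertex3f(  1, -1, 0 );
glNormal3f(  0.408248f,  0.408248f, 0.816497f );  glVertex3f(  1,  1, 0 );

glEnd();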

Hey, that did the trick. It’s working.

Just one last question: would you happen to know whether glEdgeFlag
affects smooth shading or not?

It should not. AFAIK glEdgeFlag just tells GL which polygon edges are boundary edges, i.e. which edges get drawn when polygons are rendered as lines or points; it has no effect on shading.
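
A quick sketch of typical usage (my example, assuming polygons are being drawn in line mode): the flag set before a vertex marks whether the edge leaving that vertex is drawn, so you can hide the quad’s internal diagonal:

glPolygonMode( GL_FRONT_AND_BACK, GL_LINE );

glBegin(GL_TRIANGLES);
// first triangle: the edge from (-1,-1) to (1,1) is the diagonal
glEdgeFlag(GL_TRUE);  glVertex3f( -1,  1, 0 );
glEdgeFlag(GL_FALSE); glVertex3f( -1, -1, 0 );   // hide the edge leaving this vertex
glEdgeFlag(GL_TRUE);  glVertex3f(  1,  1, 0 );

// second triangle: the edge from (1,1) back to (-1,-1) is the diagonal
glEdgeFlag(GL_TRUE);  glVertex3f( -1, -1, 0 );
glEdgeFlag(GL_TRUE);  glVertex3f(  1, -1, 0 );
glEdgeFlag(GL_FALSE); glVertex3f(  1,  1, 0 );   // hide the edge leaving this vertex
glEnd();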