I’ve created a test program with static data: two cubes sitting next to each other, each with 12 triangles and 8 vertices/normals. The positions and normals are:
GLfloat vertices[] = {
    1.00000f, -1.00000f, -1.00000f,
    1.00000f, -1.00000f, 1.00000f,
    -1.00000f, -1.00000f, 1.00000f,
    -1.00000f, -1.00000f, -1.00000f,
    1.00000f, 1.00000f, -1.00000f,
    -1.00000f, 1.00000f, -1.00000f,
    -1.00000f, 1.00000f, 1.00000f,
    1.00000f, 1.00000f, 1.00000f,
    // second cube vertices
    -0.60000f, -0.60000f, 1.99787f,
    -0.60000f, -0.60000f, 3.19787f,
    -0.60000f, 0.60000f, 3.19787f,
    -0.60000f, 0.60000f, 1.99787f,
    0.60000f, 0.60000f, 1.99787f,
    0.60000f, -0.60000f, 1.99787f,
    0.60000f, 0.60000f, 3.19787f,
    0.60000f, -0.60000f, 3.19787f
};
GLfloat normals[] = {
    0.66667f, -0.66667f, -0.33333f,
    0.40825f, -0.40825f, 0.81650f,
    -0.66667f, -0.66667f, 0.33333f,
    -0.40825f, -0.40825f, -0.81650f,
    0.33333f, 0.66667f, -0.66667f,
    -0.81650f, 0.40825f, -0.40825f,
    -0.33333f, 0.66667f, 0.66667f,
    0.81650f, 0.40825f, 0.40825f,
    // second cube normals
    -0.81650f, -0.40825f, -0.40825f,
    -0.33333f, -0.66667f, 0.66667f,
    -0.81650f, 0.40825f, 0.40825f,
    -0.33333f, 0.66667f, -0.66667f,
    0.81650f, 0.40825f, -0.40825f,
    0.33333f, -0.66667f, -0.66667f,
    0.33333f, 0.66667f, 0.66667f,
    0.81650f, -0.40825f, 0.40825f
};
The index array is:
GLuint indices[] = {
    0, 1, 2,
    0, 2, 3,
    4, 5, 6,
    4, 6, 7,
    0, 4, 7,
    0, 7, 1,
    1, 7, 6,
    1, 6, 2,
    2, 6, 5,
    2, 5, 3,
    4, 0, 3,
    4, 3, 5,
    // second cube indices
    0, 1, 2,
    0, 2, 3,
    3, 4, 5,
    3, 5, 0,
    4, 6, 7,
    4, 7, 5,
    1, 7, 6,
    1, 6, 2,
    1, 0, 5,
    1, 5, 7,
    6, 4, 3,
    6, 3, 2
};
When drawing, I use:
#define BUFFER_OFFSET(i) ((char *)NULL + (i))
// first cube
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);
// second smaller cube in front of it
glDrawElementsBaseVertex(GL_TRIANGLES, 36, GL_UNSIGNED_INT, BUFFER_OFFSET(sizeof(GLuint) * 36), 36);
The second cube is rendered as a single triangle inside the first cube, and it doesn’t even have the size or shape of any of the second cube’s triangles…
Then I tried changing the index array so that it already contains the second set of indices with the base value added:
GLuint indices[] = {
    0, 1, 2,
    0, 2, 3,
    4, 5, 6,
    4, 6, 7,
    0, 4, 7,
    0, 7, 1,
    1, 7, 6,
    1, 6, 2,
    2, 6, 5,
    2, 5, 3,
    4, 0, 3,
    4, 3, 5,
    // second cube indices with added base value
    36 + 0, 36 + 1, 36 + 2,
    36 + 0, 36 + 2, 36 + 3,
    36 + 3, 36 + 4, 36 + 5,
    36 + 3, 36 + 5, 36 + 0,
    36 + 4, 36 + 6, 36 + 7,
    36 + 4, 36 + 7, 36 + 5,
    36 + 1, 36 + 7, 36 + 6,
    36 + 1, 36 + 6, 36 + 2,
    36 + 1, 36 + 0, 36 + 5,
    36 + 1, 36 + 5, 36 + 7,
    36 + 6, 36 + 4, 36 + 3,
    36 + 6, 36 + 3, 36 + 2
};
Then I changed the draw calls to:
// first cube
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);
// second smaller cube in front of it
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, BUFFER_OFFSET(sizeof(GLuint) * 36));
but the test application continues to render that wrong triangle…
If I remove the first cube’s data and use a single glDrawElements, the second cube renders fine, so it’s not a data problem :tired:
I can provide the full source code if needed (I’ve simplified everything to the bare minimum). I’m using GLEW and a Radeon card with the 14.4 drivers, targeting OpenGL 3.3.