glDrawElements with Vertices, Normals, UVTexCoords

Hi,

I want to render some objects using OpenGL ES 2.0 and the glDrawElements() function. The data is parsed from an .obj file into simple float arrays for vertices, normals, and texCoords,


float *vertices;
float *normals;
float *texCoords;
GLushort *indicies;

where the arrays are set up in this structure:


vertices[0] = v1.x;
vertices[1] = v1.y;
vertices[2] = v1.z;

vertices[3] = v2.x;
vertices[4] = v2.y;
vertices[5] = v2.z;

and the indicies are filled like this:

indicies[0] = vert1.x;
indicies[1] = tex1.x;
indicies[2] = norm1.x;

indicies[3] = vert1.y;
indicies[4] = tex1.y;
indicies[5] = norm1.y;

indicies[6] = vert1.z;
indicies[7] = tex1.z;
indicies[8] = norm1.z;
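
For reference, this layout comes straight from the .obj face records - a face line such as

f 1/1/1 2/2/2 3/3/3

carries a separate vertex/texcoord/normal index for every corner, and my parser copies those triples directly into indicies.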


When I try to render the scene, I get some problems with the normals and texCoords. I'm sure that there is a problem with how the indicies array is filled. For rendering I'm using the following code:

glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glEnableVertexAttribArray(ATTRIB_VERTEX);
glEnableVertexAttribArray(ATTRIB_NORMAL);

glVertexAttribPointer(ATTRIB_VERTEX, 3, GL_FLOAT, GL_FALSE, 0, vertices);
glVertexAttribPointer(ATTRIB_NORMAL, 3, GL_FLOAT, GL_FALSE, 0, normals);

glDrawElements(GL_TRIANGLES, numFaces * 3, GL_UNSIGNED_SHORT, indicies); /* count is the number of indices, 3 per triangle */

So my question is:
How should I organize the indicies array so that vertices, normals, and texCoords are drawn correctly with the glDrawElements function?

If I remember correctly, you cannot specify a separate index for each of the attributes, i.e.


indicies[0] = vert1.x;
indicies[1] = tex1.x;
indicies[2] = norm1.x;

indicies[3] = vert1.y;
indicies[4] = tex1.y;
indicies[5] = norm1.y;

indicies[6] = vert1.z;
indicies[7] = tex1.z;
indicies[8] = norm1.z;
... won't map to glDrawElements.

Instead, glDrawElements expects something like:
indicies[0] = vert1.x; // = tex1.x = norm1.x;
indicies[1] = vert1.y; // = tex1.y = norm1.y;
indicies[2] = vert1.z; // = tex1.z = norm1.z;

In other words, each vertex, texture coordinate, and normal share the SAME index.
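
A rough sketch of the usual fix (the names ObjIndex and expandObjData are made up for illustration, not from your code): walk the face corners from the .obj file, emit one expanded vertex per corner, and let the element array hold a single index per corner:

#include <OpenGLES/ES2/gl.h>  /* iOS; use <GLES2/gl2.h> on other platforms */

/* One parsed .obj face corner: separate vertex/texcoord/normal indices,
 * assumed already converted from .obj's 1-based indexing to 0-based. */
typedef struct { unsigned short v, t, n; } ObjIndex;

/* Expand multi-indexed .obj data into the single-indexed arrays that
 * glDrawElements needs. Emits one vertex per face corner; each output
 * array must hold numCorners entries. */
static void expandObjData(const float *objVerts,    /* x,y,z per .obj vertex  */
                          const float *objTex,      /* u,v  per .obj texcoord */
                          const float *objNorms,    /* x,y,z per .obj normal  */
                          const ObjIndex *corners,  /* 3 per triangular face  */
                          int numCorners,
                          float *vertices, float *texCoords, float *normals,
                          GLushort *indicies)
{
    for (int i = 0; i < numCorners; ++i) {
        const ObjIndex c = corners[i];
        vertices[i*3+0]  = objVerts[c.v*3+0];
        vertices[i*3+1]  = objVerts[c.v*3+1];
        vertices[i*3+2]  = objVerts[c.v*3+2];
        texCoords[i*2+0] = objTex[c.t*2+0];
        texCoords[i*2+1] = objTex[c.t*2+1];
        normals[i*3+0]   = objNorms[c.n*3+0];
        normals[i*3+1]   = objNorms[c.n*3+1];
        normals[i*3+2]   = objNorms[c.n*3+2];
        indicies[i] = (GLushort)i;  /* one shared index for all attributes */
    }
}

Duplicating every corner like this wastes a little memory; a real parser would keep a map from v/t/n triples to already-emitted vertices, so shared corners reuse the same index.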

I know it's not OpenGL ES, but you may still benefit from looking at the songho examples, section "glDrawElements()", and its code in vertexArray.zip.

Thanks for your advice.
I think that's my fault - maybe I have to change my parser.

OK, the next problem appears.
I've organised the indicies, but when I try to render a cube, the image looks flat, even though the array values are OK. I've tested another program that draws the vertices with glBufferData, and there it works fine.

glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);
	
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK); 

glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
    
glUseProgram(program);
		
glEnableVertexAttribArray(ATTRIB_VERTEX);
glEnableVertexAttribArray(ATTRIB_NORMAL);
glEnableVertexAttribArray(ATTRIB_TEXCOORD);
	
glVertexAttribPointer(ATTRIB_VERTEX, 3, GL_FLOAT, GL_FALSE, sizeof(Vector), vertices);
glVertexAttribPointer(ATTRIB_NORMAL, 3, GL_FLOAT, GL_FALSE, sizeof(Vector), normals);
glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_FALSE, sizeof(TexCoords), texCoords);
	
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_SHORT, indicies);
	
glDisableVertexAttribArray(ATTRIB_VERTEX);
glDisableVertexAttribArray(ATTRIB_NORMAL);
glDisableVertexAttribArray(ATTRIB_TEXCOORD);

glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER];


Attachment: cube.obj

What does your shader code look like (vertex shader and fragment shader)?

Also, I notice you are culling the back face. Are you being careful with the winding direction in which you define the vertex indices? Maybe just disable culling for a quick test. If you have wound your vertices in the "wrong" order, then they should all show up once you disable culling. This would mean you need to reorder your indices.
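
Something like this for the quick test:

glDisable(GL_CULL_FACE);  /* if the cube renders fully now, the winding order was the problem */

Remember that OpenGL ES defaults to glFrontFace(GL_CCW), so with GL_BACK culling, any triangle whose indices run clockwise on screen gets discarded.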

What is the actual value of sizeof(Vector), etc.?
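
Worth checking, because glVertexAttribPointer takes the stride in bytes. A sketch, assuming Vector is plain three floats (your definition may differ):

#include <assert.h>

typedef struct { float x, y, z; } Vector;  /* if it is exactly this... */

void checkVectorStride(void)
{
    /* ...the stride is 12 bytes, the same as a tightly packed float array;
       any padding in the struct would make the attribute pointers above
       step past your data. */
    assert(sizeof(Vector) == 3 * sizeof(float));
}

Passing 0 as the stride also works for tightly packed arrays and sidesteps the question entirely.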