Drawing 2 different objects at the same time...


I’m sorry for asking such a basic question - but this has really stumped me for weeks and it’s about time I ask some experts rather than continue to bang my head against a brick wall.

I cut my teeth on OpenGL 1 many years ago, then had a gap and currently find myself writing something on OpenGL 4.1 for OS X.

I want to draw a sphere. Once I have a sphere, I want to draw “shapes” on the sphere. This is a game idea which I also hope to port to iOS.

The shapes can move over the sphere and the sphere can be rotated under the shapes.

The shapes are coloured, the sphere is textured. Here lies the problem.

I’ve managed to draw a sphere. I’ve managed to texture the sphere. What I cannot seem to then do is plot the coordinates on top of the sphere and draw coloured shapes (triangles and circles).

What I am looking for and don’t seem able to find is a simple example where two different objects are drawn at the same time, one with textures, one without. Any help would be greatly appreciated.

If you have already drawn a textured object you would have needed to call


So before you try to draw an object without textures, call


Thank you for such a prompt reply. Unfortunately that made no difference. I’ll post some code later today to demonstrate my drawing routines.

Post some pictures too.

I have changed GL_POINTS into GL_TRIANGLE_FAN, as otherwise the points are almost impossible to see. The points are supposed to be fish; they’re also supposed to be off the coast of England, but that is an entirely different problem.

For some reason, I am getting a point within the sphere - I cannot see this point in the data, so I believe it may be a knock-on effect of something else I am doing wrong.


My render method is as follows: -

glViewport(0, 0, g_gl_width, g_gl_height);
    glClearColor(0.0f, 0.5f, 0.75f, 1.0f);

    glUseProgram (sphereShader.programObject);
    glBindVertexArray (g_sphere_vao);
    // Draw sphere
    Ms = identity_mat4 ();
    mat4 T = translate(identity_mat4(), vec3( -cam_pos[0], -cam_pos[1], -cam_pos[2]));
    mat4 R = rotate_y_deg(identity_mat4(), -cam_yaw);
    mat4 view_mat = R * T;

    glUniformMatrix4fv (sphere_M_loc, 1, GL_FALSE, view_mat.m);
    glDrawArrays (GL_TRIANGLES, 0, g_sphere_point_count );
    // Draw fish
    if (showFish == YES)
    {
        glBindVertexArray (g_fish_vao);   // switch to the fish VAO before the fish draw call
        glUniformMatrix4fv (fish_M_loc, 1, GL_FALSE, view_mat.m);
        glDrawArrays (GL_TRIANGLE_FAN, 1, fishPointCount);
    }

    // Wireframe
    if (useWireframe == YES)
        glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    else
        glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);

The fish are initialised here: -

glGenVertexArrays(1, &g_fish_vao);
    glBindVertexArray (g_fish_vao);   // bind the VAO so the attribute state below is recorded in it
    GLuint fish_vbo;
    glGenBuffers(1, &fish_vbo);
    glBindBuffer (GL_ARRAY_BUFFER, fish_vbo);
    // glBufferData takes a size in bytes, not a point count
    glBufferData ( GL_ARRAY_BUFFER, sphereData.fishPointCount * 3 * sizeof(GLfloat), sphereData.fishCloudVerticies, GL_STATIC_DRAW );
    glVertexAttribPointer (0, 3, GL_FLOAT, GL_FALSE, 0, NULL);
    glEnableVertexAttribArray (0);

The vertices are currently a collection of X, Y and Z co-ordinates calculated from latitudes and longitudes just off the coast of England, converted to radians and then to X, Y, Z. The radius used is always slightly greater than that of the sphere, so the points should sit just above its surface.

The Sphere is produced here: -

void generateSphereData( float r, float thetaValue, float phiValue)
{
    int vertexIndex = 0;
    int normalsIndex = 0;
    int textureIndex = 0;
    float v1x, v1y, v1z;
    float v2x, v2y, v2z;
    float d;
    int theta, phi, x;
    float theta0, theta1;
    float phi0, phi1;

    myVertex vertexQuad[4];
    myNormals normalQuad[4];
    myTexCoords texQuad[4];

    // 6 vertices per quad (two triangles), * 3 floats for xyz (* 2 for uv).
    int sizeOfVertex = thetaValue * phiValue * 6 * 3 * sizeof(GLfloat);
    int sizeOfTextures = thetaValue * phiValue * 6 * 2 * sizeof(GLfloat);
    myVertex *vertexData  = (myVertex*)malloc( sizeOfVertex);
    myTexCoords *texData  = (myTexCoords*)malloc( sizeOfTextures);

    GLfloat PI = 3.1415926535897;
    GLfloat delta = (GLfloat)(PI / thetaValue);
    int theCount = 0;
    // theta vertical segments
    for(theta = 0; theta < thetaValue; theta++)
    {
        theta0 = theta*delta;
        theta1 = (theta+1)*delta;
        // phi horizontal segments
        for(phi = 0; phi < phiValue; phi++)
        {
            phi0 = phi*delta;
            phi1 = (phi+1)*delta;
            // Generate 4 points per quad
            vertexQuad[0].x = 1.0 - r * sin(theta0)*cos(phi0);
            vertexQuad[0].y = r * cos(theta0);
            vertexQuad[0].z = r * sin(theta0)*sin(phi0);
            texQuad[0].u = (GLfloat)phi / (GLfloat)phiValue;
            texQuad[0].v = (GLfloat)theta / (GLfloat)thetaValue;
            vertexQuad[1].x = 1.0 - r * sin(theta0)*cos(phi1);
            vertexQuad[1].y = r * cos(theta0);
            vertexQuad[1].z = r * sin(theta0)*sin(phi1);
            texQuad[1].u = (GLfloat)(phi + 1) / (GLfloat)phiValue;
            texQuad[1].v = (GLfloat)theta / (GLfloat)thetaValue;
            vertexQuad[2].x = 1.0 - r * sin(theta1)*cos(phi1);
            vertexQuad[2].y = r * cos(theta1);
            vertexQuad[2].z = r * sin(theta1)*sin(phi1);
            texQuad[2].u = (GLfloat)(phi + 1) / (GLfloat)phiValue;
            texQuad[2].v = (GLfloat)(theta + 1) / (GLfloat)thetaValue;
            vertexQuad[3].x = 1.0 - r * sin(theta1)*cos(phi0);
            vertexQuad[3].y = r * cos(theta1);
            vertexQuad[3].z = r * sin(theta1)*sin(phi0);
            texQuad[3].u = (GLfloat)phi / (GLfloat)phiValue;
            texQuad[3].v = (GLfloat)(theta + 1) / (GLfloat)thetaValue;
            // Generate normal
            if(theta >= thetaValue / 2)
            {
                v1x = vertexQuad[1].x - vertexQuad[0].x;
                v1y = vertexQuad[1].y - vertexQuad[0].y;
                v1z = vertexQuad[1].z - vertexQuad[0].z;
                v2x = vertexQuad[3].x - vertexQuad[0].x;
                v2y = vertexQuad[3].y - vertexQuad[0].y;
                v2z = vertexQuad[3].z - vertexQuad[0].z;
            }
            else
            {
                v1x = vertexQuad[0].x - vertexQuad[3].x;
                v1y = vertexQuad[0].y - vertexQuad[3].y;
                v1z = vertexQuad[0].z - vertexQuad[3].z;
                v2x = vertexQuad[2].x - vertexQuad[3].x;
                v2y = vertexQuad[2].y - vertexQuad[3].y;
                v2z = vertexQuad[2].z - vertexQuad[3].z;
            }
            normalQuad[0].nx = (v1y * v2z) - (v2y * v1z);
            normalQuad[0].ny = (v1z * v2x) - (v2z * v1x);
            normalQuad[0].nz = (v1x * v2y) - (v2x * v1y);
            d = 1.0f/sqrt(normalQuad[0].nx * normalQuad[0].nx +
                          normalQuad[0].ny * normalQuad[0].ny +
                          normalQuad[0].nz * normalQuad[0].nz);
            normalQuad[0].nx *= d;
            normalQuad[0].ny *= d;
            normalQuad[0].nz *= d;
            // Replicate the normal across the quad
            for(x = 1; x < 4; x++)
            {
                normalQuad[x].nx = normalQuad[0].nx;
                normalQuad[x].ny = normalQuad[0].ny;
                normalQuad[x].nz = normalQuad[0].nz;
            }
            // OpenGL draws triangles under the hood. The Core Profile officially drops
            // support for the GL_QUADS mode in the glDrawArrays/Elements calls,
            // so store each quad as its two constituent triangles.
            vertexData[vertexIndex++] = vertexQuad[2];
            vertexData[vertexIndex++] = vertexQuad[1];
            vertexData[vertexIndex++] = vertexQuad[0];
            vertexData[vertexIndex++] = vertexQuad[0];
            vertexData[vertexIndex++] = vertexQuad[3];
            vertexData[vertexIndex++] = vertexQuad[2];
//            normalsData[normalsIndex++] = normalQuad[0];
//            normalsData[normalsIndex++] = normalQuad[1];
//            normalsData[normalsIndex++] = normalQuad[2];
//            normalsData[normalsIndex++] = normalQuad[2];
//            normalsData[normalsIndex++] = normalQuad[3];
//            normalsData[normalsIndex++] = normalQuad[0];
            texData[textureIndex++] = texQuad[2];
            texData[textureIndex++] = texQuad[1];
            texData[textureIndex++] = texQuad[0];
            texData[textureIndex++] = texQuad[0];
            texData[textureIndex++] = texQuad[3];
            texData[textureIndex++] = texQuad[2];

            theCount += 6;
        }
    }
    g_sphere_point_count = theCount;
    glGenVertexArrays (1, &g_sphere_vao);
    glBindVertexArray (g_sphere_vao);
    GLuint points_vbo;
    glGenBuffers (1, &points_vbo);
    glBindBuffer (GL_ARRAY_BUFFER, points_vbo);
    glBufferData ( GL_ARRAY_BUFFER, sizeOfVertex, vertexData, GL_STATIC_DRAW );
    glVertexAttribPointer (0, 3, GL_FLOAT, GL_FALSE, 0, NULL);
    glEnableVertexAttribArray (0);

//    GLuint normals_vbo;
//    glGenBuffers(1, &normals_vbo);
//    glBindBuffer(GL_ARRAY_BUFFER, normals_vbo);
//    glBufferData(GL_ARRAY_BUFFER, normals_count, normalsData, GL_STATIC_DRAW);
//    glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, 0, NULL);
//    glEnableVertexAttribArray(2);
    GLuint texcoords_vbo;
    glGenBuffers(1, &texcoords_vbo);
    glBindBuffer(GL_ARRAY_BUFFER, texcoords_vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeOfTextures, texData, GL_STATIC_DRAW);
    glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, NULL);
    glEnableVertexAttribArray(1);   // without this, the texcoord array stays disabled
    // At this point the VAO is set up with two vertex attributes, each sourced from its own buffer object.

//    free(normalsData);
    free(vertexData);
    free(texData);
}

Sounds like a problem with coordinate system transformations.
Try this. Do one fish only. Put it at 0 degs lat and 0 degs lon.
Do your transformations handle it correctly?
This depends on how your earth is placed with the global coordinate system.

Let’s make some basic assumptions (which are fairly common).
The earth has a radius of 1.0. It is positioned so that +Z goes
through the north pole and +X goes through 0 degs lat and 0 degs lon.
If you set it up this way, your transformations should convert a fish
at 0 degs lat and 0 degs lon to (1.0, 0.0, 0.0). If that isn’t happening,
there’s something wrong with the transformations.

If your earth isn’t set up that way, I still suggest trying your code
with one fish at 0 degs lat and 0 degs lon. Does it get converted
to the correct x,y,z coordinates? Do you know what those coordinates should be?

Thank you Carmine - I will take a look and report back.

gl_Position = MVP * vec4(vertexPosition_modelspace,1);
gl_Position = MVP2 * vec4(vertexPosition_modelspace2,1);
That is not how the graphics pipeline works. You cannot draw two objects at the same time. Only the last write to gl_Position takes effect, and your first object will be completely ignored. In the most basic variant, you want to draw two completely independent objects, and you need two draw calls for that - as you do in your code.

However, when doing so, you do not need two different vertex attributes. Your shader just processes vertices, which in your case have only the vertexPosition_modelspace attribute. So you can use that attribute for all the objects you want to draw. There is no point in using different attributes for different objects if the attribute means the same thing.

Let’s have a look at your drawing code:

glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
Here, you set up vertex attribute 0 to point to the vertex data of the first buffer, and you enable the attribute array. So this data will now be used as the source for vertexPosition_modelspace.

glDrawArrays(GL_TRIANGLES, 0, 12*3);
Now you draw the object. But as we have already seen, your shader only really uses vertexPosition_modelspace2, for which you have neither set a pointer nor enabled the array. Since the array is disabled, the GL will use the current value of attribute 2 - for all vertices. So in the case of triangles, you create triangles whose points are all the same - triangles with a surface area of 0, which are invisible anyway, no matter what value attribute 2 currently has.
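The “current value” behaviour can be illustrated with a hypothetical snippet (not from the posted code):

```c
/* With the array for a generic attribute disabled, every vertex fetches
   the same constant "current value" instead of per-vertex data. */
glDisableVertexAttribArray(2);
glVertexAttrib3f(2, 0.5f, 0.5f, 0.5f);   /* every vertex now reads (0.5, 0.5, 0.5) */
/* If the shader takes its position from attribute 2, all three corners of
   every triangle get this identical value: zero-area, invisible triangles. */
```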

glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer2);
Now you do a strange thing: you enable the attribute 2 array, but do not set a pointer for it! You should instead re-specify the pointer for attribute 0 to point to your second model.

glDrawArrays(GL_TRIANGLES, 0, 4*3);
Now you draw with both attributes 0 and 2 enabled. Attribute 0 will contain the data you want, but is ignored by the shader. Attribute 2 just points somewhere, and you get undefined behaviour - it might crash, but it might also display strange stuff, or nothing at all.

To make this work, remove vertexPosition_modelspace2 completely from the shader, and use just one MVP matrix as well. When drawing any object, you have to:

Set the MVP uniform matrix for the object
Set the attribute pointer for attribute 0
Enable the attribute array for attribute 0 (or make sure it is already enabled)
Issue the draw call

You can do this with as many objects as you want.
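Put together, the recipe above might look like this for the sphere-plus-fish case in this thread (a sketch only - the program, VAO, texture, and uniform names are placeholders, not the actual identifiers from the code posted earlier). With core-profile VAOs, binding the VAO covers the “set pointer / enable array” steps, because that state is recorded in the VAO:

```c
/* One shader with a single MVP uniform; two independent objects. */
glUseProgram(shaderProg);

/* Object 1: the textured sphere. Binding its VAO restores the attribute
   pointers and enables recorded when the sphere was initialised. */
glUniformMatrix4fv(MVP_loc, 1, GL_FALSE, sphereMVP.m);
glBindVertexArray(sphereVAO);
glBindTexture(GL_TEXTURE_2D, earthTexture);
glDrawArrays(GL_TRIANGLES, 0, spherePointCount);

/* Object 2: the coloured fish. Same attribute 0, different VAO and a
   different uniform value - just another draw call. */
glUniformMatrix4fv(MVP_loc, 1, GL_FALSE, fishMVP.m);
glBindVertexArray(fishVAO);
glDrawArrays(GL_TRIANGLE_FAN, 0, fishPointCount);
```

If one object is textured and the other flat-coloured, you would either switch programs between the two draw calls or add a uniform flag to one shader that selects between sampling the texture and using a colour.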

The issue was definitely the coordinates - they are now working as expected.

Thank you for the recommendation - it was a great help.