Indexed Vertex Array error (GL_INVALID_ENUM)

I've been trying to draw a mesh using indexed vertex arrays. The mesh is the result of mapping each pixel of a picture onto 3D space (with z initially 0) and triangulating it CCW, starting from the upper-left corner, moving South and then North-East to complete the first triangle, and then South and West for the second. So GL_TRIANGLES must be my choice.
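Roughly, the indices are built like this (a simplified sketch, not my actual code; width and height stand for the picture dimensions and the vertices are stored row-wise):

int[] BuildGridIndices(int width, int height)
{
    int[] indices = new int[(width - 1) * (height - 1) * 6];
    int k = 0;
    for (int y = 0; y < height - 1; y++)
    {
        for (int x = 0; x < width - 1; x++)
        {
            int topLeft     = y * width + x;              // start of the quad
            int bottomLeft  = (y + 1) * width + x;        // one step South
            int topRight    = y * width + (x + 1);        // North-East of bottomLeft
            int bottomRight = (y + 1) * width + (x + 1);

            // First triangle: upper-left, South, North-East
            indices[k++] = topLeft;
            indices[k++] = bottomLeft;
            indices[k++] = topRight;

            // Second triangle: South, then West
            indices[k++] = topRight;
            indices[k++] = bottomRight;
            indices[k++] = bottomLeft;
        }
    }
    return indices;
}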

I’m using C#/Tao.
I have the following code:


try
{
    int err = Gl.GL_NO_ERROR;

    // Fill the arrays with any possible changes
    FillFaceArrays();

    // Activate the arrays
    Gl.glEnableClientState(Gl.GL_VERTEX_ARRAY);
    Gl.glEnableClientState(Gl.GL_NORMAL_ARRAY);

    // Load the arrays
    Gl.glVertexPointer(3, Gl.GL_DOUBLE, 0, Varray.ToRowWiseArray());
    Gl.glNormalPointer(Gl.GL_DOUBLE, 0, Narray.ToRowWiseArray());

    err = Gl.glGetError();
    if (err != Gl.GL_NO_ERROR) throw new Exception(Glu.gluErrorString(err));

    // Draw!!!
    Gl.glColor3ub(255, 0, 0);
    Gl.glDrawElements(Gl.GL_TRIANGLES, Iarray.Length, Gl.GL_INT, Iarray);
    err = Gl.glGetError();
    if (err != Gl.GL_NO_ERROR) throw new Exception(Glu.gluErrorString(err));

    // Deactivate the arrays
    Gl.glDisableClientState(Gl.GL_VERTEX_ARRAY);
    Gl.glDisableClientState(Gl.GL_NORMAL_ARRAY);
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
}

The FillFaceArrays() function fills Varray, Narray and Iarray with the vertex, normal and index data respectively. Varray and Narray are arrays of double and Iarray is an array of int.
After calling glDrawElements, I get a 1280 (GL_INVALID_ENUM) error which I cannot explain.

Thank you.

The type parameter in glDrawElements should be GL_UNSIGNED_INT (or GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT). It can’t be GL_INT.
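For example, keeping the rest of your call exactly as posted and changing only the enum:

Gl.glDrawElements(Gl.GL_TRIANGLES, Iarray.Length, Gl.GL_UNSIGNED_INT, Iarray);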

Thank you for the reply, dletozeun. Indeed, it worked :slight_smile: .
Since I'm new to OpenGL, can you please tell me why that is? Iarray was explicitly defined as int[], not uint[], so one would expect GL_INT to be the correct type.
Is there a rule to use unsigned types in OpenGL?

EDIT
I just read the function definition. It clearly states that type should be GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT, or GL_UNSIGNED_INT… :o Sorry.

But my “unsigned rule” question still holds.

No, there are no general rules specific to unsigned types in OpenGL. The best you can do is read the function specs.

In this particular case, there is no need to allow signed types for index data, since negative indices make no sense.

EDIT:

Allocate an array of unsigned int or unsigned short on the application side to avoid any odd signed-to-unsigned data conversion. It should not be a problem in practice, since your indices are always positive, but with signed data you lose half of the indexing space.
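Something like this, as a rough sketch (it uses the same array-taking overload as your original call; triangleCount is just a placeholder name):

// Declare the index buffer as uint[] so the C# type matches GL_UNSIGNED_INT.
uint[] Iarray = new uint[triangleCount * 3];
// ... fill Iarray with the (always positive) indices, then draw:
Gl.glDrawElements(Gl.GL_TRIANGLES, Iarray.Length, Gl.GL_UNSIGNED_INT, Iarray);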