Vertex buffer with unsigned byte crashes the app

Hello all,

I’ve been using OpenGL since version 1.0, a long time ago, but last year I started porting my OGL 2.0 code to OGL 3.2.
I have an Athlon X2 4800+ and a GeForce 8600 GT, with the latest driver installed.

Everything seems to be working fine (FBOs, vertex buffers), but there is one strange thing that makes my app crash every time:

I have a stream vertex buffer that I upload the 2D interface data to every frame, in a CPU-friendly format to avoid a lot of conversions to float in my code:

typedef struct {
 unsigned int   posx, posy;
 unsigned short texCoordx, texCoordy;
 unsigned char  color[4]; /* was "unsigned byte", which is not a C type */
} vertex2d;

The vertex buffer is 2048 * sizeof( vertex2d ), and I have a loop that draws in blocks if the number of vertices exceeds this 2048 limit (which I consider a very big buffer, since I’m only using 100 to 200 vertices for now).

I have a very simple vertex/pixel shader pair that multiplies the position by an ortho matrix, fetches the texel, and modulates it with the vertex color.
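Roughly like this (a sketch, not my exact code; the attribute names Position/TexCoord/Color and the uniform ortho are placeholders, and the two shaders are separate source strings):

```glsl
// --- vertex shader (GLSL 1.50 / GL 3.2 core), sketch ---
#version 150
uniform mat4 ortho;   // orthographic projection
in vec2 Position;     // GL_UNSIGNED_INT  x2, not normalized
in vec2 TexCoord;     // GL_UNSIGNED_SHORT x2, normalized to 0..1
in vec4 Color;        // GL_UNSIGNED_BYTE  x4, normalized to 0..1
out vec2 uv;
out vec4 tint;
void main() {
    gl_Position = ortho * vec4(Position, 0.0, 1.0);
    uv = TexCoord;
    tint = Color;
}

// --- fragment shader, sketch ---
#version 150
uniform sampler2D tex;
in vec2 uv;
in vec4 tint;
out vec4 fragColor;
void main() {
    fragColor = texture(tex, uv) * tint; // modulate texel by vertex color
}
```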

To render, I use:

// vbo = vertex buffer
// tex = texture for the widget
// kVertexIndex = 0
// kTexCoordIndex = 2
// kColorIndex = 3

glBindBuffer( GL_ARRAY_BUFFER, vbo );

//glEnableVertexAttribArray( kVertexIndex );
glVertexAttribPointer( kVertexIndex, 2, GL_UNSIGNED_INT, GL_FALSE, sizeof( vertex2d ), nullptr );

//glEnableVertexAttribArray( kColorIndex );
glVertexAttribPointer( kColorIndex, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof( vertex2d ), ( void* )12 );

//glEnableVertexAttribArray( kTexCoordIndex );
//glActiveTexture( GL_TEXTURE0 );
glVertexAttribPointer( kTexCoordIndex, 2, GL_UNSIGNED_SHORT, GL_TRUE, sizeof( vertex2d ), ( void* )8 );

There is no index buffer bound, and none of these functions returns an error.
If I draw the vertices with glDrawArrays, the color is not modulated, but everything else works (only the texture color is shown).
If I instead enable the vertex attribs, the kColorIndex call crashes my app :confused:
The color attribute is used in the shader, and glGetAttribLocation returns a valid location for it.
I think I’ve covered all the points that could be a problem or a wrong usage of the API; I don’t know where else to look…

Another thing that confuses me is glEnableVertexAttribArray(). The spec says we need to enable and disable the attribute indices, but I can render without ever calling this function; maybe the driver knows which indices are enabled for a given shader at any time.

Any ideas?

Thank you

Normally, calls to glEnableVertexAttribArray happen after calls to glVertexAttrib*. Integer attributes need to be converted to floats in your vertex shader before they will be modulated. Also, you should read the documentation on glVertexAttribIPointer, which was designed for handling integers.
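Roughly, the two entry points differ like this (a sketch, untested; it assumes a VAO is bound, which GL 3.2 core requires before any attrib calls, and uses your vertex2d layout and kColorIndex):

```c
/* (a) classic pointer: GL_UNSIGNED_BYTE with normalized = GL_TRUE.
       The shader sees "in vec4 Color;" with components in 0..1. */
glEnableVertexAttribArray(kColorIndex);
glVertexAttribPointer(kColorIndex, 4, GL_UNSIGNED_BYTE, GL_TRUE,
                      sizeof(vertex2d), (void *)12);

/* (b) integer pointer: values stay integral, no conversion.
       The shader must then declare "in uvec4 Color;" instead. */
glVertexAttribIPointer(kColorIndex, 4, GL_UNSIGNED_BYTE,
                       sizeof(vertex2d), (void *)12);
```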

Thank you for the reply!
I think glVertexAttribIPointer is only for when you don’t want the values converted to float, but in my case I do want the color converted to a vec4 attribute, so I used the normal call with GL_TRUE for the normalized parameter, which appears to be valid based on the OpenGL 3.2 spec =(
But I’ll try it at home and see if it resolves the problem.
Calling glEnableVertexAttribArray before or after the *Pointer call has the same behavior in my source.

Thank you

That is bullshit. You can call glEnableVertexAttribArray before or after.

I said normally, not that it is required. That is what I saw in most samples, thus what I replicate in my code. I’m by no means an OpenGL expert (more like an intermediate), but I’ve been playing around with integer textures recently, so I gave my input.

If I instead enable the vertex attribs, the kColorIndex call crashes my app

What do you mean by that? Do you mean that the actual glEnableVertexAttribArray(3) call crashes? Or is it the glVertexAttribPointer call that fails?

I said normally, not that it is required.

His point is that it is no more “normal” than doing it the other way. It’s simply what you’ve encountered. Though the swearing was unnecessary.


The latter of the two crashes the app:
if I call glEnableVertexAttribArray and then glVertexAttribPointer, the glVertexAttribPointer call crashes the app.
If I invert the order, the other one crashes.

Are you sure “vbo” is a valid buffer object?

Yes, it’s a valid buffer object. The vbo id is valid and no error is reported by OpenGL in any function call.
If you all want, I can post the full source showing the issue when I get home.

Thank you