Index Buffer Object crashes nvidia driver

I’m studying terrain rendering right now using a book I bought yesterday. I converted the code included in the book from immediate mode to VBOs. Then I heard that index buffer objects are quite good, so I tried using one for my terrain. Now I’m having a big problem, not with my code but with the driver. Maybe I did something wrong in my code, but as far as I can tell I did it correctly. I’m really confused right now because I think my engine is crying. lol.

Really, what the heck is wrong?

With IBO (using glDrawElements)

Without IBO (using glDrawArrays)

Without IBO (using glDrawElements)

Hi,
It seems that you did not specify the index buffer size properly with

sizeof(indices) * m_iSize * m_iSize * 6

If indices is an array - unsigned short indices[] - sizeof will give you the size of the entire array in bytes, so you can just put sizeof(indices). If it’s a pointer - unsigned short* indices - sizeof will give you the size of the pointer, so you need sizeof(unsigned short) * m_iSize * m_iSize * 6 instead. You should also check your vertex buffer size; it should be sizeof(Vertex) * m_iSize * m_iSize or something similar.

Here’s how I defined it.

int *indices;
vector3df *vertices;

Is it wrong? So how would I set it up correctly?

EDIT: OK, I found a solution. I just need to create an IBO in my md2 class as well to make it work. It’s a weird solution, but it works. I’m just wondering why I need to put an IBO there too. Does it mean that if I use an IBO, I should use one everywhere?

int *indices;

Please note that sizeof(indices) == 8 on 64-bit platforms, which will result in an out-of-range memory access.

What you want is sizeof(int) * num_indices or sizeof(indices[0]) * num_indices.


Here's how I defined it.

int *indices;
vector3df *vertices;

Is it wrong? So how would I set it up correctly?

You really need to learn how to use the sizeof operator (and maybe pointers too). It is exactly the problem I pointed out to you in your last threads about VBOs.

I really don’t get it. I think I did it right.

I set it to sizeof(indices) * m_iSize * m_iSize * 6, and I know it is the same as sizeof(int) * m_iSize * m_iSize * 6… I tried printing both with printf and they show the same value. Can you please explain what the real problem with it is? I don’t get it. I’m using 64-bit Vista, by the way.

Thanks for the help.
Sarah22

Your index buffer must be NumTriangles * 3 * 4 bytes big if you later draw with GL_UNSIGNED_INT.

I set it to sizeof(indices) * m_iSize * m_iSize * 6, and I know it is the same as sizeof(int) * m_iSize * m_iSize * 6… I tried printing both with printf and they show the same value. Can you please explain what the real problem with it is? I don’t get it. I’m using 64-bit Vista, by the way.

You are confusing something here. Just because sizeof(indices) happens to equal sizeof(int) on your build does not mean that what you are doing is correct.

glBufferData needs a size in bytes. So when you set the IBO size, you need the size of one index in bytes and the number of indices.

If your indices are stored as integers (int), the size of one index is sizeof(int), not sizeof(indices), because “indices” is a pointer to your indices (i.e. integers): it stores the address of the first element.
Thus, what you are computing with sizeof(indices) is the size of a memory address, not the size of an index in your IBO.

The problem becomes more obvious when you choose the type GLshort (2 bytes) to store your indices, if that range is enough: then sizeof(indices) != sizeof(GLshort).

Hope I am clear.