VBO trouble

Yeah… I’m trying to get it to work, but for some reason the line


causes the program to exit with an error message. I’ve checked that I’m not getting a null address back from wglGetProcAddress, so that can’t be it. I’m running on Win95 with the 43.45 drivers, and my graphics card is a GeForce2 MX 400.

I think I might have defined the types GLsizeiptrARB and GLintptrARB wrong. I’m using Delphi and can’t find the bindings anywhere. I’ve tried LongInt and Cardinal, but neither seems to work.


I declared both as Integer – come to think of it, that’s actually wrong for GLsizeiptr, which should be Cardinal (unsigned). It still works, though. You could try my headers at www.delphi3d.net/dot if you’re not confident about your own.

– Tom

Thanks, Tom! It works now, although I must admit the real problem was that my context wasn’t active… So stupid, but it’s good you helped me get those types straight too.


I get a performance drop if the ELEMENT_ARRAY is placed in a STATIC_DRAW buffer on an nVidia GeForceFX 5800 Ultra, Detonator 43.51.
I think the indices are processed by the driver, which copies them back from video to system memory.

If I place the index data in system memory, it works perfectly.

Is this a bug or a feature?

I’m guessing the driver has to parse the indices in order to find the minimum and maximum index values. If the index buffer is in video memory, it has to read back from video memory, which is awfully slow.

I never tried it, but maybe a function like glDrawRangeElements, to which you already supply the min/max index values, wouldn’t need to do that, so you’d see a performance increase rather than a drop. Has anybody tested that?


I don’t know offhand why it would be slower. Are you keeping your element array in a separate buffer object?

Yes, I keep the element array in a separate buffer.

Even if I draw the whole buffer or use glDrawRangeElements, it’s very slow.

I think the GPU has granular DMA and doesn’t need to compute the min/max itself; it can just read with a stride if needed.