Problem with glDrawRangeElementsEXT

I tried to improve the performance of my terrain engine by using glDrawRangeElementsEXT, but I have a problem.

The terrain is divided into sectors, and each sector contains 1089 vertices (33*33).

The vertices are stored in a VBO sector by sector, and each sector has its own array of indices.

Each sector is rendered in the same way, but some of them are not rendered correctly.

Here is a picture:

The sectors with problems use indices between 32768 and 65536; all sectors before and after that range work well.

I don’t know what happens for those values.

The strangest part is that if I use glDrawElements to render those sectors, it works, and if I use glDrawRangeElementsEXT with the end parameter set to start + 50000 (for example), it works too. The right value is start + 1089.

I use unsigned int indices; my graphics card is a GeForce 6800 GT and I use the latest NVIDIA drivers.

I searched on Google, but this problem is not reported anywhere. Can someone help me?
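To be concrete, the range I pass for each sector is computed like this (a simplified sketch of my setup, with illustrative names, not my exact code):

```c
#define VERTS_PER_SECTOR 1089  /* 33 * 33 vertices per sector */

/* Each sector's vertices are contiguous in the VBO, so sector s owns
   the inclusive vertex range [s * 1089, s * 1089 + 1088]. */
static void sector_range(unsigned sector, unsigned *start, unsigned *end)
{
    *start = sector * VERTS_PER_SECTOR;
    *end   = *start + VERTS_PER_SECTOR - 1;
}

/* The per-sector draw then looks like:
   glDrawRangeElementsEXT(GL_TRIANGLES, start, end, indexCount,
                          GL_UNSIGNED_INT, sectorIndicesOffset); */
```

Sector 30, for instance, spans vertices 32670 to 33758, which straddles the 32768 boundary where the corruption starts.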

I’m not sure what the problem is, but (from the spec):

mode, count, type, and indices match the corresponding arguments to DrawElements, with the additional constraint that all values in the array indices must lie between start and end inclusive.

Implementations denote recommended maximum amounts of vertex and index data, which may be queried by calling GetIntegerv with the symbolic constants MAX_ELEMENTS_VERTICES and MAX_ELEMENTS_INDICES. If end - start + 1 is greater than the value of MAX_ELEMENTS_VERTICES, or if count is greater than the value of MAX_ELEMENTS_INDICES, then the call may operate at reduced performance.
Won’t using such high numbers, i.e. in excess of MAX_ELEMENTS_VERTICES, lose the benefit of DrawRangeElements?
Also, your VBO sounds like it has too many verts in it. I forget what the ideal is, but often 10 batches of 1000 verts are faster than 1 batch of 10000.
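Those limits can be queried at runtime with glGetIntegerv; a minimal sketch of the check (the enum values are copied from glext.h so the snippet stands alone without GL headers):

```c
/* From <GL/glext.h>: */
#define GL_MAX_ELEMENTS_VERTICES 0x80E8
#define GL_MAX_ELEMENTS_INDICES  0x80E9

/* With a current GL context you would fill the limits via:
     glGetIntegerv(GL_MAX_ELEMENTS_VERTICES, &maxVerts);
     glGetIntegerv(GL_MAX_ELEMENTS_INDICES,  &maxIndices); */

/* Nonzero if a DrawRangeElements call stays inside the driver's
   recommended limits (range is inclusive, hence the +1). */
static int within_limits(int start, int end, int count,
                         int maxVerts, int maxIndices)
{
    return (end - start + 1) <= maxVerts && count <= maxIndices;
}
```

With the 4096/4096 limits reported in this thread, a 1089-vertex sector fits comfortably, while an end of start + 50000 blows well past the recommendation.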

Each sector contains 33*33 = 1089 vertices.
My terrain contains 16*16 sectors.
So my VBO contains 33*33*16*16 = 278784 vertices.

Some vertices are duplicated (at the edges of each sector) so that I can use DrawRangeElements. I am sure that each call to DrawRangeElements is correct, because (end-start+1) = 1089 and my graphics card’s recommended value is 4096.

For the indices, there are 32*32 quads (two triangles each) per sector, giving 32*32*6 = 6144 indices. So I use unsigned int and two passes of 3072 indices, because my graphics card’s recommended max index count is 4096.
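The two-pass split is just this arithmetic (a sketch; 4096 is the recommended limit my card reports):

```c
/* 32*32 quads * 2 triangles * 3 indices = 6144 indices per sector. */
enum { INDICES_PER_SECTOR = 32 * 32 * 6, RECOMMENDED_MAX = 4096 };

/* Smallest even split of 'total' indices into passes that each stay
   under 'max_indices', rounded up to whole triangles. */
static int indices_per_pass(int total, int max_indices)
{
    int passes = (total + max_indices - 1) / max_indices; /* ceiling division */
    return (total / passes + 2) / 3 * 3;                  /* round up to a multiple of 3 */
}
```

For 6144 indices against a limit of 4096 this gives two passes of 3072 indices each.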

I am sure that my code is correct, but the rendering is not (for vertices from 32768 to 65536).

The sectors that use vertices from 0 to 32000 are correct, and those that use vertices beyond the broken range, up to around 200000, are correct too. They are all rendered exactly the same way, and everything works with DrawElements, or if I change the end value of DrawRangeElements to a much greater value like 50000. I bet the threshold is something around 32768: I tried 10000 and 20000, and neither is correct, but 20000 is better than 10000 (fewer holes)…

Strangest of all, it works fine if I do not use the VBO!

Is there some known issue between DrawRangeElements and VBOs?

OK, I found the solution!!! The bug was caused by a driver optimization :frowning:

I read the following in the NVIDIA VBO paper:

If the specified range can fit into a 16-bit integer,
the driver can optimize the format of indices to pass to the GPU.
It can turn a 32-bit integer format into a 16-bit integer format.
In this case, there is a gain of 2X.

I think there is a bug in the drivers: the optimization uses a SHORT instead of an UNSIGNED SHORT, or something like that…
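If that guess is right, the failure would be simple sign extension; a toy illustration (not actual driver code):

```c
/* A buggy signed 16-bit conversion: indices >= 32768 no longer
   round-trip, because the top bit becomes a sign bit. */
static int roundtrips_signed16(unsigned idx)
{
    short s = (short)idx;
    return (unsigned)s == idx;
}

/* The correct unsigned conversion works for every index < 65536. */
static int roundtrips_unsigned16(unsigned idx)
{
    unsigned short u = (unsigned short)idx;
    return (unsigned)u == idx;
}
```

A signed conversion corrupts exactly the indices from 32768 to 65535, which matches the broken range in my screenshots.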

So I did the optimization myself: I call DrawRangeElements with UNSIGNED_SHORT when all my indices are less than 65536 and with UNSIGNED_INT otherwise, and it works fine!
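In code, the workaround boils down to this (a sketch; the enum values are copied from gl.h so the snippet stands alone):

```c
/* From <GL/gl.h>: */
#define GL_UNSIGNED_SHORT 0x1403
#define GL_UNSIGNED_INT   0x1405

/* Pick the narrowest index type that can hold every index in the batch.
   Doing the 32-bit -> 16-bit conversion ourselves sidesteps the driver's
   buggy automatic conversion. */
static unsigned pick_index_type(unsigned max_index)
{
    return (max_index < 65536u) ? GL_UNSIGNED_SHORT : GL_UNSIGNED_INT;
}

/* Narrow the index array when pick_index_type chose UNSIGNED_SHORT.
   Safe only because the caller verified max_index < 65536. */
static void to_ushort(const unsigned *src, unsigned short *dst, int n)
{
    for (int i = 0; i < n; ++i)
        dst[i] = (unsigned short)src[i];
}
```

Then pass the narrowed array and the matching type enum to DrawRangeElements, and the 32768–65535 range renders correctly.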