I recently rewrote my terrain renderer to use VBOs instead of regular vertex arrays. I expected a significant performance improvement, since the terrain model is quite big and uses a lot of vertex arrays.
I also did some experiments earlier with the NVIDIA VAR/fence extensions (NV_vertex_array_range/NV_fence) and saw quite good performance improvements there.
However, this time with VBOs the performance actually got worse…
I tried it on two Linux machines: one with an ASUS GeForce 6800 GT AGP card on the 7174 drivers, and a Dell Inspiron XPS 2 with a GeForce 6800 Ultra on the 7667 drivers.
On the ASUS card I got more or less the same performance, but that machine is CPU-bound anyway; the benchmark runs at about 15-20 fps there.
On the XPS I got 60-80 fps without VBOs, and 40-50 with them…
I haven’t tried it on Windows yet, but GL performance in this application is generally the same on Windows and Linux.
I have seen some threads on this topic before, but that was when the extension was brand new; I’d expect things to be different by now.
So is this to be expected or am I doing something seriously wrong here?
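For context, here is a minimal sketch of the pattern I mean by “using VBOs” — illustrative only, not my actual code. All names (`chunk_vbo`, `upload_chunk`, …) are made up, and it assumes the ARB_vertex_buffer_object entry points plus an already-created GL context, so it won’t run standalone:

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Hypothetical handles for one terrain chunk. */
static GLuint chunk_vbo, chunk_ibo;

/* Upload a chunk's vertex and index data once, into static buffers. */
void upload_chunk(const GLfloat *verts, GLsizei n_verts,
                  const GLuint *indices, GLsizei n_indices)
{
    glGenBuffersARB(1, &chunk_vbo);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, chunk_vbo);
    /* GL_STATIC_DRAW_ARB: uploaded once, drawn many times. */
    glBufferDataARB(GL_ARRAY_BUFFER_ARB,
                    n_verts * 3 * sizeof(GLfloat), verts,
                    GL_STATIC_DRAW_ARB);

    glGenBuffersARB(1, &chunk_ibo);
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, chunk_ibo);
    glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB,
                    n_indices * sizeof(GLuint), indices,
                    GL_STATIC_DRAW_ARB);
}

/* Draw the chunk from the bound buffers each frame. */
void draw_chunk(GLsizei n_indices)
{
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, chunk_vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    /* With a buffer bound, the "pointer" is a byte offset into the VBO. */
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);

    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, chunk_ibo);
    glDrawElements(GL_TRIANGLES, n_indices, GL_UNSIGNED_INT,
                   (const GLvoid *)0);

    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);
}
```

If deviating from something like this (say, re-uploading with glBufferDataARB every frame, a GL_DYNAMIC/STREAM usage hint on static data, or lots of tiny per-patch buffers) is a known performance killer, that may well be my problem — is that the sort of thing I should be checking?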